
Legacy Health IT Systems – So Old They’re Secure

Posted on April 21, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’ve been thinking quite a bit about the ticking time bomb that is legacy healthcare IT systems. The topic has been top of mind for me ever since Galen Healthcare Solutions wrote their Tackling EHR & EMR Transition series of blog posts. This is an important topic even if it’s not a sexy one.

I don't think we need to dive into the details of why legacy healthcare IT systems are a security risk for most healthcare organizations. Hospitals and health systems have hundreds of production systems that they're trying to keep secure, so it's not hard to see why legacy systems get forgotten. Forgotten systems are ripe targets for hackers and others who want to do nefarious things.

That said, I recently heard someone talking about legacy health IT systems who said their organization had technology so old it was secure again. I guess there's something to be said for systems so old that hackers no longer have tools that can breach them or read their files. Not to mention that many of these older systems were never connected to the internet.

While I find humor in the idea that something could be so old that it’s secure again, that’s not the reality for most legacy systems. Most old systems can be breached and will be breached if they’re not considered “production” when it comes to patching and securing them.

When you consider the cost of updating and securing a legacy system as though it were still in production, it's easy to see why finding a way to sunset these legacy systems is becoming a popular option. Sure, you have to find a way to maintain the integrity of the data, but the tools to do this have come a long way.

The other reason I like the idea of migrating data off a legacy system and sunsetting it is that this often opens the door for users to access the legacy data. When the data sits on the legacy system, it's generally not used unless absolutely necessary. If you migrate that legacy data to an archival platform, more people can use it to inform care. That's a good thing.
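To make the migration-and-integrity idea concrete, here's a minimal sketch in Python of one way an archival export could work: copy each legacy record out along with a checksum, so the archive can later prove nothing was altered along the way. The database file, table, and column names are hypothetical stand-ins, not any particular vendor's schema.

```python
import csv
import hashlib
import sqlite3

# Hypothetical legacy database and archive path; adjust to your environment.
LEGACY_DB = "legacy_emr.db"
ARCHIVE_CSV = "archived_encounters.csv"

def archive_legacy_records():
    """Copy rows out of a legacy table, recording a SHA-256 checksum per row."""
    conn = sqlite3.connect(LEGACY_DB)
    rows = conn.execute("SELECT patient_id, encounter_date, note FROM encounters")

    with open(ARCHIVE_CSV, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["patient_id", "encounter_date", "note", "sha256"])
        for patient_id, encounter_date, note in rows:
            payload = f"{patient_id}|{encounter_date}|{note}".encode("utf-8")
            digest = hashlib.sha256(payload).hexdigest()
            writer.writerow([patient_id, encounter_date, note, digest])

    conn.close()

if __name__ == "__main__":
    archive_legacy_records()
```

On the verification side, recomputing the hash over the archived fields and comparing it to the stored value is enough to show a record survived the migration intact.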

Legacy health IT systems are a challenge that isn't going to go away. In fact, it's likely to get worse as we transition from one software system to the next. Having a strategy for these legacy systems that ensures security and compliance, and that extracts value from the data, is going to be a key to success for every healthcare organization.

Will Data Aggregation For Precision Medicine Compromise Patient Privacy?

Posted on April 10, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Like anyone else who follows medical research, I’m fascinated by the progress of precision medicine initiatives. I often find myself explaining to relatives that in the (perhaps far distant) future, their doctor may be able to offer treatments customized specifically for them. The prospect is awe-inspiring even for me, someone who’s been researching and writing about health data for decades.

That said, there are problems with bringing so much personal information together into a giant database, suggests Jennifer Kulynych in an article for OUPblog, which is published by Oxford University Press. In particular, bringing together a massive trove of individual medical histories and genomes may have serious privacy implications, she says.

In arguing her point, she makes a sobering observation that rings true for me:

“A growing number of experts, particularly re-identification scientists, believe it simply isn’t possible to de-identify the genomic data and medical information needed for precision medicine. To be useful, such information can’t be modified or stripped of identifiers to the point where there’s no real risk that the data could be linked back to a patient.”

As she points out, norms in the research community make it even more likely that patients could be individually identified. For example, while a doctor might need your permission to test your blood for care, in some states it's quite legal for a researcher to take possession of blood not needed for that care, she says. Those researchers can then sequence your genome and place the data in a research database, and you may never have consented to this, or even know that it happened.

And there are other, perhaps even more troubling ways in which existing laws fail to protect the privacy of patients in researchers' data stores. For example, current research and medical regs let review boards waive patient consent or even allow researchers to call DNA sequences "de-identified" data, despite mounting evidence that the re-identification risk is real, she writes.

On top of all of this, the technology to exploit this information for personal identification already exists. A genome sequence can potentially be re-identified by comparing it against a database of identified genomes, and law enforcement organizations have already used genomic data to predict key aspects of an individual's appearance, such as eye color and race.
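To see why a reference panel of identified genomes is enough, here's a deliberately tiny Python illustration; the names, SNP IDs, and genotypes are all invented. Stripping the name off a genotype doesn't stop it from matching one, and only one, entry in the panel.

```python
# Toy illustration of re-identification by matching a "de-identified" genotype
# against a reference panel of identified genomes. All data below is invented.

reference_panel = {
    "Alice":   {"rs1": "AA", "rs2": "AG", "rs3": "GG", "rs4": "CT"},
    "Bob":     {"rs1": "AG", "rs2": "GG", "rs3": "GG", "rs4": "CC"},
    "Charlie": {"rs1": "AA", "rs2": "AG", "rs3": "AG", "rs4": "CT"},
}

# A genotype stripped of name, address, and record number -- but not of signal.
unknown_sample = {"rs1": "AA", "rs2": "AG", "rs3": "GG", "rs4": "CT"}

def match_fraction(sample, candidate):
    """Fraction of shared SNPs with identical genotype calls."""
    shared = set(sample) & set(candidate)
    if not shared:
        return 0.0
    hits = sum(1 for snp in shared if sample[snp] == candidate[snp])
    return hits / len(shared)

best = max(reference_panel,
           key=lambda name: match_fraction(unknown_sample, reference_panel[name]))
print(f"Closest identified genome: {best}")  # -> Alice
```

Real attacks work with hundreds of thousands of markers rather than four, which only makes the match more certain.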

Then there’s the issue of what happens with EMR data storage. As the author notes, healthcare organizations are increasingly adding genomic data to their stores, and sharing it widely with individuals on their network. While such practices are largely confined to academic research institutions today, this type of data use is growing, and could also expose patients to involuntary identification.

Not everyone is as concerned as Kulynych about these issues. For example, a group of researchers recently concluded that a single patient anonymization algorithm could offer a "standard" level of privacy protection to patients, even when the organizations involved are sharing clinical data. They argue that larger clinical datasets that use this approach could protect patient privacy without generalizing or suppressing data in a manner that would undermine its usefulness.

But if nothing else, it's hard to argue with Kulynych's central concern: too few rules have been updated to reflect the realities of big genomic and medical data stores. Clearly, state and federal rules need to address the emerging problems associated with big data and privacy. Otherwise, by the time a major privacy breach occurs, neither patients nor researchers will have any recourse.

E-Patient Update: Reducing Your Patients’ Security Anxiety

Posted on March 31, 2017 | Written By Anne Zieger

Even if you’re not a computer-savvy person, these days you can hardly miss the fact that healthcare data is a desirable target for cyber-criminals. After all, over the past few years, healthcare data breaches have been in the news almost every day, with some affecting millions of consumers.

As a result, many patients have become at least a bit afraid of interacting with health data online. Some are afraid that data stored on their doctor or hospital’s server will be compromised, some are afraid to manage their data on their own, and others don’t even know what they’re worried about – but they’re scared to get involved with health data online.

As an e-patient who's lived online in one form or another since the 80s (anyone remember GEnie or CompuServe?), I've probably grown a bit too blasé about security risks. While I guard my online banking password as carefully as anyone else, I don't tend to worry too much about abstract threats posed by someone who might someday, somehow find my healthcare data among millions of other files.

But I realize that most patients – and providers – take these issues very seriously, and with good reason. Even if HIPAA weren’t the law of the land, providers couldn’t afford to have patients feel like their privacy wasn’t being respected. After all, patients can’t get the highest-quality treatment available if they aren’t comfortable being candid about their health behaviors.

What’s more, no provider wants to have their non-clinical data hacked either. Protecting Social Security numbers, credit card details and other financial data is a critical responsibility, and failing at it could cost patients more than their privacy.

Still, if we manage to intimidate the people we’re trying to help, that can’t be good either. Surely we can protect health data without alienating too many patients.

Striking a balance

I believe it's important to strike a balance between being serious about security and making it difficult or frightening for patients to engage with their data. While I'm not a security expert, here are some thoughts on how to strike that balance, from the standpoint of a computer-friendly patient.

  • Don’t overdo things: Following strong security practices is a good idea, but if they’re upsetting or cumbersome they may defeat your larger purposes. I’m reminded of the policy of one of my parents’ providers, who would only provide a new password for their Epic portal if my folks came to the office in person. Wouldn’t a snail mail letter serve, at least if they used registered mail?
  • Use common-sense procedures: By all means, see to it that your patients access their data securely, but work that into your standard registration process and workflow. By the time a patient leaves your office they should have access to everything they need for portal access.
  • Guide patients through changes: In some cases, providers will want to change their security approach, which may mean that patients have to choose a new ID and password or otherwise change their routine. If that's necessary, send them an email or text message letting them know that these changes are expected (a rough sketch of such a notification follows this list). Otherwise they might worry that the changes represent a threat.
  • Remember patient fears: While practice administrators and IT staff may understand security basics, and why such protections are necessary, patients may not. Bear in mind that if you take a grim tone when discussing security issues, they may be afraid to visit your portal. Keep security explanations professional but pleasant.
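As a rough illustration of the heads-up message mentioned above, here's a minimal Python sketch. The SMTP host and addresses are hypothetical placeholders, and a real portal would send this through whatever secure messaging channel it already uses; note that the message itself contains no PHI.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical SMTP relay and sender address; substitute your own infrastructure.
SMTP_HOST = "smtp.example.org"
FROM_ADDR = "portal-support@example-clinic.org"

def notify_of_credential_change(patient_email: str, portal_name: str) -> None:
    """Warn a patient ahead of time that a sign-in change is expected."""
    msg = EmailMessage()
    msg["Subject"] = f"Upcoming sign-in change for {portal_name}"
    msg["From"] = FROM_ADDR
    msg["To"] = patient_email
    msg.set_content(
        "We are updating how you sign in to the patient portal.\n"
        "On your next visit you will be asked to choose a new password.\n"
        "This change is expected -- you do not need to take any action today."
    )
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)
```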

Remember your goals

Speaking as a consumer of patient health data, I have to say that many of the health data sites I’ve accessed are a bit tricky to use. (OK, to be honest, many seem to be designed by a committee of 40-something engineers that never saw a gimmicky interface they didn’t like.)

And that isn’t all. Unfortunately, even a highly usable patient data portal or app can become far more difficult to use if necessary security protections are added to the mix. And of course, sometimes that may be how things have to be.

I guess I’m just encouraging providers who read this to remember their long-term goals. Don’t forget that even security measures should be evaluated as part of a patient’s experience, and at least see that they do as little as possible to undercut that experience.

After all, if a girl-geek and e-patient like myself finds the security management aspect of accessing my data to be a bummer, I can only imagine other consumers will just walk away from the keyboard. With any luck, we can find ways to be security-conscious without imposing major barriers to patient engagement.

Wide Ranging Impact of A Healthcare Cybersecurity Attack

Posted on March 8, 2017 | Written By John Lynn

David Chou recently shared this amazing graphic of the "above the surface" and "beneath the surface" impacts of cyber attacks. The above-the-surface impacts are the better-known costs related to an incident. The beneath-the-surface impacts are the less visible, hidden costs of a cyber attack.

Which of these impacts concerns you most?

If this list of 14 impacts on your organization isn't enough to wake you up to the importance of cybersecurity, then there isn't much hope. However, most of the CIOs I've seen are well aware of these impacts, and it's why cybersecurity keeps them up at night.

Costs Of Compromised Credentials Rising

Posted on March 3, 2017 | Written By Anne Zieger

Healthcare organizations face unique network access challenges. While some industries only need to control access by professional employees and partners, healthcare organizations are increasingly opening up data to consumers, and the number of consumer access points is multiplying. While other industries face similar problems – banking seems particularly relevant – I don't know of any other industry that depends on such sophisticated data exchange with consumers to achieve critical results.

Given the industry’s security issues, I found the following article to be quite interesting. While it doesn’t address healthcare concerns directly, I think it’s relevant nonetheless.

The article, written by InfoArmor CTO Christian Lees, contends that next-generation credentials are “edging toward a precarious place.” He argues that because IT workers are under great pressure to produce, they’re rushing the credentialing process. And that has led to a lack of attention to detail, he says:

“Employees, contractors and even vendors are rapidly credentialed with little attention given to security rules such as limiting access per job roles, enforcing secure passwords, and immediately revoking credentials after an employee moves on…[and as a result], criminals get to choose from a smorgasbord of credentialed identities with which to phish employees and even top executives.”

Meanwhile, if auto-generated passwords are short and ineffective, or so long that users must write them down to remember them, credentials tend to get compromised quickly. What’s more, password sharing and security shortcuts used for sign-in (such as storing a password in a browser) pose further risk, he notes.
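One common middle ground between "too short to be safe" and "too long to remember" is a generated passphrase: several random words joined together. Here's a minimal sketch of the idea, assuming a far larger word list (for example, the EFF diceware list) than the toy one shown:

```python
import secrets

# Illustrative word list only; a real deployment would use thousands of words
# so that each word adds meaningful entropy.
WORDS = ["maple", "harbor", "stapler", "comet", "lantern", "orbit", "pebble", "tundra"]

def generate_passphrase(num_words: int = 4) -> str:
    """Random passphrase: easier to remember than 'x7$Qz!', harder to guess than 'Spring2017'."""
    return "-".join(secrets.choice(WORDS) for _ in range(num_words))

print(generate_passphrase())  # e.g. "comet-tundra-maple-orbit"
```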

Though he doesn’t state this in exactly these words, the problem is obviously multiplied when you’re a healthcare provider. After all, if you’re managing not only thousands of employee and partner credentials, but potentially, millions of consumer credentials for use in accessing portal data, you’re fighting a battle on many fronts.

And unfortunately, the cost of losing control of these credentials is very high. In fact, according to a Verizon study, 63% of the data breaches confirmed last year involved weak, default or stolen passwords.

To tackle this problem, Lees suggests, organizations should create a work process which handles different types of credentials in different ways.

If you're providing access to public-facing information – nothing transactional, identifying or otherwise sensitive – a standard password may be good enough. Those passwords should still be encrypted and protected, but they can remain easy to use, he says.
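Even for these lower-risk accounts, "encrypted and protected" in practice usually means storing only a salted, slow hash of the password rather than the password itself. A minimal sketch of that pattern using Python's standard library; the iteration count is an illustrative choice, not a mandate:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # PBKDF2 work factor; tune to your hardware

def hash_password(password: str):
    """Return (salt, digest); store both, never the plaintext password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
```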

Meanwhile, if you need to offer users access to highly sensitive information, your IT organization should implement a separate process which assigns stronger, more complex passwords as well as security layers like biometrics, cryptographic keys or out-of-band confirmation codes, Lees recommends.
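For the out-of-band confirmation codes Lees mentions, the server-side logic can be quite small. Here's a hedged sketch of one way it might look; the in-memory store and five-minute expiry are illustrative choices, and actual delivery of the code (SMS, phone call, and so on) is left out entirely:

```python
import secrets
import time

# In-memory store of pending codes; a real system would use a database or cache.
_pending = {}  # user_id -> (code, expires_at)

def issue_confirmation_code(user_id: str, ttl_seconds: int = 300) -> str:
    """Generate a 6-digit code to deliver out-of-band (SMS, phone call, email)."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    _pending[user_id] = (code, time.time() + ttl_seconds)
    return code

def verify_confirmation_code(user_id: str, submitted: str) -> bool:
    code, expires_at = _pending.get(user_id, (None, 0))
    if code is None or time.time() > expires_at:
        return False
    # Constant-time comparison avoids leaking how many digits matched.
    return secrets.compare_digest(code, submitted)
```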

Another way to improve your credentialing strategy is to associate known behaviors with those credentials. “If you know that Bill comes to the office on Tuesdays and Thursdays but works remotely the rest of the week and that he routinely accesses certain types of files, it becomes much harder for a criminal to use Bill’s compromised credentials undetected,” he writes.
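That behavioral idea can be prototyped with very little code. The sketch below is a toy version of the "Bill" example, with an invented profile and a naive network check, just to show the shape of the rule; production systems draw on far richer signals:

```python
from datetime import datetime

# Hypothetical behavioral profile for one user, learned from past activity.
BILL_PROFILE = {
    "office_days": {"Tue", "Thu"},        # days Bill badges into the office
    "usual_hours": range(7, 19),          # 7am to 7pm
    "usual_networks": {"10.20.0.0/16"},   # office subnet (illustrative string match)
}

def login_looks_suspicious(when: datetime, source_network: str, profile: dict) -> bool:
    """Flag logins that don't fit the user's known pattern for extra verification."""
    day = when.strftime("%a")
    in_office_window = day in profile["office_days"] and when.hour in profile["usual_hours"]
    on_known_network = source_network in profile["usual_networks"]
    # Remote access outside normal hours from an unknown network is the red flag.
    return not (in_office_window or on_known_network)

# A 2:30am Saturday login from an unfamiliar network gets flagged.
print(login_looks_suspicious(datetime(2017, 3, 4, 2, 30), "203.0.113.0/24", BILL_PROFILE))  # True
```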

Of course, readers of this blog will have their own strategies in place for protecting credentials, but Lees' suggestions are worth considering as well. When you're dealing with valuable health data, it never hurts to go that extra mile. If you don't, you might get a visit from the HIPAA police (proverbial, not actual).

Whitepaper: Is Windows 10 HIPAA Compliant?

Posted on February 22, 2017 | Written By

The following is a guest blog post by Steven Marco, CISA, ITIL, HP SA and President of HIPAA One®.
HIPAA One has collaborated with Microsoft on a new whitepaper that addresses Windows 10 and HIPAA compliance.

The whitepaper, HIPAA Compliance with Microsoft Windows 10 Enterprise, provides guidance on how to leverage Microsoft Windows 10 as a HIPAA-compliant, baseline operating system for functionality and security. Additionally, the paper tackles head on (and debunks) the myth that Microsoft Windows is not HIPAA compliant.
In light of the recent focus on HIPAA enforcement actions, hospitals, clinics, healthcare clearinghouses and business associates are trying to understand how to manage modern operating systems with cloud features while meeting HIPAA regulatory mandates. Along with adhering to HIPAA, many healthcare organizations are under pressure to broadly embrace the benefits of cloud computing while managing the security implications.

Microsoft has invested heavily in security and privacy technologies to address and mitigate today's threats. Windows 10 Enterprise has been designed to be the most user-friendly Windows yet and includes deep architectural advancements that have changed the game when navigating hacking and malware threats. For this reason, organizations in every industry, including the Pentagon and the Department of Defense, have upgraded to Windows 10 Enterprise to improve their security posture. However, as with all software upgrades, the functionality, security and privacy implications must be understood and addressed.

The intersection between HIPAA compliance and mainstream applications can often be confusing to navigate. This industry-leading whitepaper addresses the questions and concerns that are currently top-of-mind for healthcare IT and legal professionals responsible for managing ePHI and maintaining HIPAA compliance.

Download your copy today and learn how Microsoft Windows 10 Enterprise enables its users to meet and/or exceed their HIPAA Security and Privacy requirements.

About Steven Marco
Steven Marco is the President of HIPAA One®, a leading provider of HIPAA Risk Assessment software for practices of all sizes. HIPAA One is a proud sponsor of EMR and HIPAA and the effort to make HIPAA compliance more accessible for all practices. Are you HIPAA compliant? Take HIPAA One's 5-minute HIPAA security and compliance quiz to see if your organization is at risk, or learn more at HIPAAOne.com.

Consumers Fear Theft Of Personal Health Information

Posted on February 15, 2017 | Written By Anne Zieger

Probably fueled by constant news about breaches – duh! – consumers continue to worry that their personal health information isn’t safe, according to a new survey.

As the press release for the 2017 Xerox eHealth Survey notes, last year more than one data breach was reported each day. So it's little wonder that the survey – which was conducted online by Harris Poll in January 2017 among more than 3,000 U.S. adults – found that 44% of Americans are worried about having their PHI stolen.

According to the survey, 76% of respondents believe that it’s more secure to share PHI between providers through a secure electronic channel than to fax paper documents. This belief is certainly a plus for providers. After all, they’re already committed to sharing information as effectively as possible, and it doesn’t hurt to have consumers behind them.

Another positive finding from the study is that Americans also believe better information sharing across providers can help improve patient care. Xerox/Harris found that 87% of respondents believe that wait times to get test results and diagnoses would drop if providers securely shared and accessed patient information from varied providers. Not only that, 87% of consumers also said that they felt that quality of service would improve if information sharing and coordination among different providers was more common.

Looked at one way, these stats offer providers an opportunity. If you’re already spending tens or hundreds of millions of dollars on interoperability, it doesn’t hurt to let consumers know that you’re doing it. For example, hospitals and medical practices can put signs in their lobby spelling out what they’re doing by way of sharing data and coordinating care, have their doctors discuss what information they’re sharing and hand out sheets telling consumers how they can leverage interoperable data. (Some organizations have already taken some of these steps, but I’d argue that virtually any of them could do more.)

On the other hand, if nearly half of consumers are afraid that their PHI is insecure, providers have to do more to reassure them. Though few would understand how your security program works, letting them know how seriously you take the matter is a step forward. Also, it's good to educate them on what they can do to keep their health information secure, as people tend to be less fearful when they focus on what they can control.

That being said, the truth is that healthcare data security is a mixed bag. According to a study conducted last year by HIMSS, while most organizations conduct IT security risk assessments, many IT execs have only occasional interactions with top-level leaders. Also, many are still planning out their medical device security strategy. Worse, provider security spending is often minimal. HIMSS notes that few organizations spend more than 6% of their IT budgets on data security, and 72% have five or fewer employees allocated to security.

Ultimately, it’s great to see that consumers are getting behind the idea of health data interoperability, and see how it will benefit them. But until health organizations do more to protect PHI, they’re at risk of losing that support overnight.

Hybrid Entities Ripe For HIPAA Enforcement Actions

Posted on February 8, 2017 | Written By Anne Zieger

As some readers will know, HIPAA rules allow large organizations to separate out the parts of the organization which engage in HIPAA-covered functions from those that do not. When it follows this model, known as a "hybrid entity" under HIPAA, an organization must take care to identify the "components" of its organization which engage in functions covered by HIPAA, notes attorney Matthew Fisher in a recent article.

If they don’t, they may get into big trouble, as signs suggest that the Office for Civil Rights will be taking a closer look at these arrangements going forward, according to attorneys.  In fact, the OCR recently hit the University of Massachusetts Amherst with a $650,000 fine after a store of unsecured electronic protected health information was breached. This action, the first addressing the hybrid entity standard under HIPAA, asserted that UMass had let this data get breached because it hadn’t treated one of its departments as a healthcare component.

UMass’s troubles began in June 2013, when a workstation at the UMass Center for Language, Speech and Hearing was hit with a malware attack. The malware breach led to the disclosure of patient names, addresses, Social Security numbers, dates of birth, health insurance information and diagnoses and procedure codes for about 1,670 individuals. The attack succeeded because UMass didn’t have a firewall in place.

After investigating the matter, OCR found that UMass had failed to name the Center as a healthcare component which needed to meet HIPAA standards, and as a result had never put policies and procedures in place there to enforce HIPAA compliance. What’s more, OCR concluded that – violating HIPAA on yet another level – UMass didn’t conduct an accurate and thorough risk analysis until September 2015, well after the original breach.

In the end, things didn’t go well for the university. Not only did OCR impose a fine, it also demanded that UMass take corrective action.

According to law firm Baker Donelson, this is a clear sign that the OCR is going to begin coming down on hybrid entities that don’t protect their PHI appropriately or erect walls between healthcare components and non-components. “Hybrid designation requires precise documentation and routine updating and review,” the firm writes. “It also requires implementation of appropriate administrative, technical and physical safeguards to prevent non-healthcare components from gaining PHI access.”

And the process of singling out healthcare components for special treatment should never end completely. The firm advises its clients to review the status of components whenever new ones are added – a walk-in or community clinic, for example – or even when new enterprise-wide systems are implemented.

My instinct is that problems like the one that occurred at UMass, in which hybrid institutions struggle to separate components logically and physically, are only likely to get worse as healthcare organizations consolidate into ACOs.

I assume that under these loosely consolidated business models, individual entities will still have to mind their own security. But at the same time, if they hope to share data and coordinate care effectively, extensive network interconnections will be necessary, and mapping who can and can’t look at PHI is already tricky. I don’t know what such partners will do to keep data not only within their network, but out of the hands of non-components, but I’m sure it’ll be no picnic.

Consumers Want Their Doctors To Offer Video Visits

Posted on February 6, 2017 | Written By Anne Zieger

A new survey by telemedicine provider American Well has concluded that many consumers are becoming interested in video visits, and that some consumers would be willing to switch doctors to get video visits as part of their care. Of course, given that American Well provides video visits, this is a self-interested conclusion, but my gut feeling is that it's on target nonetheless.

According to the research, 72% of parents with children under 18 were willing to see a doctor via video, as were 72% of consumers aged 45-54 and 53% of those over age 65. American Well's study also suggests that the respondents see video visits as more effective than in-person consults, with 85% reporting that a video visit resolved their issues, as compared with 64% of those seeing a doctor in a brick-and-mortar setting.

In addition, respondents said they want their existing doctors to get on board. Of those with a PCP, 65% were very or somewhat interested in conducting video visits with their PCP. Meanwhile, 20% of consumers said they would switch doctors to get access to video visits, a number which rises to 26% among those aged 18 to 34, 30% for those aged 35 to 44 and 34% for parents of children under age 18.

In addition to getting acute consults via video visit, 60% of respondents said that they would be willing to use them to manage a chronic condition, and 52% of adults reported that they were willing to participate in post-surgical or post-hospital-discharge visits through video.

Consumers also seemed to see video visits as a useful way to help them care for ill or aging family members. American Well found that 79% of such caregivers would find this approach helpful.

Meanwhile, large numbers of respondents seemed interested in using video visits to handle routine chronic care. The survey found that 78% of those willing to have a video visit with a doctor would be happy to manage chronic conditions via video consults with their PCP.

What the researchers draw from all of this is that it's time for providers to start marketing video visit capabilities. American Well argues that by promoting these capabilities, providers can bring new patients into their systems, divert patients away from the ED and into higher-satisfaction options, and improve their management of chronic conditions by making it easier for patients to stay in touch.

Ultimately, of course, providers will need to integrate video into the rest of their workflow if this channel is to mature fully. And providers will need to make sure their video visits meet the same standards as other patient interactions, including HIPAA-compliant security for the content, notes Dr. Sherry Benton of TAO Connect. Providers will also need to figure out whether the video is part of the official medical record, and if so, how they will share copies if the patient requests them. But there are ways to address these issues, so they shouldn't prevent providers from jumping in with both feet.

5 Lessons In One Big HIPAA Penalty

Posted on February 2, 2017 | Written By

The following is a guest blog post by Mike Semel, President and Chief Compliance Officer at Semel Consulting.

The federal Office for Civil Rights just announced a $3.2 million penalty against Children’s Medical Center of Dallas.

5 Lessons Learned from this HIPAA Penalty

  1. Don’t ignore HIPAA
  2. Cooperate with the enforcers
  3. Fix the problems you identify
  4. Encrypt your data
  5. Not everyone in your workforce should be able to access Protected Health Information

If you think complying with HIPAA isn't important, or that it's too expensive and annoying, do you realize you could be making a $3.2 million decision? In this one penalty there are lots of hidden and not-so-hidden messages.

1. A $3.2 million penalty for losing two unencrypted devices, 3 years apart.

LESSON LEARNED: Don’t ignore HIPAA.

If Children's Medical Center had been paying attention to HIPAA as it should have, it wouldn't be out $3.2 million that could instead be used to treat children's medical problems. Remember that protecting your patients' medical information is their civil right and part of their medical care.

2. This is a Civil Money Penalty, not a Case Resolution.

What's the difference? A Civil Money Penalty is a fine. It can be imposed when the entity did not comply with the investigation, did not respond to an invitation to a hearing (as in this case), or did not follow the corrective requirements from a case resolution. Most HIPAA penalties are Case Resolutions, where the entity cooperates with the enforcement agency, which usually results in a lower dollar amount than a Civil Money Penalty.

LESSON LEARNED: Cooperate with the enforcers. No one likes the idea of a federal data breach investigation, but you could save a lot of money by cooperating and asking for leniency. Then you need to follow the requirements outlined in your Corrective Action Plan.

3. They knew they had security risks in 2007 and never addressed them until 2013, after a SECOND breach.

Children’s Medical Center had identified its risks and knew it needed to encrypt its data as far back as 2007, but had a breach of unencrypted data in 2010 and another in 2013.

LESSON LEARNED: Don't be a SLOW LEARNER. HIPAA requires that you conduct a Security Risk Analysis AND mitigate your risks. Self-managed risk analyses can miss critical items that will result in a breach. Paying for a risk analysis and filing away the report without fixing the problems can turn into a $3.2 million violation. How would you explain to your management, board of directors, your patients, and the media that you knew about a risk and never did anything to address it? How will your management and board feel about you when they watch $3.2 million being spent on a fine?

4. There is no better way to protect data than by encrypting it.

HIPAA gives you some leeway by not requiring you to encrypt all of your devices, as long as the alternative methods to secure the data are as reliable as encryption. There’s no such thing.
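For completeness, here's what encrypting a file at rest can look like with the widely used `cryptography` package in Python. The file name is a hypothetical example, and key management (the genuinely hard part) is deliberately out of scope:

```python
# Minimal sketch of encrypting a file at rest with the `cryptography` package
# (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> None:
    """Write an encrypted copy of the file alongside the original."""
    with open(path, "rb") as f:
        plaintext = f.read()
    token = Fernet(key).encrypt(plaintext)
    with open(path + ".enc", "wb") as f:
        f.write(token)

key = Fernet.generate_key()   # store this in a key management system, not next to the data
encrypt_file("discharge_summaries.csv", key)
```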

If an unencrypted device is lost or stolen, you just proved that your alternative security measures weren’t effective. It amazes me how much protected data we find floating around client networks. Our clients swear that their protected data is all in their patient care system; that users are given server shares and always use them; that scanned images are directly uploaded into applications; and that they have such good physical security controls that they do not need to encrypt desktop computers and servers.

LESSON LEARNED: You must locate ALL of your data that needs to be protected, and encrypt it using an acceptable method with a tracking system. We use professional tools to scan networks looking for protected data.
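You don't need a professional tool to get a first-pass inventory, either. Here's a rough sketch of the scanning idea in Python; the patterns and the share path are illustrative only, and real discovery products go much further (databases, images, OCR, and so on):

```python
import os
import re

# Simple patterns for SSNs and MRN-style identifiers; real discovery tools use
# far richer rules, but the idea is the same: walk the shares and flag hits.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
}

def scan_for_phi(root: str):
    """Yield (path, pattern_name) for every readable file containing a suspicious match."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue
            for label, pattern in PATTERNS.items():
                if pattern.search(text):
                    yield path, label

for hit in scan_for_phi("/srv/shared"):   # hypothetical file share mount
    print(hit)
```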

5. Not everyone in your workforce needs access to Protected Health Information.

We also look at paper records storage and their movement. This week we warned a client that we thought too many workforce members had access to the rooms that store patient records. The Children’s Medical Center penalty says they secured their laptops but “provided access to the area to workforce not authorized to access ePHI.”

LESSON LEARNED: Is your Protected Health Information (on paper and in electronic form) protected against physical access by workforce members who are not authorized to access PHI?

You can plan your new career after your current organization gets hit with a preventable $3.2 million penalty, just like Children’s Medical Center. Or, you can take HIPAA seriously and properly manage your risks.

Your choice.

About Mike Semel
Mike Semel is the President and Chief Compliance Officer for Semel Consulting. He has owned IT businesses for over 30 years, has served as the Chief Information Officer for a hospital and a K-12 school district, and as the Chief Operating Officer for a cloud backup company. Mike is recognized as a HIPAA thought leader throughout the healthcare and IT industries, and has spoken at conferences including NASA’s Occupational Health conference, the New York State Cybersecurity conference, and many IT conferences. He has written HIPAA certification classes and consults with healthcare organizations, cloud services, Managed Service Providers, and other business associates to help build strong cybersecurity and compliance programs. Mike can be reached at 888-997-3635 x 101 or mike@semelconsulting.com.