States Strengthen Data Breach Laws & Regulations

Posted on October 18, 2016 | Written By

The following is a guest blog post by Mike Semel, President and Chief Compliance Officer at Semel Consulting.

If your cyber security and compliance program is focused on just one regulation, like HIPAA or banking laws, many steps you are taking are probably wrong.

Since 2015, a number of states have amended their data breach laws, which can affect ALL BUSINESSES that store information about their residents, even businesses located out of state. The changes address issues identified in breach investigations, as well as public displeasure with the growing number of data breaches that can result in identity theft.

Forty-seven states, plus DC, Puerto Rico, Guam, and the US Virgin Islands, protect personally identifiable information, which includes a person’s name combined with a driver’s license number, Social Security number, or access information for bank and credit card accounts.

Many organizations mistakenly focus only on the data in their main business application, like an Electronic Health Record system or other database they use for patients or clients. They ignore the fact that e-mails, reports, letters, spreadsheets, scanned images, and other loose documents also contain data protected by laws and regulations. These documents can be anywhere – on servers, local PCs, portable laptops, tablets, mobile phones, thumb drives, CDs and DVDs, or somewhere up in the cloud.

Some businesses also mistakenly believe that moving data to the cloud means they do not need a secure office network. This is a fallacy: hackers can reach your cloud data if they can compromise the local devices you use to get to it. In most cases there is local data even when the main business applications are in the cloud. Local computers should have business-class operating systems, with encryption, endpoint protection software, current security patches and updates, and strong physical security. Local networks need business-class firewalls with active intrusion prevention.

States are strengthening their breach laws to make up for weaknesses in HIPAA and other federal regulations. Between a state and federal law, whichever requirement is better for the consumer is what those storing data on that state’s residents (including out of state companies) must follow.

Some states have added to the types of information protected by their data breach reporting laws. Many states give their residents the right to sue organizations for not providing adequate cyber security protection. Many states have also instituted faster reporting requirements than federal law, so an incident management plan based only on federal requirements may cause you to miss a shorter state reporting deadline.

In 2014, California began requiring free identity theft prevention services after a breach, even when harm cannot be proven. This year Connecticut adopted a similar standard. Tennessee eliminated the encryption safe harbor, meaning that the loss of even encrypted data must be reported. Nebraska eliminated the encryption safe harbor for cases where the encryption keys might have been compromised. Illinois is adding medical records to its list of protected information.

Massachusetts requires every business to implement a comprehensive data protection program including a written plan. Texas requires that all businesses that have medical information (not just health care providers and health plans) implement a staff training program.


Laws are not the only regulations that can affect businesses.

The New York State Department of Financial Services has proposed that “any Person operating under or required to operate under a license, registration, charter, certificate, permit, accreditation or similar authorization under the banking law, the insurance law or the financial services law” comply with new cyber security regulations. This includes banks, insurance companies, investment houses, charities, and even covers organizations like car dealers and mortgage companies who handle consumer financial information.

The new rule will require:

  • A risk analysis
  • An annual penetration test and quarterly vulnerability assessments
  • Implementation of a cyber event detection system
  • Appointment of a Chief Information Security Officer (with compliance responsibility retained even if the function is outsourced)
  • System logging and event management
  • A comprehensive security program including policies, procedures, and evidence of compliance

Any organization connected to the Texas Department of Health & Human Services must agree to its Data Use Agreement, which requires that a suspected breach of some of its information be reported within ONE HOUR of discovery.
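
To make the deadline problem concrete, an incident-response runbook can compute the earliest deadline across every obligation that applies. The sketch below encodes HIPAA’s 60-day outer limit and the one-hour Texas HHS contractual requirement described above; the table structure and entry names are illustrative assumptions, and real entries must be filled in from current law and contracts:

```python
from datetime import datetime, timedelta

# HIPAA's Breach Notification Rule allows up to 60 days; stricter
# obligations (state law or contract) override it. The Texas HHS
# entry reflects the one-hour contractual deadline described above.
HIPAA_DEADLINE_HOURS = 60 * 24
STRICTER_DEADLINES_HOURS = {
    "TX-HHS-DUA": 1,  # Texas HHS Data Use Agreement: one hour
    # add applicable state and contractual deadlines here
}

def earliest_report_deadline(discovered: datetime, obligations: list[str]) -> datetime:
    """Return the earliest reporting deadline among all applicable rules."""
    hours = min(
        [HIPAA_DEADLINE_HOURS]
        + [STRICTER_DEADLINES_HOURS[o] for o in obligations if o in STRICTER_DEADLINES_HOURS]
    )
    return discovered + timedelta(hours=hours)

discovered = datetime(2016, 10, 18, 9, 0)
print(earliest_report_deadline(discovered, ["TX-HHS-DUA"]))  # 2016-10-18 10:00:00
```

The point of the structure is that the plan always applies the strictest applicable deadline, rather than defaulting to the federal one.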


People often assume that their medical records are protected by HIPAA wherever they are, and are surprised to find out this is not the case. HIPAA only covers organizations that bill electronically for health care services, validate coverage, or act as health plans (which also includes companies that self-fund their health plans).

  • Doctors that only accept cash do not have to comply with HIPAA.
  • Companies like fitness centers and massage therapists collect your medical information but are not covered by HIPAA because they do not bill health plans.
  • Health information in employment records is exempt from HIPAA, such as letters from doctors excusing an employee after an injury or illness.
  • Workers Compensation records are exempt from HIPAA.

Some states protect medical information held by any entity that stores it. In those states, every business must protect the medical information it stores, and must report it if it is lost, stolen, or accessed by an unauthorized person.

  • Arkansas
  • California
  • Connecticut
  • Florida
  • Illinois (beginning January 1, 2017)
  • Massachusetts
  • Missouri
  • Montana
  • Nevada
  • New Hampshire
  • North Dakota
  • Oregon
  • Puerto Rico
  • Rhode Island
  • Texas
  • Virginia
  • Wyoming
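
This list can be encoded directly as a lookup table, so a breach-response runbook can flag whether a given state adds medical information to its breach-reporting scope. The abbreviations below correspond to the list above; treat it as an illustrative snapshot, since legislatures amend these laws frequently (the Illinois entry, for instance, takes effect January 1, 2017):

```python
# States and territories from the list above that extend breach
# protection to medical information held by any business.
STATES_PROTECTING_MEDICAL_INFO = {
    "AR", "CA", "CT", "FL", "IL", "MA", "MO", "MT", "NV",
    "NH", "ND", "OR", "PR", "RI", "TX", "VA", "WY",
}

def must_report_medical_breach(state: str) -> bool:
    """True if the state treats medical information as breach-reportable
    for every entity that stores it, per the list above."""
    return state.upper() in STATES_PROTECTING_MEDICAL_INFO

print(must_report_medical_breach("ca"))  # True
print(must_report_medical_breach("NY"))  # False
```

A real runbook would track each state's reporting deadline and attorney-general notification rules alongside this membership flag.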

Most organizations are not aware that they are governed by so many laws and regulations. They don’t realize that information about their employees and other workforce members is covered. Charities don’t realize the risks involved in protecting donor information, or the impact a breach can have on donations once it becomes public.

We have worked with many healthcare and financial organizations, as well as charities and general businesses, to build cyber security programs that comply with federal and state laws, industry regulations, contractual obligations, and insurance policy requirements. We have been certified in our compliance with the federal NIST Cyber Security Framework (CSF) and have helped others adopt this security framework, which is gaining rapid acceptance.

About Mike Semel
Mike Semel is the President and Chief Compliance Officer for Semel Consulting. He has owned IT businesses for over 30 years, has served as the Chief Information Officer for a hospital and a K-12 school district, and as the Chief Operating Officer for a cloud backup company. Mike is recognized as a HIPAA thought leader throughout the healthcare and IT industries, and has spoken at conferences including NASA’s Occupational Health conference, the New York State Cybersecurity conference, and many IT conferences. He has written HIPAA certification classes and consults with healthcare organizations, cloud services, Managed Service Providers, and other business associates to help build strong cybersecurity and compliance programs. Mike can be reached at 888-997-3635 x 101.

KPMG: Most Business Associates Not Ready For Security Standards

Posted on October 17, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

A new study by consulting firm KPMG has concluded that two-thirds of business associates aren’t completely ready to step up to industry demands for protecting patient health information. Specifically, the majority of business associates don’t seem to be ready to meet HITRUST standards for securing protected health information. Plus, it’s worth noting that HITRUST certification doesn’t mean your organization is HIPAA compliant or protected from a breach. It’s just a first step, and many aren’t taking it.

HITRUST has established a Common Security Framework which is used by healthcare organizations (as well as others that create, access, store or exchange sensitive and/or regulated data). The CSF includes a set of controls designed to harmonize the requirements of multiple regulations and standards.

According to KPMG’s Emily Frolick, third-party risk and assurance leader for KPMG’s healthcare practice, a growing number of healthcare organizations are asking their business associates to obtain a HITRUST CSF Certification or pass an SOC 2 + HITRUST CSF examination to demonstrate that they are making a good-faith effort to protect patient information. The CSF assessment is an internal-control-based approach that allows organizations such as business associates to assess and demonstrate the measures they are taking to protect healthcare data.

To see if vendors targeting the healthcare industry seemed capable of meeting these standards, KPMG surveyed 600 professionals in this category to determine their organization’s security status. The survey found that half of those responding weren’t ready for HITRUST examination or certification, while 17.4% were planning for the CSF assessment.

When asked how they were progressing toward meeting HITRUST CSF requirements, just 7% said they were completely ready. Meanwhile, 8% said their organization was well along in its implementation process, and 17.4% said they were in the early stages of CSF implementation.

One of the biggest barriers to CSF readiness seems to be having adequate staff in place, ranking ahead of cultural, technological and financial concerns, KPMG found. When asked whether they had the staff in place to meet the standard, 53% said they did, but 47% said they did not have “the right staff [with] the right level of skills to execute against the HITRUST CSF.” That being said, 27% said all four factors were at issue. (Interestingly, 23% said “none of the above” posed barriers to CSF readiness.)

Readers won’t be surprised to learn that KPMG has reason to encourage vendors to seek the HITRUST certification and examination: it works as a HITRUST Qualified CSF Assessor for healthcare organizations. Also, KPMG works with very large organizations, which need to establish high levels of structure in how they evaluate their health data security measures. Hopefully this means they go well beyond what HITRUST requires.

Nonetheless, even if you work with a relatively small healthcare organization that doesn’t have the resources to pursue formal healthcare security certifications, this discussion serves as a good reminder. Particularly given that many breaches take place due to slips by business associates, it doesn’t hurt to take a close look at their security practices now and then. Even asking them some commonsense questions about how they and their contractors handle data is a good idea. After all, even if a business associate causes a breach of your data, you still have to explain the breach to your patients.

HIPAA Cloud Bursts: New Guidance Proves Cloud Services Are Business Associates

Posted on October 10, 2016 | Written By

The following is a guest blog post by Mike Semel, President and Chief Compliance Officer at Semel Consulting.

It’s over. New guidance from the federal Office for Civil Rights (OCR) confirms that cloud services that store patient information must comply with HIPAA.

Many cloud services and data centers have denied their obligations by claiming they are not HIPAA Business Associates because:

  1. They have no access to their customer’s electronic Protected Health Information (ePHI),
  2. Their customer’s ePHI is encrypted and they don’t have the encryption key,
  3. They never look at their customer’s ePHI,
  4. Their customers manage the access to their own ePHI in the cloud,
  5. Their terms and conditions prohibit the storage of ePHI, and
  6. They only store ePHI ‘temporarily’ and therefore must be exempt as a ‘conduit.’

Each of these excuses has been debunked in HIPAA Cloud Guidance released on October 7, 2016, by the Office for Civil Rights.

The new guidance clearly explains that any cloud vendor that stores ePHI must:

  1. Sign a HIPAA Business Associate Agreement,
  2. Conduct a HIPAA Security Risk Analysis,
  3. Comply with the HIPAA Privacy Rule,
  4. Implement HIPAA Security Rule safeguards to protect the ePHI and ensure its confidentiality, integrity, and availability, and
  5. Comply with the HIPAA Breach Reporting Rule by reporting any breaches of ePHI to its customers, and be directly liable for breaches it has caused.

The OCR provides examples of cloud services where clients manage access to their stored data. It discusses how a client can manage its users’ access to the stored data, while the cloud service manages the security of the technical infrastructure. Each needs to have a risk analysis that relates to its share of the responsibilities.

OCR also recently published guidance that cloud services cannot block or terminate a client’s access to ePHI, for example, if they are in a dispute with their customer or the customer hasn’t paid its bill.

As we have been saying for years, the 2013 HIPAA Omnibus Final Rule expanded the definition of HIPAA Business Associates to include anyone outside a HIPAA Covered Entity’s workforce that “creates, receives, maintains, or transmits PHI” on behalf of the Covered Entity. It defines subcontractors as anyone outside of a Business Associate’s workforce that “creates, receives, maintains, or transmits PHI on behalf of another Business Associate.”

‘Maintains’ means storing ePHI, and does not distinguish whether the ePHI is encrypted, whether the Business Associate looks at the ePHI, or even whether its staff has physical access to the devices housing the ePHI (like servers stored in locked cabinets in a data center).

A small medical clinic was fined $100,000 for using a free cloud mail service to communicate ePHI, and for using a free online calendar to schedule patient visits. Recently the OCR issued a $2.7 million penalty against Oregon Health & Science University (OHSU), partly for storing ePHI with a cloud service in the absence of a Business Associate Agreement.

“OHSU should have addressed the lack of a Business Associate Agreement before allowing a vendor to store ePHI,” said OCR Director Jocelyn Samuels.  “This settlement underscores the importance of leadership engagement and why it is so critical for the C-suite to take HIPAA compliance seriously.”

So what does this mean to you?

If you are Covered Entity or a Business Associate…

  • A common myth is that all ePHI lives in a structured system like an Electronic Health Record system. This is wrong: ePHI includes any information that identifies a patient, nursing home resident, or health plan member (there are many more identifiers than just a name) and relates to the treatment, diagnosis, or payment for health care.

    ePHI can be in many forms. It does not have to be in a formal system like an Electronic Health Record (EHR) system; it can be contained in an e-mail, document, spreadsheet, scanned or faxed image, medical image, photograph, or even a voice file, like a patient leaving a message in your computerized phone system requesting a prescription refill. During our risk analyses we find ePHI everywhere: on servers, local devices, portable media, mobile devices, and on cloud services. Our clients are usually shocked when we show them where their ePHI is hiding.

  • Never store ePHI in any cloud service without first knowing that the service is compliant with HIPAA and will sign a HIPAA Business Associate Agreement.

    This automatically disqualifies:

    • The free texting that came with your cellular phone service;
    • Free e-mail services like Gmail, Yahoo!, Hotmail, etc.;
    • Free e-mail from your Internet service provider like Cox, Comcast, Time Warner, Charter, CenturyLink, Verizon, Frontier, etc.;
    • Free file sharing services like Dropbox, Google Drive, etc.;
    • Consumer-grade online backup services.


  • Another common myth is that if data is stored in the cloud, you don’t have to secure your local devices. This is wrong, because if someone can compromise a local device they can gain access to your data in the cloud. Be sure the mobile devices and local devices you use to access the cloud are properly protected, including those on your office network and at users’ homes. This means that all mobile devices like phones and tablets, PCs, and laptops should be secured to prevent unauthorized access. All devices should be constantly updated with security patches, and anti-virus/anti-malware software should be installed and current. If ePHI is stored on a local network, it must be on a domain with logging turned on and logs retained for six years.
  • Use an e-mail service that complies with HIPAA. Microsoft Office 365 and similar business-class services advertise that they provide secure communications and will sign a HIPAA Business Associate Agreement.
  • You may be using a vendor to remotely filter your e-mail before it arrives in your e‑mail system. These services often retain a copy of each message so it can be accessed in the event your mail server goes down. Make sure your spam filtering service secures your messages and will sign a HIPAA Business Associate Agreement.


  • Never send or text ePHI, even encrypted, to a caregiver or business associate at one of the free e-mail services.
  • Never use the free texting that came with your cell service to communicate with patients and other caregivers.
  • If you have sent text messages, e-mails, or stored documents containing ePHI using an unapproved service, delete those messages now, and talk with your compliance officer.
  • Review your HIPAA compliance program, to ensure it really meets all of HIPAA’s requirements under the Privacy, Security, and Data Breach Reporting rules. There are 176 auditable HIPAA items. You may also need to comply with other federal and state laws, plus contractual and insurance requirements.
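
As a purely hypothetical sketch of the discovery work described above, a first pass at finding ePHI hiding in loose documents might grep for identifier patterns such as Social Security numbers. A real risk analysis uses far more identifiers and context than a regex, so treat this only as an illustration of the idea:

```python
import re
from pathlib import Path

# Crude pattern for SSN-shaped strings (illustrative only; real ePHI
# discovery tools check many more identifiers plus surrounding context).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_for_ssn_like(text: str) -> list[str]:
    """Return SSN-shaped strings found in a document's text."""
    return SSN_PATTERN.findall(text)

def scan_tree(root: str) -> dict[str, int]:
    """Count SSN-like matches per text file under a directory."""
    hits = {}
    for path in Path(root).rglob("*.txt"):
        matches = scan_for_ssn_like(path.read_text(errors="ignore"))
        if matches:
            hits[str(path)] = len(matches)
    return hits

print(scan_for_ssn_like("Patient ref 123-45-6789, see attached letter."))  # ['123-45-6789']
```

Matches like these only flag candidates for human review; whether a hit is actually ePHI depends on whether the document relates to treatment, diagnosis, or payment for health care.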

If you are a cloud service, data center, or IT Managed Service Provider …

  • If you have been denying that you are a HIPAA Business Associate, read the new guidance document and re-evaluate your decisions.
  • If you do sign HIPAA Business Associate Agreements, you need to review your internal HIPAA compliance program to ensure that it meets all of the additional requirements in the HIPAA Privacy, Security, and Data Breach Reporting rules.
  • Also become familiar with state regulations that protect personally identifiable information, including driver’s license numbers, Social Security numbers, credit card and banking information. Know which states include protection of medical information, which will require breach reporting to the state attorney general in addition to the federal government. Know what states have more stringent reporting timeframes than HIPAA. You may have to deal with a large number of states with varying laws, depending on the data you house for customers.


  • Make sure your Service Level Agreements and Terms & Conditions are not in conflict with the new guidance about blocking access to ePHI. Compare your policies for non-payment with the new guidance prohibiting locking out access to ePHI.
  • Make sure your Service Level Agreements and Terms & Conditions include how you will handle a breach caused by your clients when they are using your service. Everyone should know what will happen, and who pays, if you get dragged into a client’s data breach investigation.
  • Make sure all of your subcontractors, and their subcontractors, comply with HIPAA. This includes the data centers you use to house and/or manage your infrastructure, programmers, help desk services, and backup vendors.
  • Learn about HIPAA. We see many cloud vendors that promote their HIPAA compliance but can seldom answer even the most basic questions about the compliance requirements. Some believe they are compliant because they sign Business Associate Agreements. That is just the first step in a complex process to properly secure data and comply with the multiple regulations that affect you. We have helped many cloud services build compliance programs that protected them against significant financial risks.
  • If you have administrative access to your client’s networks that contain ePHI, you are a Business Associate. Even if your clients have not signed, or refused to sign, Business Associate Agreements, you are still a Business Associate and must follow all of the HIPAA rules.
  • If you are reselling hosting services, co-location services, cloud storage, file sharing, online backup, Office 365/hosted Exchange, e-mail encryption, or spam filtering, you need to make sure your vendors are all compliant with HIPAA and that they will sign a Business Associate Agreement with you.
  • Look at all the services your regulated clients need. Include in your project and managed service proposals clear links between your clients’ needs and your services. For example, when installing replacement equipment, describe in detail the steps you will take to properly wipe and dispose of devices being replaced that have stored any ePHI. Link your managed services to your client’s needs and include reports that directly tie to your clients’ HIPAA requirements.

About Mike Semel
Mike Semel is the President and Chief Compliance Officer for Semel Consulting. He has owned IT businesses for over 30 years, has served as the Chief Information Officer for a hospital and a K-12 school district, and as the Chief Operating Officer for a cloud backup company. Mike is recognized as a HIPAA thought leader throughout the healthcare and IT industries, and has spoken at conferences including NASA’s Occupational Health conference, the New York State Cybersecurity conference, and many IT conferences. He has written HIPAA certification classes and consults with healthcare organizations, cloud services, Managed Service Providers, and other business associates to help build strong cybersecurity and compliance programs. Mike can be reached at 888-997-3635 x 101.

A Look At Vendor IoT Security And Vulnerability Issues

Posted on October 5, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Much of the time, when we discuss the Internet of Things, we’re looking at issues from an end-user perspective.  We talk about the potential for IoT options like mobile medical applications and wearable devices, and ponder how to connect smart devices to other nodes like the above to offer next-generation care. Though we’re only just beginning to explore such networking models, the possibilities seem nearly infinite.

That being said, most of the responsibility for enabling and securing these devices still lies with the manufacturers, as healthcare networks typically don’t integrate fully with IoT devices as of yet.

So I was intrigued to find a recent article in Dark Reading which lays out some security considerations manufacturers of IoT devices should keep in mind. Not only do the suggestions give you an idea of how vendors should be thinking about vulnerabilities, they also offer some useful insights for healthcare organizations.

Security researcher Lysa Myers offers IoT device-makers several recommendations to consider, including the following:

  • Notify users of any changes to device features. In fact, it may make sense to remind them repeatedly of significant changes, or they may simply ignore them out of habit.
  • Put a protocol in place for handling vulnerability reports, and display your vulnerability disclosure policy prominently on your website. Ideally, Myers notes, makers of IoT medical devices should send vulnerability reports to the FDA.
  • When determining how to handle a vulnerability issue, let the most qualified person decide what should happen. In the case of automated medical diagnosis, for example, the right person would probably be a doctor.
  • Make it quick and easy to update IoT device software when you find an error. Also, make it simple for customers to spot fraudulent updates.
  • Create an audit log for all devices, even those that might seem too mundane to interest criminals, as even the least important of devices can assist criminals in launching a DDoS attack or spamming.
  • See to it that users can tell when the changes made to an IoT device’s software are made by the authorized user or a designated representative rather than a cybercriminal or other inappropriate person.
  • Given that many IoT devices require cloud-based services to operate, it’s important to see that end users aren’t dropped abruptly with no cloud alternative. Manufacturers should give users time to transition their service if discontinuing a device, going out of business or otherwise ending support for their own cloud-based option.
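
To illustrate the audit-log recommendation above, a minimal sketch might record who changed what on a device, so that unauthorized changes stand out during monitoring. The class and field names below are illustrative assumptions, not any vendor's API:

```python
import json
from datetime import datetime, timezone

class DeviceAuditLog:
    """Append-only in-memory audit trail for an IoT device.
    A production log would be tamper-evident and persisted off-device."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.entries: list[dict] = []

    def record(self, actor: str, action: str, authorized: bool) -> None:
        # Each entry captures who did what, when, and whether the
        # change came from an authorized user.
        self.entries.append({
            "device": self.device_id,
            "time": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
            "authorized": authorized,
        })

    def suspicious(self) -> list[dict]:
        """Entries not made by an authorized user -- the events a
        monitoring routine should surface."""
        return [e for e in self.entries if not e["authorized"]]

log = DeviceAuditLog("infusion-pump-07")
log.record("nurse_a", "firmware update", authorized=True)
log.record("unknown", "config change", authorized=False)
print(json.dumps(log.suspicious(), indent=2))
```

Even mundane devices benefit from this kind of trail, since the log is what lets you distinguish a legitimate update from a cybercriminal's change after the fact.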

If we take a high-level look at these recommendations, there are a few common themes to consider:

Awareness:  Particularly in the case of IoT devices, it’s critical to raise awareness among both technical staffers and users of changes, both in features and security configurations.

Protection:  It’s becoming more important every day to protect IoT devices from attacks, and to see to it that they are configured properly to avoid security and continuity failures. Also, see to it that these devices are protected from outages caused by vendor issues.

Monitoring:  Health IT leaders should find ways to integrate IoT devices into their monitoring routine, tracking their behavior, the state of security updates to their software and any suspicious user activity.

As the article suggests, IoT device-makers probably need to play a large role in helping healthcare organizations secure these devices. But clearly, healthcare organizations need to do their part if they hope to maintain these devices successfully as health IT models change.

Security and Privacy Are Pushing Archiving of Legacy EHR Systems

Posted on September 21, 2016 | Written By

John Lynn is the Founder of the blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of and John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

A recent McAfee Labs Threats Report states that “On average, a company detects 17 data loss incidents per day.” That stat is almost too hard to comprehend. No doubt it makes HIPAA compliance officers’ heads spin.

What’s even more disturbing from a healthcare perspective is that the report identifies hospitals as the easy targets for ransomware and that the attacks are relatively unsophisticated. Plus, one of the biggest healthcare security vulnerabilities is legacy systems. This is no surprise to me since I know so many healthcare organizations that set aside, forget about, or de-prioritize security when it comes to legacy systems. Legacy system security is the ticking time bomb of HIPAA compliance for most healthcare organizations.

In a recent EHR archiving infographic and archival whitepaper, Galen Healthcare Solutions highlighted that “50% of health systems are projected to be on second-generation technology by 2020.” From a technology perspective, we’re all saying that it’s about time we shift to next generation technology in healthcare. However, from a security and privacy perspective, this move is really scary. This means that 50% of health systems are going to have to secure legacy healthcare technology. If you take into account smaller IT systems, 100% of health systems have to manage (and secure) legacy technology.

Unlike other industries, where you can simply decommission legacy systems, healthcare is governed by federal and state laws that require retention of health data for lengthy periods of time. Galen Healthcare Solutions’ infographic offered this great chart to illustrate the legacy healthcare system retention requirements across the country:

Every healthcare CIO had better have a solid strategy for how they’re going to deal with legacy EHR and other health IT systems. This includes ensuring easy access to legacy data along with ensuring that the legacy system is secure.

While many health systems used to leave their legacy systems running off in the corner of their data center, or on a random desk in their hospital, I’m seeing more and more healthcare organizations consolidating their EHR and health IT systems into some sort of healthcare data archive. Galen Healthcare Solutions has put together a really impressive whitepaper that dives into all the details associated with healthcare data archives.

There are a lot of advantages to a healthcare data archive. It retains the data to meet record retention laws, provides easy access to the data for end users, and simplifies security, since you then only have to secure one health data archive instead of multiple legacy systems. While some think that EHR data archiving is expensive, it turns out that the ROI is much better than you’d expect when you factor in the maintenance costs of legacy systems, the security risks of these outdated systems, and the other compliance and access issues that come with them.
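
The ROI argument comes down to simple arithmetic. Every dollar figure below is a hypothetical placeholder, not a number from Galen's whitepaper; substitute your own maintenance and archiving quotes:

```python
# Hypothetical figures -- none of these numbers come from the article.
legacy_annual_cost = 80_000   # licensing, hardware, support per legacy system
legacy_system_count = 3
archive_setup_cost = 150_000  # one-time migration into a single archive
archive_annual_cost = 30_000  # hosting and support for the archive
retention_years = 10          # mandated retention period

# Total cost of keeping every legacy system alive for the retention period
keep_legacy = legacy_annual_cost * legacy_system_count * retention_years
# Total cost of migrating once and running a single archive instead
use_archive = archive_setup_cost + archive_annual_cost * retention_years

print(f"Run legacy systems: ${keep_legacy:,}")  # $2,400,000
print(f"Archive instead:    ${use_archive:,}")  # $450,000
```

Under these placeholder assumptions the archive wins by a wide margin, and that is before pricing in the breach risk of unpatched legacy systems.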

I have no doubt that as EHR vendors and health IT systems continue consolidating, we’re going to have an explosion of legacy EHR systems that need to be managed and dealt with by every healthcare organization. Those organizations that treat this lightly will likely pay the price when their legacy systems are breached and their organization is stuck in the news for all the wrong reasons.

Galen Healthcare Solutions is a sponsor of the Tackling EHR & EMR Transition Series of blog posts on Hospital EMR and EHR.

Will a Duo of AI and Machine Learning Catch Data Thieves Lurking in Hospital EHR Corridors?

Posted on September 19, 2016 | Written By

The following is a guest blog post by Santosh Varughese, President of Cognetyx, an organization devoted to using artificial intelligence and machine learning innovation to bring an end to the theft of patient medical data.

As Halloween approaches, the usual spate of horror movies will intrigue audiences across the US, replete with slashers named Jason or Freddie running amuck in the corridors of all too easily accessible hospitals. They grab a hospital gown and the zombies fit right in. While this is just a movie you can turn off, the real horror of patient data theft can follow you.

(I know how terrible this type of crime can be. I myself have been the victim of a data theft by hackers who stole my deceased father’s medical files, running up more than $300,000 in false charges. I am still disputing ongoing bills that have been accruing for the last 15 years.)

Unfortunately, this horror movie scenario is similar to how data thefts often occur at medical facilities. In 2015, the healthcare industry was one of the top three industries hardest hit by serious data breaches and major attacks, along with government and manufacturing. Healthcare records are packed with a wealth of exploitable information, such as credit card data, email addresses, Social Security numbers, employment information, and medical history records, much of which will remain valid for years, if not decades, and fetch a high price on the black market.

Who Are The Hackers?
It is commonly believed that attacks come from outside intruders looking to steal valuable patient data, and about 45 percent of hacks are indeed external. However, the “phantom” hackers are often your own colleagues, employees and business associates, who are unwittingly careless with passwords or lured by phishing schemes that open the door for data thieves. The result is not just stolen data but insidious privacy violations.

The problem is not only high-tech, but also low-tech, requiring that providers across the continuum simply become smarter about data protection and privacy issues. Medical facilities are finding they must teach doctors and nurses not to click on suspicious links.

For healthcare consultants, this is a great opportunity not only to help end an industry-wide problem, but to build a client base by implementing new technologies that help medical facilities stop data theft. With EHRs more vulnerable than ever before, CIOs and CISOs are looking for new solutions. These range from thwarting accidental and purposeful hackers with physical security procedures to securing network hardware and storage media through measures such as maintaining a visitor log, installing security cameras, limiting physical access to server rooms and restricting the ability to remove devices from secure areas.

Of course, enterprise solutions that cover the entire hospital system are the best way to cast a digital safety net over all IT operations, leaving administrators and patients with a sense of security and safety.

Growing Nightmare
Medical data theft is a growing national nightmare.  IDC’s Health Insights group predicts that 1 in 3 healthcare recipients will be the victim of a medical data breach in 2016.  Other surveys found that in the last two years, 89% of healthcare organizations reported at least one data breach, with 79% reporting two or more breaches. The most commonly compromised data are medical records, followed by billing and insurance records. The average cost of a healthcare data breach is about $2.2 million.

At health insurer Anthem, Inc., foreign hackers stole up to 80 million records, using social engineering to dig their way into the company’s network with the credentials of five tech workers. The hackers stole names, Social Security numbers and other sensitive information, but were thwarted when an Anthem systems administrator discovered outsiders were using his own security credentials to log into the company system and hack databases.

Investigators believe the hackers compromised the tech workers’ security through a phishing scheme that tricked an employee into unknowingly revealing a password or downloading malicious software. Using this login information, they were able to access the company’s database and steal files.

Healthcare Hacks Spread Hospital Mayhem in Diabolical Ways
Not only is current patient data security an issue; thieves can also drain the electronic economic blood from hospitals’ jugular vein – their IT systems. Hospitals increasingly rely on cloud delivery of big enterprise data from start-ups like iCare that can predict epidemics, cure disease, and avoid preventable deaths. They also connect Personal Health Record apps from fitness devices like Fitbit and Jawbone to these systems.

Banner Health, operating 29 hospitals in Arizona, had to notify millions of individuals that their data was exposed. The breach began when hackers gained access to payment card processing systems at some of its food and beverage outlets. That apparently also opened the door to the attackers accessing a variety of healthcare-related information.

Because Banner Health says its breach began with an attack on payment systems, it differs from other recent hacker breaches. While payment system attacks have plagued the retail sector, they are almost unheard of among healthcare entities.

What also makes this breach more concerning is the question of how hackers accessed healthcare systems after breaching payment systems at food and beverage facilities, when these networks should be completely separated from one another. Healthcare system networks are very complex and become more complicated as other business functions are added to the infrastructure – even those that don’t necessarily have anything to do with handling protected health information.

Who hasn’t heard of “ransomware”? The first reported attack was on Hollywood Presbyterian Medical Center, which had its EHR and clinical information systems shut down for more than a week. The systems were restored after the hospital paid $17,000 in Bitcoin.

Will Data Thieves Also Rob Us of Advances in Healthcare Technology?
Is the data theft at MedStar Health, a major healthcare system in the DC region, a foreboding sign that an industry racing to digitize and interoperate EHRs is facing a new kind of security threat that it is ill-equipped to handle? Hospitals are focused on keeping patient data from falling into the wrong hands, but attacks at MedStar and other hospitals highlight an even more frightening downside of security breaches as hospitals strive for IT interoperability. Is that goal now at risk?

As hospitals increasingly depend on EHRs and other IT systems to coordinate care, communicate critical health data and avoid medication errors, they could also be risking patients’ well-being when hackers strike. While chasing the latest medical innovations, healthcare facilities are rapidly learning that caring for patients also means protecting their medical records and technology systems against theft and privacy violations.

“We continue the struggle to integrate EHR systems,” says anesthesiologist Dr. Donald M. Voltz, Medical Director of the Main Operating Room at Aultman Hospital in Canton, OH, and an advocate and expert on EHR interoperability. “We can’t allow patient data theft and privacy violations to become an insurmountable problem and curtail the critical technology initiative of resolving health system interoperability. Billions have been pumped into this initiative and it can’t be risked.”

Taking Healthcare Security Seriously
Healthcare is an easy target. Its security systems tend to be less mature than those of other industries, such as finance and tech. Its doctors and nurses depend on data to perform time-sensitive and life-saving work.

Where a financial-services firm might spend a third of its budget on information technology, hospitals spend only about 2% to 3%. Healthcare providers are averaging less than 6% of their information technology budget expenditures on security, according to a recent HIMSS survey. In contrast, the federal government spends 16% of its IT budget on security, while financial and banking institutions spend 12% to 15%.

Meanwhile, the number of healthcare attacks over the last five years has increased 125%, as the industry has become an easy target. Personal health information is 50 times more valuable on the black market than financial information. Stolen patient health records can fetch as much as $363 per record.

“If you’re a hacker… would you go to Fidelity or an underfunded hospital?” says John Halamka, the chief information officer of Beth Israel Deaconess Medical Center in Boston. “You’re going to go where the money is and the safe is the easiest to open.”

Many healthcare executives believe that the healthcare industry is at greater risk of breaches than other industries. Despite these concerns, many organizations have either decreased their cyber security budgets or kept them the same. While the healthcare industry has traditionally spent a small fraction of its budget on cyber defense, it has also not shored up its technical systems against hackers.

Disrupting the Healthcare Security Industry with Behavior Analysis   
Common defenses for keeping patient data safe have included firewalls and keeping the organization’s operating systems, software, anti-virus packages and other protective solutions up to date. The task of constantly patching security holes is ongoing and will invariably be less than 100% effective at any given time. And with only about 10% of healthcare organizations having avoided a data breach, sophisticated hackers are clearly penetrating these perimeter defenses and winning the healthcare data security war. So it’s time for a disruption.

Many organizations employ network surveillance tactics to prevent the misuse of login credentials. These rely on behavior analysis, a technique the financial industry uses to detect credit card fraud. With some leading-edge additions, behavior analysis can offer C-suite healthcare executives a cutting-edge, game-changing tool.

The technology relies on the proven power of the cloud to combine artificial intelligence with machine learning algorithms, creating and deploying “digital fingerprints” that use ambient cognitive cyber surveillance to cast a net over EHRs and other hospital data sanctuaries. It exposes deviations in user behavior around EHRs and other applications containing PHI that humans would miss. It can not only augment current defenses against outside hackers and malicious insiders, but also flag problem employees who repeatedly violate cyber security policy.

“Hospitals have been hit hard by data theft,” said Doug Brown, CEO, Black Book Research. “It is time for them to consider new IT security initiatives. Harnessing machine learning artificial intelligence is a smart way to sort through large amounts of data. When you unleash that technology collaboration, combined with existing cloud resources, the security parameters you build for detecting user pattern anomalies will be difficult to defeat.”

While the technology is advanced, the concept is simple. A pattern of user behavior is established, and any action that deviates from it, such as logging in from a new location or accessing a part of the system the user normally doesn’t touch, is flagged. Depending on the deviation, the user may be required to provide further authentication to continue, or may be blocked until a system administrator can investigate the issue.
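The baseline-and-deviation logic just described can be sketched in a few lines. This is a toy illustration under my own assumptions (a per-user baseline of location/system-area pairs), not Cognetyx’s actual product; real behavior analysis builds statistical or machine-learned models over far richer signals.

```python
# Toy behavior-analysis sketch: flag access attempts that deviate from a
# user's established baseline. Illustrative only -- not any vendor's design.
from collections import defaultdict

class BehaviorMonitor:
    def __init__(self):
        # Baseline of (location, system area) pairs observed per user.
        self.baseline = defaultdict(set)

    def record(self, user, location, area):
        """Add an observed access to the user's behavioral baseline."""
        self.baseline[user].add((location, area))

    def check(self, user, location, area):
        """Decide how to treat a new access attempt.

        'allow'        -> matches established behavior
        'step-up-auth' -> known area from a new location: require further
                          authentication to continue
        'hold'         -> a part of the system the user never touches:
                          block until an administrator investigates
        """
        seen = self.baseline[user]
        known_locations = {loc for loc, _ in seen}
        known_areas = {a for _, a in seen}
        if (location, area) in seen:
            return "allow"
        if area in known_areas and location not in known_locations:
            return "step-up-auth"
        return "hold"

monitor = BehaviorMonitor()
monitor.record("nurse7", "ward-3-workstation", "medication-orders")
print(monitor.check("nurse7", "ward-3-workstation", "medication-orders"))  # allow
print(monitor.check("nurse7", "home-vpn", "medication-orders"))            # step-up-auth
print(monitor.check("nurse7", "ward-3-workstation", "billing-records"))    # hold
```

The graded response mirrors the article’s point: a deviation doesn’t have to mean an outright block; lighter anomalies can simply trigger extra authentication.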

The cost of this technology will be positively impacted by the continuing decline in the cost of storage and processing power from cloud computing giants such as Amazon Web Services, Microsoft and Alphabet.

The healthcare data security war can be won, but it will require action and commitment from the industry. In addition to allocating adequate human and monetary resources to information security and training employees on best practices, the industry would do well to implement network surveillance that includes behavior analysis. It is the single best technological defense against the misuse of medical facility systems and the most powerful weapon the healthcare industry has in its war against cyber criminals.

Mobile Health App Makers Still Shaky On Privacy Policies

Posted on September 16, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

A new study has concluded that while mobile health app developers are developing better privacy practices, these developers vary widely in how they share those policies with consumers. The research, part of a program launched in 2011 by the Future of Privacy Forum, concludes that while mHealth app makers have improved their practices, too many are still not as clear as they could be with users as to how they handle private health information.

This year’s FPF Mobile App Study notes that mHealth players are working to make privacy policies available to users before purchase or download, by posting links on the app listing page. It probably has helped that the two major mobile health app distribution sites require apps that collect personal info to have a privacy policy in place, but consumer and government pressure has played a role as well, the report said. According to FPF researchers, mHealth app makers are beginning to explain how personal data is collected, used and shared, a step privacy advocates see as the bare minimum standard.

Researchers found that this year, 76% of top overall apps on the iOS App Store and Google Play had a privacy policy, up from 68% noted in the previous iteration of the study. In contrast, only 61% of health and fitness apps surveyed this year included a link to their privacy policies in their app store listing, 10% less than among top apps cutting across all categories.  “Given that some health and fitness apps can access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device, their below-average performance is both unexpected and troubling,” the report noted.

This disquieting lack of thorough privacy protections extended even to apps collecting some of the most intimate data, the FPF report pointed out. In particular, a subset of mHealth developers aren’t doing anything much to make their policies accessible.

For example, researchers found that while 80% of apps helping women track periods and fertility across Google Play and the iOS App Store had privacy policies, just 63% of the apps had posted links to those policies. In another niche, sleep tracking apps, only 66% even had a privacy policy in place, and just 54% linked to it from their store page. (FPF terms this level of performance “dismal,” and it’s hard to disagree.)

Underlying this analysis is the unfortunate truth that there’s still no gold standard for mHealth privacy policies. This may be due more to the complexity of the still-maturing mobile health ecosystem than resistance to creating robust policies, certainly. But either way, this issue won’t go away on its own, so mHealth app developers will need to give their privacy strategy more thought.

What Would a Patient-Centered Security Program Look Like? (Part 2 of 2)

Posted on August 30, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous part of this article laid down a basic premise that the purpose of security is to protect people, not computer systems or data. Let’s continue our exploration of internal threats.

Security Starts at Home

Before we talk about firewalls and anomaly detection for breaches, let’s ask why hospitals, pharmacies, insurers, and others can spread data from health care records on their own, selling it (supposedly de-identified) to all manner of third parties, without patient consent or any benefit to the patient.

This is a policy issue that calls for involvement by a wide range of actors throughout society, of course. Policy-makers have apparently already decided that it is socially beneficial–or at least the most feasible course economically–for clinicians to share data with partners helping them with treatment, operations, or payment. There are even rules now requiring those partners to protect the data. Policy-makers have further decided that de-identified data sharing is beneficial to help researchers and even companies using it to sell more treatments. What no one admits is that de-identification lies on a slope–it is not an all-or-nothing guarantee of privacy. The more widely patient data is shared, the more risk there is that someone will break the protections, and that someone’s motivation will change from relatively benign goals such as marketing to something hostile to the patient.

Were HIMSS to take a patient-centered approach to privacy, it would also ask how credentials are handed out in health care institutions, and who has the right to view patient data. How do we minimize the chance of a Peeping Tom looking at a neighbor’s record? And what about segmentation of data, so that each clinician can see only what she needs for treatment? Segmentation has been justly criticized as impractical, but observers have been asking for it for years and there’s even an HL7 guide to segmentation. Even so, it hasn’t proceeded past the pilot stage.

Nor does it make sense to talk about security unless we talk about the rights of patients to get all their data. Accuracy is related to security, and this means allowing patients to make corrections. I don’t know what I think would be worse: perfectly secure records that are plain wrong in important places, or incorrect assertions being traded around the Internet.

Patients and the Cloud

HIMSS did not ask respondents whether they stored records at their own facilities or in third-party services. For a while, trust in the cloud seemed to enjoy rapid growth–from 9% in 2012 to 40% in 2013. Another HIMSS survey found that 44% of respondents used the cloud to host clinical applications and data–but that was back in 2014, so the percentage has probably increased since then. (Every survey measures different things, of course.)

But before we investigate clinicians’ use of third parties, we must consider taking patient data out of clinicians’ hands entirely and giving it back to patients. Patients will need security training of their own, under those conditions, and will probably use the cloud to avoid catastrophic data loss. The big advantage they have over clinicians, when it comes to avoiding breaches, is that their data will be less concentrated, making it harder for intruders to grab a million records at one blow. Plenty of companies offer personal health records with some impressive features for sharing and analytics. An open source solution called HEART, described in another article, is in the works.

There’s good reason to believe that data is safer in the cloud than on local, network-connected systems. For instance, many of the complex technologies mentioned by HIMSS (network monitoring, single sign on, intrusion detection, and so on) are major configuration tasks that a cloud provider can give to its clients with a click of a button. More fundamentally, hospital IT staffs are burdened with a large set of tasks, of which security is one of the lowest-priority because it doesn’t generate revenue. In contrast, IT staff at the cloud environment spend gobs of time keeping up to date on security. They may need extra training to understand the particular regulatory requirements of health care, but the basic ways of accessing data are the same in health care as any other industry. Respondents to the HIMSS survey acknowledged that cloud systems had low vulnerability (p. 6).

There won’t be any more questions about encryption once patients have their data. When physicians want to see it, they will have to do so over an encrypted path. Even Edward Snowden unreservedly boasted, “Encryption works.”

Security is a way of behaving, not a set of technologies. That fundamental attitude was not addressed by the HIMSS survey, and might not be available through any survey. HIMSS treated security as a routine corporate function, not as a patient right. We might ask the health care field different questions if we returned to the basic goal of all this security, which is the dignity and safety of the patient.

We all know the health record system is broken, and the dismal state of security is one symptom of that failure. Before we invest large sums to prop up a bad record system, let’s re-evaluate security on the basis of a realistic and respectful understanding of the patients’ rights.

2.7 Million Reasons Cloud Vendors and Data Centers ARE HIPAA Business Associates

Posted on July 25, 2016 | Written By

The following is a guest blog post by Mike Semel, President of Semel Consulting.
Some cloud service providers and data centers have been in denial that they are HIPAA Business Associates. They refuse to sign Business Associate Agreements and comply with HIPAA.

Their excuses:

“We don’t have access to the data so we aren’t a HIPAA Business Associate.”

“The data is encrypted so we aren’t a HIPAA Business Associate.”

Cloud and hosted phone vendors claim “We are a conduit where the data just passes through us temporarily so we aren’t a HIPAA Business Associate.”

“We tell people not to store PHI in our cloud so we aren’t a HIPAA Business Associate.”

Wrong. Wrong. Wrong. And Wrong.

2.7 million reasons Wrong.
Oregon Health & Science University (OHSU) just paid $2.7 million to settle a series of HIPAA data breaches “including the storage of the electronic protected health information (ePHI) of over 3,000 individuals on a cloud-based server without a business associate agreement.”

Another recent penalty cost a medical practice $750,000 for sharing PHI with a vendor without having a Business Associate Agreement in place.

The 2013 changes to HIPAA published in the Federal Register state (with our emphasis) that:

“…we have modified the definition of “business associate” to generally provide that a business associate includes a person who “creates, receives, maintains, or transmits” protected health information on behalf of a covered entity.

…an entity that maintains protected health information on behalf of a covered entity is a business associate and not a conduit, even if the entity does not actually view the protected health information.  We recognize that in both situations, the entity providing the service to the covered entity has the opportunity to access the protected health information.  However, the difference between the two situations is the transient versus persistent nature of that opportunity.  For example, a data storage company that has access to protected health information (whether digital or hard copy) qualifies as a business associate, even if the entity does not view the information or only does so on a random or infrequent basis.” 

A cloud service doesn’t need access to PHI to be a Business Associate – it just needs to manage or store it. These vendors must secure PHI and sign Business Associate Agreements.

The free, consumer-grade versions of Dropbox and Google Drive are not HIPAA compliant. But the fee-based cloud services, which utilize higher levels of security and for which the vendor will sign a Business Associate Agreement, are OK to use. Dropbox Business and Google Apps cost more but provide both security and HIPAA compliance. Make sure you select the right service for PHI.

Encryption is a great way to protect health information, because the data is secure and the HIPAA Breach Notification Rule says that encrypted data that is lost or stolen is not a reportable breach.

However, encrypting data is not an exemption to being a Business Associate. Besides, many cloud vendors that deny they have access to encrypted data really do.

I know because I was the Chief Operating Officer for a cloud backup company. We told everyone that the client data was encrypted and we could not access it. The problem was that when someone had trouble recovering their data, the first thing our support team asked for was the encryption keys so we could help them. For medical clients, that gave us access to unencrypted PHI.

I also know of situations where data was supposed to be encrypted but, because of human error, made it to the cloud unencrypted.

Simply remembering that Business Associates are covered in the HIPAA Privacy Rule while encryption is discussed in the Breach Notification Rule is an easy way to understand that encryption doesn’t cancel out a vendor’s status as a Business Associate.
Data Centers
A “business associate” also is a subcontractor that creates, receives, maintains, or transmits protected health information on behalf of another business associate.

Taken together, a cloud vendor that stores PHI, and the data centers that house servers and storage devices, are all HIPAA Business Associates. If you have your own servers containing PHI in a rack at a data center, that makes the data center a HIPAA Business Associate. If you use a cloud service for offsite backups, or file sharing, they and their data centers are Business Associates.

Most data centers offer “Network Operations Center (NOC) services” – an on-site IT department that can go to a server rack to perform services, so you don’t have to travel (sometimes across the country) to fix a problem. One data center manager was denying his staff had access to the servers locked in racks and cages even as we watched his NOC services technician open a locked rack to restart a client server.

Our client, who had its servers containing thousands of patient records housed in that data center, used the on-site NOC services when their servers needed maintenance or just to be manually restarted.
Cloud-Based and Hosted Phone Services
In the old days, a voice message left on a phone system was not tied to computers. Faxes were paper-in and paper-out between two fax machines.

HIPAA defines a conduit as a business that simply passes PHI and ePHI through its system – like the post office, FedEx, UPS, phone companies and Internet Service Providers that merely transport data and never store it. Paper-based faxing was exempt from HIPAA.

One way the world has changed is that Voice Over Internet Protocol (VOIP) systems, whether local or cloud-based, convert voice messages containing PHI into data files, which can then be stored for access through a portal, phone, or mobile device, or attached to an e-mail.

Another change is that faxing PHI now creates an image file, which is transmitted through a fax number to a computer system that stores it for access through a portal, or attaches it to an e-mail.

Going back to the Federal Register statement that persistence of storage is the qualifier for being a Business Associate, the fact that data files containing PHI are stored at the phone service means the vendor is a Business Associate. It doesn’t matter that the PHI started out as voice messages or faxes.
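The persistence test quoted earlier from the Federal Register can be reduced to a tiny decision function. This is an illustrative encoding of the rule as described, not legal advice, and the function name and parameters are my own:

```python
# Toy encoding of the 2013 Federal Register rule: a vendor that persistently
# maintains PHI is a Business Associate even if it never views the data,
# while a pure conduit (transient transmission only) is not.
def is_business_associate(stores_phi_persistently: bool,
                          views_phi: bool) -> bool:
    """Return True if the vendor qualifies as a HIPAA Business Associate.

    Persistent storage alone is enough -- viewing the data is irrelevant.
    A vendor that neither stores nor views PHI (e.g. an ISP or phone
    company that merely transports it) is a conduit, not a BA.
    """
    return stores_phi_persistently or views_phi

# A cloud backup service storing encrypted PHI it claims it cannot read:
print(is_business_associate(stores_phi_persistently=True, views_phi=False))   # True
# An ISP that merely transmits traffic and never stores it:
print(is_business_associate(stores_phi_persistently=False, views_phi=False))  # False
```

Note how the first case captures the article’s point: “we can’t see the data” does not change the answer when the data sits on the vendor’s systems.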

RingCentral is one hosted phone vendor that now offers a HIPAA-compliant phone solution. It encrypts voice and fax files during transit and when stored, and RingCentral will sign a Business Associate Agreement.

Don’t Store PHI With Us
Telling clients not to store PHI, or stating that they are not allowed to do so in the fine print of an agreement or on a website, is just a wink-wink, nod-nod way for a cloud service or data center to deny being a Business Associate even though it knows it is maintaining PHI.

Even if they refuse to work with medical clients, so many other types of organizations are HIPAA Business Associates – malpractice defense law firms, accounting firms, billing companies, collections companies, insurance agents – that they may as well give up and just comply with HIPAA.

If they don’t, it can cost their clients in an audit or a breach investigation.

Don’t let that be you!

About Mike Semel
Mike Semel is the President of Semel Consulting, which specializes in healthcare and financial regulatory compliance, and business continuity planning.

Mike is a Certified Security Compliance Specialist, has multiple HIPAA certifications, and has authored HIPAA courseware. He has been an MSP, and the CIO for a hospital and a K-12 school district. Mike helped develop the CompTIA Security Trustmark and coaches companies preparing for the certification.

Semel Consulting conducts HIPAA workshops for MSPs and has a referrals program for partners. Visit for more info.

An Alternate Way Of Authenticating Patients

Posted on July 5, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Lately, I’ve been experimenting with a security app I downloaded to my Android phone. The app, True Key by Intel Security, allows you to log in by presenting your face for a scan or using your fingerprint. Once inside the app, you can access your preferred apps with a single click, as it stores your user names and passwords securely. I then simplified things further by installing the app on my laptop and tablet, which syncs whatever access info I enter across all devices.

From what I can see, Intel is positioning this as a direct-to-consumer play. The True Key documentation describes the app as a tool non-techies can use to access sites easily, store passwords securely and visit their favorite sites across all of their devices without re-entering authentication data. But I’m intrigued by the app’s potential for enterprise healthcare security access control.

Right now, there are serious flaws in the way application access is managed. As things stand, authentication information is usually stored in the same network infrastructure as the applications themselves, at least on a high-level basis. So the process goes like this, more or less: an untrusted device uses an untrusted app to access a secure system. The secure system requests credentials from the device user, verifies them against an ID/PW database and, if they are correct, logs the user in.
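That conventional server-side flow can be sketched as follows. This is a minimal illustration assuming a salted-hash credential store (all names here are hypothetical); notice that the ID/PW database itself is exactly the concentrated asset an attacker wants.

```python
# Minimal sketch of conventional credential verification: the server holds a
# database of (salt, password hash) pairs and checks submitted credentials
# against it. Illustrative only; parameters are assumptions, not a standard.
import hashlib
import hmac
import os

def _hash(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 with a work factor, so stolen hashes are slow to crack.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

user_db = {}  # username -> (salt, password_hash): the high-value target

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    user_db[username] = (salt, _hash(password, salt))

def login(username: str, password: str) -> bool:
    if username not in user_db:
        return False
    salt, stored = user_db[username]
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(stored, _hash(password, salt))

register("drsmith", "correct horse battery staple")
print(login("drsmith", "correct horse battery staple"))  # True
print(login("drsmith", "guess"))                         # False
```

Even done well (salted, stretched, compared in constant time), everything still hinges on keeping `user_db` out of an attacker’s hands, which is the weakness the rest of this post explores.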

Of course, there are alternatives to this approach, ranging from biometric-only access and instantly-generated, always-unique passwords, but few organizations have the resources to maintain super-advanced access protocols. So in reality, most enterprises have to firewall up their security and authentication databases and pray that those resources don’t get hacked. Theoretically, institutions might be able to create another hacking speed bump by storing authentication information in the cloud, but that obviously raises a host of additional security questions.

So here’s an idea. What if health IT organizations demanded that users install biometrically-locked apps like True Key on their devices? Then, enterprise HIT software could authenticate users at the device level – surely a possibility given that devices have unique IDs – and let users maintain password security at their end. That way, if an enterprise system was hacked, the attacker could gain access to device information, but wouldn’t have immediate access to a massive ID and PW database that gave them access to all system resources.
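One hedged sketch of such a device-centric scheme: the server keeps only a registry of device IDs and per-device keys, the biometric check merely unlocks the key locally on the device, and possession is proven by challenge-response. The names and protocol shape here are my own assumptions, not True Key’s actual design.

```python
# Hypothetical device-bound authentication sketch: the server never stores
# passwords or biometrics, only per-device keys provisioned at enrollment.
# The device's biometric lock (face/fingerprint) gates access to its key.
import hashlib
import hmac
import os
import secrets

registered_devices = {}  # device_id -> device_key (set at enrollment)

def enroll_device(device_id: str) -> bytes:
    """Provision a device: the returned key lives only in the device's
    biometric-locked keystore; the server keeps a copy tied to the ID."""
    key = os.urandom(32)
    registered_devices[device_id] = key
    return key

def authenticate(device_id: str, challenge: bytes, response: bytes) -> bool:
    """Server side: verify the device signed our challenge with its key."""
    key = registered_devices.get(device_id)
    if key is None:
        return False
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Device side: biometric unlock succeeds locally, then the key signs the
# server's one-time challenge.
device_key = enroll_device("annes-phone-01")
challenge = secrets.token_bytes(16)
response = hmac.new(device_key, challenge, hashlib.sha256).digest()
print(authenticate("annes-phone-01", challenge, response))  # True
print(authenticate("unknown-device", challenge, response))  # False
```

Under this model a breach of the server yields device IDs and keys for revocation, but no reusable ID/password pairs, which is the property the paragraph above is after.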

What I’m getting at here is that I believe healthcare organizations should maintain relationships with patients (as represented by their unique devices) rather than with their IDs and passwords. While no form of identity verification is perfect, it seems a lot more likely that it’s really me logging in if I have to use my facial features or fingerprint as an entry point. After all, virtually any ID/PW pair chosen by a user can be guessed or hacked, but if you authenticate to my face or fingerprint and a registered device, the odds are high that you’re getting me.

So now it’s your turn, readers. What flaws do you see in this approach? Have you run into other apps that might serve this purpose better than True Key? Should HIT vendors create these apps? Have at it.