
2.7 Million Reasons Cloud Vendors and Data Centers ARE HIPAA Business Associates

Posted on July 25, 2016 | Written By

The following is a guest blog post by Mike Semel, President of Semel Consulting.
Some cloud service providers and data centers have been in denial that they are HIPAA Business Associates. They refuse to sign Business Associate Agreements and comply with HIPAA.

Their excuses:

“We don’t have access to the data so we aren’t a HIPAA Business Associate.”

“The data is encrypted so we aren’t a HIPAA Business Associate.”

Cloud and hosted phone vendors claim “We are a conduit where the data just passes through us temporarily so we aren’t a HIPAA Business Associate.”

“We tell people not to store PHI in our cloud so we aren’t a HIPAA Business Associate.”

Wrong. Wrong. Wrong. And Wrong.

2.7 million reasons Wrong.
Oregon Health & Science University (OHSU) just paid $2.7 million to settle a series of HIPAA data breaches “including the storage of the electronic protected health information (ePHI) of over 3,000 individuals on a cloud-based server without a business associate agreement.”

Another recent penalty cost a medical practice $750,000 for sharing PHI with a vendor without having a Business Associate Agreement in place.

The 2013 changes to HIPAA published in the Federal Register state (with our emphasis) that:

“…we have modified the definition of “business associate” to generally provide that a business associate includes a person who “creates, receives, maintains, or transmits” protected health information on behalf of a covered entity.

…an entity that maintains protected health information on behalf of a covered entity is a business associate and not a conduit, even if the entity does not actually view the protected health information.  We recognize that in both situations, the entity providing the service to the covered entity has the opportunity to access the protected health information.  However, the difference between the two situations is the transient versus persistent nature of that opportunity.  For example, a data storage company that has access to protected health information (whether digital or hard copy) qualifies as a business associate, even if the entity does not view the information or only does so on a random or infrequent basis.” 

A cloud service doesn’t need access to PHI to be a Business Associate; it just needs to maintain or store it. Business Associates must secure PHI and sign Business Associate Agreements.

The free, consumer-grade versions of Dropbox and Google Drive are not HIPAA compliant. But fee-based cloud services that use higher levels of security, and for which the vendor will sign a Business Associate Agreement, are OK to use. Dropbox Business and Google Apps cost more but provide both security and HIPAA compliance. Make sure you select the right service for PHI.
Encryption
Encryption is a great way to protect health information, because the data is secure and the HIPAA Breach Notification Rule says that encrypted data that is lost or stolen is not a reportable breach.

However, encrypting data is not an exemption from being a Business Associate. Besides, many cloud vendors that deny they have access to encrypted data really do have it.

I know because I was the Chief Operating Officer for a cloud backup company. We told everyone that client data was encrypted and we could not access it. The problem was that when someone had trouble recovering their data, the first thing our support team asked for was the encryption key so we could help them. For medical clients, that gave us access to unencrypted PHI.

I also know of situations where data was supposed to be encrypted but, because of human error, made it to the cloud unencrypted.
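One way to avoid both failure modes is to encrypt PHI on your own systems before it is ever sent to a backup or file-sharing vendor, so the keys never leave your hands. Below is a minimal, illustrative sketch using the Python cryptography package's Fernet interface; the file names and the upload step are hypothetical, and real key storage would require a proper key-management process.

```python
# Minimal sketch: encrypt a file locally before it leaves your network, so a
# cloud or backup vendor only ever stores ciphertext and never holds the key.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# Generate once and keep this key in your own key-management system,
# never alongside the backups and never with the cloud vendor.
key = Fernet.generate_key()

def encrypt_file(path: str, key: bytes) -> str:
    """Encrypt a local file and write the ciphertext next to it."""
    cipher = Fernet(key)
    with open(path, "rb") as f:
        ciphertext = cipher.encrypt(f.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as f:
        f.write(ciphertext)
    return out_path  # upload only this encrypted copy

# Hypothetical usage:
# upload_to_cloud(encrypt_file("patient_export.csv", key))
```

Note that this only helps if the key truly stays inside your organization; the moment a vendor's support team asks for it, you are back in the situation described above.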

Simply remembering that Business Associates are covered in the HIPAA Privacy Rule while encryption is discussed in the Breach Notification Rule is an easy way to understand that encryption doesn’t cancel out a vendor’s status as a Business Associate.
Data Centers
A “business associate” also is a subcontractor that creates, receives, maintains, or transmits protected health information on behalf of another business associate.

Taken together, a cloud vendor that stores PHI, and the data centers that house servers and storage devices, are all HIPAA Business Associates. If you have your own servers containing PHI in a rack at a data center, that makes the data center a HIPAA Business Associate. If you use a cloud service for offsite backups, or file sharing, they and their data centers are Business Associates.

Most data centers offer ‘Network Operations Center (NOC) services,’ an on-site IT department that can go to a server rack to perform services so you don’t have to travel (sometimes across the country) to fix a problem.  One data center manager denied having access to the servers locked in racks and cages even as we watched his NOC services technician open a locked rack to restart a client server.

Our client, which housed servers containing thousands of patient records in that data center, used the on-site NOC services whenever its servers needed maintenance or just a manual restart.
Cloud-Based and Hosted Phone Services
In the old days, a voice message left on a phone system was not tied to computers. Faxes were paper-in and paper-out between two fax machines.

HIPAA defines a conduit as a business that simply passes PHI and ePHI through its system, like the post office, FedEx, UPS, phone companies and Internet Service Providers that simply transport data and never store it. Paper-based faxing was exempt from HIPAA.

One way the world has changed is that Voice over Internet Protocol (VoIP) systems, whether local or cloud-based, convert voice messages containing PHI into data files, which can then be stored for access through a portal, phone, or mobile device, or attached to an e-mail.

Another change is that faxing PHI now creates an image file, which is transmitted through a fax number to a computer system that stores it for access through a portal or attaches it to an e-mail.

Going back to the Federal Register statement that persistent storage is what qualifies a vendor as a Business Associate, the fact that data files containing PHI are stored by the phone service means that the vendor is a Business Associate. It doesn’t matter that the PHI started out as voice messages or faxes.

RingCentral is one hosted phone vendor that now offers a HIPAA-compliant phone solution. It encrypts voice and fax files during transit and when stored, and RingCentral will sign a Business Associate Agreement.

Don’t Store PHI With Us
Telling clients not to store PHI, or stating that they are not allowed to do so in the fine print of an agreement or on a website, is just a wink-wink-nod-nod way of a cloud service or data center denying they are a Business Associate even though they know they are maintaining PHI.

Even if they refuse to work with medical clients, there are so many other types of organizations that are HIPAA Business Associates – malpractice defense law firms, accounting firms, billing companies, collections companies, insurance agents – they may as well give it up and just comply with HIPAA.

If they don’t, it can cost their clients dearly in an audit or a breach investigation.

Don’t let that be you!

About Mike Semel
Mike Semel is the President of Semel Consulting, which specializes in healthcare and financial regulatory compliance, and business continuity planning.

Mike is a Certified Security Compliance Specialist, has multiple HIPAA certifications, and has authored HIPAA courseware. He has been an MSP, and the CIO for a hospital and a K-12 school district. Mike helped develop the CompTIA Security Trustmark and coaches companies preparing for the certification.

Semel Consulting conducts HIPAA workshops for MSPs and has a referrals program for partners. Visit www.semelconsulting.com for more info.

Attackers Try To Sell 600K Patient Records

Posted on July 22, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

New research has concluded that attackers recently infiltrated U.S. healthcare institutions and stole at least 600,000 patient records, then attempted to sell more than 3 TB of associated data. The attacks, which were discovered by security firm InfoArmor, targeted not only hospitals, but also private clinics and vendors of medical equipment and supplies such as orthopedics, eWeek reports.

According to InfoArmor, the attacker gained access to the patient data by exploiting weak user credentials, and hacked Remote Desktop Protocol connections on some servers with static external IP addresses. The data thief also used a local privilege escalation exploit to access system files for added patching and backdooring, InfoArmor chief intelligence officer Andrew Komarov told eWeek.

And sadly, some healthcare institutions made it pretty easy for intruders. In some cases, data thieves were able to exfiltrate data stored in Microsoft Access desktop databases without any special user access segregation or rights control in place, Komarov told the magazine.

Future exploits may emerge through medical device connections, as many institutions aren’t paying enough attention to device security, he warns. “[Providers] think that the medical device is just a device for their specific function and sometimes they don’t [have] knowledge of misconfigured devices in their networks,” Komarov said.

So what will become of the data?  Many things, and none of them good. Some cyber criminals will sell the Social Security numbers, and other scammers will use the data to sell fraudulent healthcare services. Cyber-grifters who steal a patient’s history of illness and biography can use them to take advantage of consumers, he pointed out. And to sharpen their con, such criminals can even buy select data focused on geographic regions, Komarov noted in a follow-up chat with me.

To address exploits engineered by remote access sessions, one consulting firm is pitching technology allowing administrators to go over remote sessions with a fine-toothed comb.

Balazs Scheidler, CTO of security vendor BalaBit, notes that while remote access to internal IT resources over protocols such as Microsoft Remote Desktop or Citrix ICA is common, IT managers don’t always have enough visibility into who is accessing systems, when they are logging in, and from where systems are being accessed. BalaBit is pitching a system which offers “CCTV-like” recording of user sessions, including screen contents, mouse movements, clicks and keystrokes.

But the truth is, regardless of what approach providers take, they simply have to step up security measures across the board. If attackers can access your data through a vulnerable Microsoft Access database, clearly something is out of order. And in fact, in many cases, it’s just that easy for attackers to get into your network.

NFL Players’ Medical Records Stolen

Posted on June 21, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’d been meaning to write about this story for a while now, but finally got around to it. In case you missed it, thousands of NFL players’ medical records were stolen. Here’s a piece of the Deadspin summary of the incident:

In late April, the NFL recently informed its players, a Skins athletic trainer’s car was broken into. The thief took a backpack, and inside that backpack was a cache of electronic and paper medical records for thousands of players, including NFL Combine attendees from the last 13 years. That would encompass the vast majority of NFL players

The Redskins later issued this statement:

The Washington Redskins can confirm that a theft occurred mid-morning on April 15 in downtown Indianapolis, where a thief broke through the window of an athletic trainer’s locked car. No social security numbers, Protected Health Information (PHI) under HIPAA, or financial information were stolen or are at risk of exposure.

The laptop was password-protected but unencrypted, but we have no reason to believe the laptop password was compromised. The NFL’s electronic medical records system was not impacted.

It’s interesting that the Redskins said the stolen records didn’t include any PHI that would be covered by HIPAA rules and regulations. I was interested in how HIPAA would apply to an NFL team, so I reached out to David Harlow for the answer. David Harlow, the HealthBlawg writer, offered these insights into whether NFL records are required to comply with HIPAA or not:

These records fall in a gray zone between employment records and health records. Clearly the NFL understands what’s at stake if, as reported, they’ve proactively reached out to the HIPAA police. At least one federal court is on record in a similar case saying, essentially, C’mon, you know you’re a covered entity; get with the program.

Michael Magrath, current Chairman of the HIMSS Identity Management Task Force and Director of Healthcare Business at VASCO Data Security, offered this insight into the breach:

This is a clear example that healthcare breaches are not isolated to healthcare organizations. They apply to employers, including the National Football League. Teams secure and protect their playbooks and need to apply that philosophy to securing their players’ medical information.

Laptop thefts are commonplace and one of the most common entries (310 incidents) on the HHS Office for Civil Rights portal listing Breaches Affecting 500 or More Individuals. Encryption is one of the basic requirements for securing a laptop, yet organizations continue to gamble without it, and innocent victims can face a lifetime of identity theft and medical identity theft.

Assuming the laptop was Windows-based, security could have been enhanced by replacing the static Windows password with two-factor authentication in the form of a one-time password. Without the authenticator that generates the one-time password, gaining entry to the laptop would be extremely difficult. By combining encryption with strong authentication for entry into the laptop, the players’ and prospects’ protected health information would not have been at risk. Instead, it was exposed because organizations and their members want to avoid a few moments of inconvenience.
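To make the one-time-password idea concrete, here is a rough sketch of the standard time-based one-time password algorithm (TOTP, RFC 6238) that most authenticator apps and hardware tokens implement. This is a generic illustration, not VASCO's product or anything the NFL actually deployed, and the example secret is made up.

```python
# Generic TOTP (RFC 6238) sketch: the code changes every 30 seconds and can
# only be produced by something holding the shared secret, so a stolen
# static Windows password alone is not enough to log in.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)  # time window index
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# A login agent compares what the user types against totp(shared_secret).
# print(totp("JBSWY3DPEHPK3PXP"))  # illustrative secret only
```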

This story brings up some important points. First, healthcare is far from the only industry that has issues with breaches and things like stolen or lost laptops. Second, healthcare isn’t the only industry that sees the importance of encrypting mobile devices; yet despite that importance, many organizations still aren’t doing so. Third, HIPAA is an interesting law since it only covers PHI and covered entities. HIPAA omnibus expanded that to business associates. However, there are still a bunch of grey areas where it’s not clear whether HIPAA applies. Plus, there are a lot of white areas where your health information is stored and HIPAA doesn’t apply at all.

Long story short, be smart and encrypt your health data no matter where it’s stored. Be careful where you share your health data. Anyone could be breached and HIPAA will only protect you so much (covered entity or not).

Can Healthcare Ransomware Be Stopped? Yes, It Can!

Posted on May 25, 2016 | Written By

The following is a guest blog post by Steven Marco, CISA, ITIL, HP SA and President of HIPAA One®.
As an Auditor at HIPAA One®, my goal is to dot every “i” and cross every “t” to ensure a comprehensive HIPAA Security Risk Analysis.  The HIPAA One® Security Risk analysis is a tool to guarantee compliance, automate risk calculations and identify high-risk technical, administrative, physical and organizational vulnerabilities.

Recently, I was on-site for a client named “Care Health” (name changed to protect their identity). Care Health had invested in the highest level of our SRA (Security Risk Analysis) to cover all aspects of security and protection from Ransomware, malware, and the proverbial “sophisticated malware.”

The HIPAA One® HIPAA Security Risk Analysis and Compliance Interview process guided Care Health through a series of HIPAA citation-based questions and required users to upload documents to demonstrate compliance.  These questions directly addressed the organization’s security controls in place to protect against ransomware and cyber-threats.  You can see a sample of the citation-driven controls HIPAA One required for malware and malicious software below:

Technical Audit Controls 164.312(b)
HIPAA One® Requirement:  Upload screenshots of the systems configuration page(s) detecting malware network communications or ePHI/PII going out/in.
Client Controls:  End-user education on malware and phishing. Cisco IPS/IPS module active to block critical threats and WebSense Filter for deep-packet web-traffic inspection.

Administrative Protection from Malicious Software 164.308(a)(5)(ii)(B)
HIPAA One® Requirement:  Provide a document showing a list of all servers, workstations and other devices with updated AV Software versions.
Client Controls: BitDefender Enterprise deployed on all workstations and laptops.

Administrative Procedures to guard against malicious software 164.308(a)(5)(ii)(B)
HIPAA One® Requirement:  Please upload a list of each server and sample of PC devices containing server name, O/S version, Service pack and the most recent security updates as available by the software vendor.  Verify critical security patches are current.
Client Controls:  Microsoft Security Operations Center combined with an exhausting change-management process to test new patches prior to release.

HIPAA Citation:  Administrative Training program for workers and managers 164.308(a)(5)(i) for the HR Director role.
HIPAA One® Requirement: Please upload a screen capture of the HIPAA training system’s grades for individual employees and detail the training/grading system in notes section.  Go through training and verify it efficiently addresses organization’s Policies and Procedures with real-world threats.
Client Controls:  Training that is due and required before bonuses, pay raises or work schedules are awarded.  Workforce and IT Helpdesk are trained to forward any calls regarding suspicious activities to the HIPAA Security Officer (HSO).

HIPAA Security Risk Analysis Tool

Back to the Ransomware attack…One day during the project, two staff members in the Billing department were going about their daily tasks, which involved working with shared files in a network-mapped drive (e.g. the N: drive).  One of them noticed new files were being spontaneously created and the file icons in the network folder were changing. Being attentive, she noticed one was named ransom.txt.

Acting quickly, she contacted the IT Helpdesk, who were trained to triage all security-related service-desk requests immediately to the HIPAA Security Officer (HSO).  The HSO logged into the N: shared drive and found Care Health files were slowly being encrypted!

How do you stop a Ransomware attack?
The Security Officer ran Bitdefender full scans on the Billing department computers and found nothing.  He then installed and ran Windows Defender, which has the most current malicious software removal utilities on Server 2012, and it found Tescrypt.  Installing Windows Defender on the two desktops not only detected the infection but also removed it.

This Ransomware variant had somehow infected the system and was encrypting these files.  The quick-acting team at Care Health recognized the attack and stopped the Tescrypt variant before patient data were compromised.  Backups were used to restore the few dozen encrypted files on the network drive. It was a close call, but Care Health was ready and the crisis was averted.
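The early warning signs here (a ransom note appearing and file icons changing en masse) can also be watched for automatically. Below is a minimal, hypothetical sketch of a share watcher that alerts the security officer when it sees ransom-note file names or a burst of suspicious extensions; the path, file names and threshold are illustrative, not Care Health's actual configuration.

```python
# Hypothetical sketch: poll a mapped share for ransom-note file names or a
# sudden burst of encrypted-looking files, then alert the HIPAA Security Officer.
import os
import time

WATCH_PATH = r"N:\shared"                       # illustrative mapped drive
RANSOM_NAMES = {"ransom.txt", "decrypt_instructions.txt", "help_decrypt.txt"}
SUSPECT_EXTS = {".encrypted", ".locked", ".crypt"}
BURST_THRESHOLD = 20                            # suspicious files per scan

def scan(path: str):
    """Return (ransom_note_found, count_of_suspect_files) for the share."""
    suspect_count = 0
    for _root, _dirs, files in os.walk(path):
        for name in files:
            lower = name.lower()
            if lower in RANSOM_NAMES:
                return True, suspect_count
            if os.path.splitext(lower)[1] in SUSPECT_EXTS:
                suspect_count += 1
    return False, suspect_count

while True:
    note_found, suspects = scan(WATCH_PATH)
    if note_found or suspects >= BURST_THRESHOLD:
        print("ALERT: possible ransomware activity on the share; notify the HSO")
        break  # in practice: page the HSO, disable the share, start the IR plan
    time.sleep(60)
```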

A configuration review of all of Care Health’s security appliances revealed that WebSense had been configured to allow “zero-reputation” websites through.  Zero-reputation websites are new sites without a known reputation and are commonly used by hackers to launch these types of attacks. At Care Health, the Ransomware apparently came from a valid website carrying an infected banner ad served from a zero-reputation source. The banner ad was configured to trigger a client-browser download before the user was allowed to see the valid web page, forcing visitors to download the executable virus from the banner ad and unknowingly install the Ransomware on their local computers.  Once installed, the Ransomware would start encrypting files in high-lettered network drives first.

Lesson Learned
Ransomware is here to stay and attacks are rising.  Healthcare organizations need to have policies and procedures in place to prevent these attacks, along with a comprehensive user training and awareness program.  The HIPAA One® software is one of the most secure ways to implement a HIPAA Security Compliance Program.  But a risk analysis is only one step… Ultimately, organizations must build top-line end-user awareness and training programs so that, as at Care Health, employees know to quickly report suspicious activities to the designated security officer to defend against Ransomware, Phishing and “sophisticated malware attacks.”

To learn more about stopping Malware and using HIPAA One® as your HIPAA Security Risk Analysis accelerator, click to learn more, or call us at 801-770-1199.

HIPAA One® is a proud sponsor of EMR and HIPAA.

OCR Cracking Down On Business Associate Security

Posted on May 13, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

For most patients, a data breach is a data breach. While it may make a big difference to a healthcare organization whether the source of a security vulnerability was outside its direct control, most consumers don’t make that distinction. Once you have to disclose to them that their data has been hacked, they aren’t likely to be more forgiving just because one of your business associates was the source of the leak.

Just as importantly, federal regulators seem to be growing increasingly frustrated that healthcare organizations aren’t doing a good job of managing business associate security. It’s little wonder, given that about 20% of the 1,542 healthcare data breaches affecting 500 or more individuals reported since 2009 involve business associates. (This is probably a conservative estimate, as reports to OCR by covered entities don’t always mention the involvement of a business associate.)

To this point, the HHS Office for Civil Rights recently issued a cyber-alert stressing the urgency of addressing these issues. The alert, which was issued by OCR earlier this month, noted that a “large percentage” of covered entities assume they will not be notified of security breaches or cyberattacks experienced by their business associates. That, folks, is pretty weak sauce.

Healthcare organizations also believe that it’s difficult to manage security incidents involving business associates, and impossible to determine whether data safeguards and security policies and procedures at the business associates are adequate. Instead, it seems, many covered entities operate on the “keeping our fingers crossed” system, providing little or no business associate security oversight.

However, that is more than unwise, given that a number of major breaches have taken place because of an oversight by a business associate. For example, in 2011 information on 4.9 million individuals was exposed when unencrypted backup computer tapes were stolen from the car of a Science Applications International Corp. employee, who was transporting the tapes on behalf of the military health program TRICARE.

The solution to this problem is straightforward, if complex to implement, the alert suggests. “Covered entities and business associates should consider how they will confront a breach at their business associates or subcontractors,” and make detailed plans as to how they’ll address and report on security incidents among these groups, OCR suggests.

Of course, in theory business associates are required to put their own policies and procedures in place to prevent, detect, contain and correct security violations under HIPAA regs. But that will be no consolation if your data is exposed because you weren’t holding their feet to the fire.

Besides, OCR isn’t just sending out vaguely threatening emails. In March, OCR began Phase 2 of its HIPAA privacy and security audits of covered entities and business associates. These audits will “review the policies and procedures adopted and employed by covered entities and their business associates to meet selected standards and implementation specifications of the Privacy, Security, and Breach Notification Rules,” OCR said at the time.

The Need for Speed (In Breach Protection)

Posted on April 26, 2016 | Written By

The following is a guest blog post by Robert Lord, Co-founder and CEO of Protenus.
The speed at which a hospital can detect a privacy breach could mean the difference between a brief, no-penalty notification and a multi-million dollar lawsuit.  This month it was reported that health information from 2,000 patients was exposed when a Texas hospital took four months to identify a data breach caused by an independent healthcare provider.  A health system in New York similarly took two months to determine that 2,500 patient records may have been exposed as a result of a phishing scam and potential breach reported two months prior.

The rise in reported breaches this year, from phishing scams to stolen patient information, only underscores the risk of lag times between breach detection and resolution. Why are lags of months and even years so common? And what can hospitals do to better prepare against threats that may reach the EHR layer?

Traditional compliance and breach detection tools are not nearly as effective as they need to be. The most widely used methods of detection involve either infrequent random audits or extensive manual searches through records following a patient complaint. For example, if a patient suspects that his medical record has been inappropriately accessed, a compliance officer must first review EMR data from the various systems involved.  Armed with a highlighter (or a large Excel spreadsheet), the officer must then analyze thousands of rows of access data and cross-reference this information with the officer’s implicit knowledge about the types of people who have permission to view that patient’s records. Finding an inconsistency – a person who accessed the records without permission – can take dozens of hours of menial work per case.  Another issue with investigating breaches based on complaints is that there is often no evidence that the breach actually occurred. Nonetheless, the hospital is legally required to investigate all claims in a timely manner, and such investigations are costly and time-consuming.
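As a toy illustration of that cross-referencing step (not any particular vendor's product), the highlighter work reduces to a few lines of code once the access log and the patient's care team exist as structured data. The CSV layouts and file names below are hypothetical.

```python
# Toy sketch: flag every access to a patient's chart made by a user who is
# not on that patient's documented care team. Column names and file names
# are illustrative only.
import csv

def load_care_team(path: str) -> dict:
    """Map patient_id -> set of user_ids authorized for that patient."""
    team = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            team.setdefault(row["patient_id"], set()).add(row["user_id"])
    return team

def flag_suspect_accesses(access_log_path: str, care_team: dict) -> list:
    """Return access-log rows whose user is not authorized for that patient."""
    suspects = []
    with open(access_log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["user_id"] not in care_team.get(row["patient_id"], set()):
                suspects.append(row)  # candidate for manual review
    return suspects

# Hypothetical usage:
# care_team = load_care_team("care_team.csv")
# for row in flag_suspect_accesses("ehr_access_log.csv", care_team):
#     print(row["timestamp"], row["user_id"], "accessed", row["patient_id"])
```

The catch, as noted below, is that naive rules like this either miss legitimate exceptions or bury the compliance officer in alerts.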

According to a study by the Ponemon Institute, it takes an average of 87 days from the time a breach occurs to the time the officer becomes aware of the problem, and, given the arduous task at hand, it then takes another 105 days for the officer to resolve the issue. In total, it takes approximately 6 months from the time a breach occurs to the time the issue is resolved. Additionally, if a data breach occurs but a patient does not notice, it could take months – or even years – for someone to discover the problem. And of course, the longer it takes the hospital to identify a problem, the higher the cost of identifying how the breach occurred and remediating the situation.

In 2013, Rouge Valley Centenary Hospital in Scarborough, Canada, revealed that the contact information of approximately 8,300 new mothers had been inappropriately accessed by two employees. Since 2009, the two employees had been selling the contact information of new mothers to a private company specializing in Registered Education Savings Plans (RESPs). Some of the patients later reported that days after coming home from the hospital with their newborn child, they started receiving calls from sales representatives at the private RESP company. The marketing representatives were extremely aggressive, and seemed to know the exact date their child had been born.

The most terrifying aspect of this story is how the hospital was able to find out about the data breach: remorse and human error! One employee voluntarily turned himself in, while the other accidentally left patient records on a printer. Had these two events not happened, the scam could have continued for much longer than the four years it did before it was finally discovered.

Rouge Valley Hospital is currently facing a $412 million lawsuit over this breach of privacy. Arguably even more damaging is that it has lost the trust of the patients who relied on the hospital for care and for the confidentiality of their medical treatment.

As exemplified by the ramifications of the Rouge Valley Hospital breach and the new breaches discovered almost weekly in hospitals around the world, the current tools used to detect privacy breaches in electronic health records are not sufficient. A system needs to have the ability to detect when employees are accessing information outside their clinical and administrative responsibilities. Had the Scarborough hospital known about the inappropriately viewed records the first time they had been accessed, they could have investigated earlier and protected the privacy of thousands of new mothers.

Every person who seeks a hospital’s care has the right to privacy and the protection of their medical information. However, due to the sheer volume of patient records accessed each day, it is impossible for compliance officers to efficiently detect breaches without new and practical tools. Current rule-based analytical systems often overburden the officers with alerts, and are only a minor improvement over manual detection methods.

We are in the midst of a paradigm shift, with hospitals taking a more proactive and layered approach to health data security. New technology that uses machine learning and big data science to review each access to medical records will replace traditional compliance technology and streamline threat detection and resolution cycles from months to a matter of minutes, making identification of a privacy breach or violation as simple and fast as the action that may have caused it in the first place.  Understanding how to select and implement these next-generation tools will be a new and important challenge for the compliance officers of the future, but one that they can no longer afford to delay.
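To give a flavor of what "machine learning to review each access" can mean in practice, here is a greatly simplified, hypothetical sketch using scikit-learn's Isolation Forest to score access events by how unusual they look. It is not Protenus's actual model; the features and numbers are invented.

```python
# Greatly simplified anomaly-scoring sketch for EHR access logs (not any
# vendor's real model). Each access is reduced to a few numeric features and
# scored; the most anomalous events are surfaced to the compliance officer.
# Requires: pip install scikit-learn numpy
import numpy as np
from sklearn.ensemble import IsolationForest

# Invented features per access event:
# [hour_of_day, is_weekend, same_department_as_patient, records_opened_last_hour]
history = np.array([
    [9, 0, 1, 4], [10, 0, 1, 6], [14, 0, 1, 3], [11, 0, 1, 5],
    [15, 0, 1, 2], [13, 0, 1, 7], [8, 0, 1, 4], [16, 0, 1, 5],
])

model = IsolationForest(contamination=0.05, random_state=0).fit(history)

new_events = np.array([
    [10, 0, 1, 5],    # looks like routine daytime clinical use
    [2, 1, 0, 180],   # 2 a.m. weekend bulk access from another department
])

scores = model.decision_function(new_events)   # lower score = more anomalous
for event, score in zip(new_events, scores):
    print(event.tolist(), "anomaly score:", round(float(score), 3))
```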

Protenus is a health data security platform that protects patient data in electronic medical records for some of the nation’s top-ranked hospitals. Using data science and machine learning, Protenus technology uniquely understands the clinical behavior and context of each user that is accessing patient data to determine the appropriateness of each action, elevating only true threats to patient privacy and health data security.

10 Health IT Security Questions Every Healthcare CIO Must Answer

Posted on April 19, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Logicalis recently sent out 10 Security Questions Every CIO Must Be Able to Answer. Here’s their list:

  1. If you knew that your company was going to be breached tomorrow, what would you do differently today?
  2. Has your company ever been breached? How do you know?
  3. What assets am I protecting, what am I protecting them from (i.e., theft, destruction, compromise), and who am I protecting them from (i.e. cybercriminals or even insiders)?
  4. What damage will we sustain if we are breached (i.e., financial loss, reputation, regulatory fines, loss of competitive advantage)?
  5. Have you moved beyond an “inside vs. outside” perimeter-based approach to information security?
  6. Does your IT security implementation match your business-centric security policies? Does it rely on written policies, technical controls or both?
  7. What is your security strategy for IoT (also known as “the Internet of threat”)?
  8. What is your security strategy for “anywhere, anytime, any device” mobility?
  9. Do you have an incident response plan in place?
  10. What is your remediation process? Can you recover lost data and prevent a similar attack from happening again?

Given the incredible rise in hospitals being breached or held for ransom, it’s no surprise that this is one of the hottest topics in healthcare. No doubt many a hospital CIO has had sleepless nights thanks to these challenges. If you’re a CIO who has been sleeping well at night, I’m afraid for your organization.

The good news is that I think most healthcare organizations are taking these threats seriously. Many would now be able to answer the questions listed above, although I imagine some of their answers need work. Maybe that’s the key lesson in all of this: there’s no silver bullet solution. Security is an ongoing process and has to be built into the culture of an organization. There are always new threats and new software being implemented that need to be protected.

With that said, health IT leaders sometimes need to shake things up in their organizations too. A culture of security is an incredible starting point. However, there’s nothing that focuses an organization more than a breach. The hyper focus that occurs is incredible to watch. If I were a health IT leader, I’d consider staging a mock breach and seeing what happens. It will likely open your eyes to some poor processes and some vulnerabilities you’d missed.

Are Ransomware Attacks A HIPAA Issue, Or Just Our Fault?

Posted on April 18, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

With ransomware attacks hitting hospitals in growing numbers, it’s growing more urgent for healthcare organizations to have a routine and effective response to such attacks. While over the short term, providers are focused mostly on survival, eventually they’ll have to consider big-picture implications — and one of the biggest is whether a ransomware intrusion can be called a “breach” under federal law.

As readers know, providers must report any sizable breach to the HHS Office for Civil Rights. So far, though, it seems that the feds haven’t issued any guidance as to how they see this issue. However, people in the know have been talking about this, and here’s what they have to say.

David Holtzman, a former OCR official who now serves as vice president of compliance strategies at security firm CynergisTek, told Health Data Management that as long as the data was never compromised, a provider may be in the clear. If an organization can show OCR proof that no data was accessed, it may be able to avoid having the incident classed as a breach.

And some legal experts agree. Attorney David Harlow, who focuses on healthcare issues, told Forbes: “We need to remember that HIPAA is narrowly drawn and data breaches defined as the unauthorized ‘access, acquisition, use or disclosure’ of PHI. [And] in many cases, ransomware “wraps” PHI rather than breaches it.”

But as I see it, ransomware attacks should give health IT security pros pause even if they don’t have to report a breach to the federal government. After all, as Holtzman notes, the HIPAA security rule requires that providers put appropriate safeguards in place to ensure the confidentiality, the integrity and availability of ePHI. And fairly or not, any form of malware intrusion that succeeds raises questions about providers’ security policies and approaches.

What’s more, ransomware attacks may point to underlying weaknesses in the organization’s overall systems architecture. “Why is the operating system allowing this application to access this data?” asked one reader in comments on a related EMR and HIPAA post. “There should be no possible way for a database that is only read/write for specified applications to be modified by a foreign encryption application,” the reader noted. “The database should refuse the instruction, the OS should deny access, and the security system should lock the encryption application out.”

To be fair, not all intrusions are someone’s “fault.” Ransomware creators are innovating rapidly, and are arguably equipped to find new vectors of infection more quickly than security experts can track them. In fact, easy-to-deploy ransomware as a service is emerging, making it comparatively simple for less-skilled criminals to use. And they have a substantial incentive to do so. According to one report, one particularly sophisticated ransomware strain has brought $325 million in profits to groups deploying it.

Besides, downloading actual data is so five years ago. If you’re attacking a provider, extorting payment through ransomware is much easier than attempting to resell stolen healthcare data. Why go to all that trouble when you can get your cash up front?

Still, the reality is that healthcare organizations must be particularly careful when it comes to protecting patient privacy, both for ethical and regulatory reasons. Perhaps ransomware will be the jolt that pushes lagging players to step up and invest in security, as it creates a unique form of havoc that could easily put patient care at risk. I certainly hope so.

You Might Have a Culture of Healthcare IT Security if…

Posted on April 6, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’ve often written that the key to really ensuring the security and privacy of data in healthcare is for healthcare organizations to build a culture of security and privacy. It’s not just going to happen with a short-term sprint.

So, I thought I’d have some fun and turn it into a list of ways for you to know whether your organization has a culture of healthcare IT security or not.

You might have a culture of healthcare IT security if…your chief security officer has power to influence change.

You might have a culture of healthcare IT security if…you’ve spent time doing risk mitigation after your HIPAA risk assessment.

You might have a culture of healthcare IT security if…you’ve found breaches in your system (Note that you found them as opposed to them finding you).

You might have a culture of healthcare IT security if…you’ve turned down a company because of their inability to show you security best practices.

You might have a culture of healthcare IT security if…you’ve spent as much time on people as technology.

You might have a culture of healthcare IT security if…someone other than your chief security officer or HIPAA committee has brought a security issue to your attention.

You might have a culture of healthcare IT security if…you’ve spent a sleepless night worrying about security at your organization.

I’m sure I’m missing some obvious things. Please add to the list in the comments.

Breach Affecting 2.2M Patients Highlights New Health Data Threats

Posted on April 4, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

A Fort Myers, FL-based cancer care organization is paying a massive price for a health data breach that exposed personal information on 2.2 million patients late last year. This incident is also shedding light on the growing vulnerability of non-hospital healthcare data, as you’ll see below.

Recently, 21st Century Oncology was forced to warn patients that an “unauthorized third party” had broken into one of its databases. Officials said that they had no evidence that medical records were accessed, but conceded that breached information may have included patient names, Social Security numbers, insurance information, and diagnosis and treatment data.

Notably, the cancer care chain, which operates 145 centers in 17 states, didn’t learn about the breach until the FBI informed the company that it had happened.

Since that time, 21st Century has faced a broad range of legal consequences. Three lawsuits related to the breach have been filed against the company, all alleging that the breach exposed plaintiffs to a great possibility of harm.  Patient indignation seems to have been stoked, in part, because patients did not learn about the breach until five months after it happened, allegedly at the request of investigating FBI officials.

“While more than 2.2 million 21st Century Oncology victims have sought out and/or pay for medical care from the company, thieves have been hard at work, stealing and using their hard-to-change Social Security numbers and highly sensitive medical information,” said plaintiff Rona Polovoy in her lawsuit.

Polovoy’s suit also contends that the company should have been better prepared for such breaches, given that it suffered a similar security lapse between October 2011 and August 2012, when an employee used patient names, Social Security numbers and dates of birth to file fraudulent tax refund claims. She claims that the current lapse demonstrates that the company did little to clean up its cybersecurity act.

Another plaintiff, John Dickman, says that the breach has filled his life with needless anxiety. In his legal filings he says that he “now must engage in stringent monitoring of, among other things, his financial accounts, tax filings, and health insurance claims.”

All of this may be grimly entertaining if you aren’t the one whose data was exposed, but there’s more to this case than meets the eye. According to a cybersecurity specialist quoted in Infosecurity Magazine, the 21st Century network intrusion highlights how exposed healthcare organizations outside the hospital world are to data breaches.

I can’t help but agree with TrapX Security executive vice president Carl Wright, who told the magazine that skilled nursing facilities, dialysis centers, imaging centers, diagnostic labs, surgical centers and cancer treatment facilities like 21st are all in network intruders’ crosshairs. Not only that, he notes that large extended healthcare networks such as accountable care organizations are vulnerable.

And that’s a really scary thought. While he doesn’t say so specifically, it’s logical to assume that the more unrelated partners you weld together across disparate networks, the more security-related points of failure you multiply. Isn’t it lovely how security threats emerge to meet every advance in healthcare?