
Does Federal Health Data Warehouse Pose Privacy Risk?

Posted on June 23, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Not too long ago, few consumers were aware of the threat data thieves posed to their privacy, and far fewer had even an inkling of how vulnerable many large commercial databases would turn out to be.

But as consumer health data has gone digital — and average people have become more aware of the extent to which data breaches can affect their lives — they’ve grown more worried, and for good reason. As a series of spectacular data breaches within health plans has illustrated, both their medical and personal data might be at risk, with potentially devastating consequences if that data gets into the wrong hands.

Considering that these concerns are not only common, but pretty valid, federal authorities who have collected information on millions of HealthCare.gov insurance customers need to be sure that they’re above reproach. Unfortunately, this doesn’t seem to be the case.

According to an Associated Press story, the administration is storing all of the HealthCare.gov data in a perpetual central repository known as MIDAS. MIDAS data includes a lot of sensitive information, including Social Security numbers, birth dates, addresses and financial accounts. If stolen, this data could provide a springboard for countless cases of identity theft or even medical identity theft, both of which have emerged as perhaps the iconic crimes of 21st century life.

Both the immensity of the database and a failure to plan for destruction of old records are raising the hackles of privacy advocates. They definitely aren’t comfortable with the ten-year storage period recommended by the National Archives.

An Obama Administration rep told the AP that MIDAS meets or exceeds federal security and privacy standards, by which I assume he largely meant HIPAA regs. But it’s reasonable to wonder how long the federal government can protect its massive data store, particularly if commercial entities like Anthem — who arguably have more to lose — can’t protect their beneficiaries’ data from break-ins. True, MIDAS is also operated by a private concern, government technology contractor CACI, but the workflow has to be impacted by the fact that CMS owns the data.

Meanwhile, growing privacy breach questions are driven by reasonable concerns, especially those outlined by the GAO, which noted last year that MIDAS went live without an in-depth assessment of privacy risks posed by the system.

Another key point made by the AP report (which did a very good job on this topic, by the way, somewhat to my surprise) is that MIDAS’ mission has evolved from a facility for running analytics on the data to a central clearinghouse for data sharing between CMS and health insurance companies and state Medicaid organizations. And we all know that with mission creep can come feature creep; with feature creep comes greater and greater potential for security holes that are passed over and left to be found by intruders.

Now, private healthcare organizations will still be managing the bulk of consumer medical data for the near future. And they have many vulnerabilities that are left unpatched, as recent events have emphasized. But in the near term, it seems like a good idea to hold the federal government’s feet to the fire. The last thing we need is a giant loss of consumer confidence generated by a giant government data exposure.

Patients Demand the Best Care … for Their Data

Posted on June 22, 2015 | Written By

The following is a guest blog post by Art Gross, Founder of HIPAA Secure Now!.
Whether it’s a senior in for a first hearing aid fitting or a baby boomer in for a collagen injection, patients are closely scrutinizing the new patient forms handed to them by the office clerk.  With 100 million medical records breached and stolen to date, patients have every reason to be reluctant when they’re asked to fill out forms that require their Social Security number, driver’s license, insurance card and date of birth — all the ingredients for identity fraud.  Patients are so squeamish about disclosing their personal information that even Medicare plans to remove Social Security numbers from patients’ benefits cards.

Now patients have as much concern about protecting their medical records as they do about receiving quality care, and they’re getting savvy about data protection.  They have every right to be assured by their physician that his practice is as concerned about their privacy as he is about their health.

But despite ongoing reports of HIPAA violations and continuous breaking news about the latest widespread patient data breach, medical practices continue to treat ePHI security as a lesser priority.  And they neglect to train front office staff, so the patient who asks a receptionist where the practice stores her records either gets a quizzical look, is told the records are protected in an EHR with no explanation of how, or is told they’re filed in a bank box in “the back room” with no explanation of why.

In some cases, the practice may hide the fact that office staff is throwing old paper records in a dumpster.  Surprisingly this happens over and over.  Or, on the dark side, the receptionist accesses the EHR, steals patients’ social security numbers and other personal information and texts them to her criminal boyfriend for medical identity theft.

Another cybercrime threatening medical practices comes from hackers who attack a server through malware and encrypt all the medical files.  They hold the records hostage and ask for ransoms.  Medical records can vanish and the inability to access critical information about a patient’s medical condition could end up being life threatening.

Physicians should not only encrypt all mobile devices, servers and desktops, regularly review system activity, back up their servers and have a disaster recovery plan in place; they should also share their security practices and policies with any patient who asks how the office is protecting her records.
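Reviewing system activity doesn’t have to be elaborate. Here’s a minimal sketch of the idea in Python, assuming a hypothetical CSV access log with timestamp, user, patient and action columns; the file name and schema are invented for illustration, not taken from any particular EHR:

```python
import csv
from datetime import datetime

# Hypothetical access log exported from an EHR, with columns:
# timestamp, user, patient_id, action. The schema is an assumption.
BUSINESS_HOURS = range(7, 19)  # 7:00-18:59, an assumed policy window

def flag_after_hours_access(log_path):
    """Return log entries recorded outside normal business hours."""
    flagged = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if ts.hour not in BUSINESS_HOURS:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for entry in flag_after_hours_access("ehr_access_log.csv"):
        print(f"After-hours access: {entry['user']} -> patient "
              f"{entry['patient_id']} at {entry['timestamp']}")
```

Even a simple report like this, run weekly, gives a practice something concrete to show the patient who asks how her records are watched.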

Otherwise, the disgruntled patient whose question about security is dismissed won’t just complain to her friends over coffee, she’ll spread the word on Facebook.  The next time a friend on Facebook asks for a referral, the patient tells her not to go to her doctor — not because he’s an incompetent surgeon but because he doesn’t know the answer when she asks specifically whether the receptionist has unlimited access to her records.

And word gets out through social media that the practice is ‘behind the times.’  The doctor earns a reputation for not taking the patient’s question seriously, and for not putting the proper measures in place to secure the patient’s data.  This is the cockroach running through the restaurant that ends up on Yelp.

It’s time to pull back the curtain and tell patients how you’re protecting their valuable data.  Hand them a HIPAA security fact sheet with key measures you’ve put in place to gain their confidence.  For example, our practice:

  • Performs annual risk assessments, with additional security implemented, including encryption and physical security of systems that contain patient information
  • Shows patients that the organization has policies and procedures in place
  • Trains employees to watch for breach risks
  • Gives employees limited access to medical records
  • Backs up systems daily
  • Reviews system activity regularly

Practices that communicate to patients how they are protecting their information, whether it’s explained by the front office staff, stated in a fact sheet or displayed on their websites, not only instill confidence and maintain their reputations, they actually differentiate themselves in the marketplace and attract new patients away from competitors.

About Art Gross
Art Gross co-founded Entegration, Inc. in 2000 and serves as President and CEO. As Entegration’s medical clients adopted EHR technology Gross recognized the need to help them protect patient data and comply with complex HIPAA security regulations. Leveraging his experience supporting medical practices, in-depth knowledge of HIPAA compliance and security, and IT technology, Gross started HIPAA Secure Now! to focus on the unique IT requirements of medical practices. Email Art at artg@hippasecurenow.com.

Full Disclosure: HIPAA Secure Now! is an advertiser on EMR and HIPAA.

Windows Server 2003 Support Ends July 14, 2015 – No Longer HIPAA Compliant

Posted on June 16, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

If this post feels like Groundhog Day, then you are probably remembering our previous post about Windows XP being retired and therefore no longer HIPAA compliant, and our follow-up article about a case where “unpatched and unsupported software” was penalized by OCR as a HIPAA violation.

With those posts as background, the same thing applies to Microsoft ending support for Windows Server 2003 on July 14, 2015. Many of you are probably wondering why I’m talking about software from 2003 that’s being sunset. Could people really still be using this software in healthcare? The simple answer is yes, they are still using Windows Server 2003.

Mike Semel has a really great post about how to deal with the change to ensure you avoid any breaches or HIPAA penalties. In his post he highlights how replacing Windows Server 2003 is a much larger change than it was to replace Windows XP.

In the latter case, you were disrupting one user. In the former case, you’re likely disrupting a whole group of users. Plus, the process of moving a server to a new server and operating system is much harder than moving a desktop user to a new desktop. In fact, in most cases the only reason organizations hadn’t moved off Windows XP was budget. My guess is that many organizations still on Windows Server 2003 are on it because the migration path to a newer server is hard or even impossible. This is why you’d better start planning now to move off Windows Server 2003.
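Planning starts with knowing where Windows Server 2003 is still running. Here’s a minimal sketch of that first step, assuming you can export an asset inventory to CSV; the file name and column names are assumptions, not a reference to any particular tool:

```python
import csv

# Hypothetical asset inventory exported from whatever tracking system a
# practice already uses; the column names here are assumptions.
EOL_MARKERS = ("windows server 2003", "windows xp")

def find_unsupported_hosts(inventory_csv):
    """Flag hosts whose recorded OS is past end of support."""
    unsupported = []
    with open(inventory_csv, newline="") as f:
        for row in csv.DictReader(f):
            os_name = row.get("operating_system", "").lower()
            if any(marker in os_name for marker in EOL_MARKERS):
                unsupported.append(row)
    return unsupported

if __name__ == "__main__":
    hosts = find_unsupported_hosts("server_inventory.csv")
    print(f"{len(hosts)} host(s) need a migration plan:")
    for host in hosts:
        print(f"  {host['hostname']}: {host['operating_system']}")
```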

I also love this section of Mike Semel’s post linked above which talks about the costs of a breach (which is likely to happen if you continue using unsupported and unpatched software):

The 2015 IBM Cost of a Data Breach Report was just released and the Ponemon Institute determined that a data breach of healthcare records averages $398 per record. You are thinking that it would never cost that much to notify patients, hire attorneys, and plug the holes in your network. You’re right. The report goes on to say that almost ¾ of the cost of a breach is in loss of business and other consequences of the breach. If you are a non-profit that means fewer donations. If you are a doctor or a hospital it could mean your patients lose trust and go somewhere else.
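To make those numbers concrete, here’s a back-of-the-envelope calculation using the figures quoted above; the breach size is a made-up example:

```python
# Rough arithmetic using the quoted figures: ~$398 per breached record,
# with roughly three quarters of the cost attributable to lost business
# rather than direct response costs. The record count is hypothetical.
COST_PER_RECORD = 398
LOST_BUSINESS_SHARE = 0.75

records_breached = 5_000  # hypothetical small-practice breach
total_cost = records_breached * COST_PER_RECORD
lost_business = total_cost * LOST_BUSINESS_SHARE

print(f"Estimated total cost:      ${total_cost:,.0f}")     # $1,990,000
print(f"  of which lost business:  ${lost_business:,.0f}")  # $1,492,500
```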

I’m sure that some will come on here, like they did on the Windows XP post, and suggest that you can keep using Windows Server 2003 in a HIPAA compliant manner. This penalty tells me otherwise. I believe it’s a very risky proposition to continue using unsupported and unpatched software. Might there be some edge case where a specific piece of software requires you to use Windows Server 2003, and you could set up some mix of private network, firewalls, access lists and other security to mitigate the risk of a breach of the unsupported software? In theory, that’s possible, but it’s unlikely most of you reading this are in that position. So, you better get to work updating from Windows Server 2003.

Breaking Bad And HIT: Some Thoughts for Healthcare

Posted on June 2, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Recently, I’ve been re-watching the blockbuster TV series hit “Breaking Bad” courtesy of Netflix. For those who haven’t seen it, the show traces the descent of a seemingly honest plain-Joe suburbanite from high school chemistry teacher to murderous king of a multi-state crystal meth business, all kicked off by his diagnosis of terminal lung cancer.

As the show clearly intends, it has me musing once again on how an educated guy with a family and a previously crime-free life can compromise everything that once mattered to him and ultimately, destroy nearly everything he loves.

And that, given that I write for this audience, had me thinking just as deeply about what turns ordinary healthcare workers into cybercriminals who ruthlessly exploit people’s privacy and put their financial survival at risk by selling the data under their control.

Sure, some of the data stealing is done by black-hat hackers who crack healthcare networks and mine them for data at the behest of organized crime groups. But then there are the surprises. Like the show’s central character, Walter White, some healthcare cybercriminals seem to come out of the blue, relative “nobodies” with no history as gangsters or thieves who suddenly find a way to rationalize stealing data.

I’d bet that if you dug into the histories of those healthcare employees who “break bad” you’d find that they have a few of the following characteristics in common:

*  Feeling underappreciated:  Like Walter White, whose lowly chemistry-teacher job was far below his abilities, data-stealing employees may feel that their talents aren’t appreciated and that they’ll never “make it” via a legitimate path.

* Having a palatable excuse:  Breaking Bad’s dying anti-hero was able to rationalize his behavior by telling himself that he was doing what he did to protect his family’s future well-being. Rogue employees who sell data to the highest bidder may believe that they’re committing a victimless crime, or that they deserve the extra income to make up for a below-market salary.

* Willful ignorance:  Not once, during the entire run of BB, does White stop and wonder (out loud at least) what harm his flood of crystal meth is doing to its users. While it doesn’t take much imagination to figure out how people could be harmed by having their medical privacy violated — or especially, having their financial data abused — some healthcare workers will just choose not to think about it.

* Greed:  No need to explain this one — though people may restrain naturally greedy impulses if the other factors listed above aren’t present. You can’t really screen for it, sadly, despite the damage it can do.

So do you have employees in your facilities on the verge of breaking bad and betraying the trust their stewardship of healthcare data conveys? Taking a look around for bitter, dissatisfied types might be worth a try.

Healthcare Providers and Patients Deserve Better Security

Posted on June 1, 2015 | Written By

The following is a guest blog post by Anna Drachenberg, Founder and CEO of HIPAA Risk Management.

Our firm has been helping dentists and other healthcare providers with their HIPAA security compliance for several years. Based on our customers’ experience, many dentists lack healthcare IT partners who are committed to data security and HIPAA compliance.  Unfortunately, this lack of commitment appears to be an epidemic across healthcare IT, and healthcare providers and patients need to demand a change.

In our recent alert, Dentrix Vulnerabilities and Mitigation for HIPAA Compliance, we described two major vulnerabilities we’ve had to assist our clients in mitigating in order to protect their patients’ data and comply with our clients’ HIPAA security policies. Our regulatory and data security experts were concerned, on behalf of our clients, with the way Henry Schein handled these two issues. More concerning, this seems to be a trend with many healthcare IT companies.

From the article, “In October 2012, it was reported to the Community Emergency Response Team (CERT) that all Dentrix G5 software was installed with hard-coded credentials to access the back-end database.” Pretty serious, right? The National Vulnerability Database gave this a severity score of 5.0 and an exploitability score of 10.0.  In the CERT notification you can see that the vulnerability was credited to Justin Shafer, not the vendor, Henry Schein, and several months passed between the time the exploit was reported (11/22/2012) and the time Henry Schein released a fix (2/13/2013). Read the linked article for more details on the fix Henry Schein provided.
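For readers who don’t live in code, here is a generic sketch of why hard-coded database credentials are so dangerous and what the alternative looks like. The names and values below are invented placeholders, not anything from the actual Dentrix product:

```python
import os

# Anti-pattern: the same credentials are baked into every installed copy of
# a product. Anyone who extracts them from one installation can read the
# back-end database behind any other installation.
DB_USER = "admin"            # invented placeholder
DB_PASSWORD = "s3cret-2012"  # invented placeholder

# Safer pattern: each installation supplies its own credentials outside the
# code, for example via environment variables or a protected config store,
# so no single leaked value unlocks every customer's data.
def get_db_credentials():
    return os.environ["PRACTICE_DB_USER"], os.environ["PRACTICE_DB_PASSWORD"]
```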

In a time when most industries are embracing security and offering “bug bounties,” many in the healthcare IT industry are trying to ignore the problem and hope that their customers are ignoring it, too. Take the recent panic over hackers controlling airplanes. What did United Airlines do? Offer a bug bounty that pays out in airline miles that can be redeemed for free tickets. Most software and IT companies offer similar bug bounty programs and actively cooperate with independent security professionals. These companies know that every bug found before it is exploited can save millions of dollars and improve their product.

I’d like to challenge all of the blog readers today to find a healthcare IT vendor who has the same approach to security. For that matter, do a search of the CERT vulnerability database or the National Vulnerability Database for any healthcare software or product you know, or for general terms like medical, hospital or healthcare. Surprised at the lack of issues reported and fixed? Are we really supposed to believe that healthcare IT developers are superior to developers in other industries?

Note: The only results of a search I did on 5/30/2015 of the National Vulnerability Database for “Epic” were vulnerabilities in the Epic Games Unreal Tournament Engine. It is good to know that my video game company cares about my data security.

Everyone who purchases, administers, and uses healthcare IT systems and software deserves vendors who are committed to security. Consider for a moment: the customers of these products are the parties responsible for ensuring the security of the data they put into these systems. Although the change to business associates under the HIPAA Omnibus Rule puts more liability on some of these vendors, the covered entity is still ultimately responsible and takes the hit to its reputation. Patients, the ones who experience harm when these systems are breached, have to rely on their doctors and other healthcare providers to ensure that the healthcare IT software and products are secure.  I don’t know about you, but I really hope that my physician spent more time in medical school learning about medicine than he did about encryption.

It’s time for all of us in the healthcare industry to demand that our vendors have the same level of commitment to security as the healthcare providers who are their customers. It’s time for all of us as patients to demand that these vendors improve the security of the products used by our healthcare providers.

One last note. In our alert, we link to Dentrix’s notice on the type of “encryption” they offer on one of their products. From Dentrix’s article:

“Henry Schein introduced cryptographic technology in Dentrix version G5 to supplement a practice’s employee policies, physical safeguards and data security. Available only in Dentrix G5, we previously referred to this feature as encryption. Based on further review, we believe that referring to it as a data masking technique using cryptographic technology would be more appropriate. Regardless of what you call it…”

To your clients, it matters what the federal government “calls” it, and they don’t call it encryption.

About Anna Drachenberg
Anna Drachenberg has more than 20 years in the software development and healthcare regulatory fields, having held management positions at Pacificare Secure Horizons, Apex Learning and the Food and Drug Administration. Anna co-founded HRM Services, Inc., (hipaarisk.com) a data security and compliance company for healthcare. HRM offers online risk management software for HIPAA compliance and provides consulting services for covered entities and business associates. HRM has clients nationwide and also partners with IT providers, medical associations and insurance companies.

Emerging Health Apps Pose Major Security Risk

Posted on May 18, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

As new technologies like fitness bands, telemedicine and smartphone apps have become more important to healthcare, the issue of how to protect the privacy of the data they generate has become more important, too.

After all, all of these devices use the public Internet to transmit data, at least at some point in the transmission. Typically, telemedicine involves a direct connection to a remote server over an unsecured Internet connection (although some sort of encryption is often applied to the data being sent over that unsecured connection).  If they’re being used clinically, monitoring technologies such as fitness bands hop from the band across wireless spectrum to a smartphone, which in turn uses the public Internet to communicate data to clinicians. And the public Internet is just one pathway among the myriad ways that hackers could get access to this health data.
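When that data does have to cross an untrusted network, encrypting the payload itself, on top of whatever transport security the connection offers, is the basic defense. Here’s a minimal sketch using the widely used Python cryptography package; the key handling and the wearable reading shown are illustrative assumptions, not any vendor’s actual protocol:

```python
from cryptography.fernet import Fernet  # third-party "cryptography" package

# For illustration only: a real deployment would provision the key securely
# and keep it in the app's keystore, never generate it inline like this.
key = Fernet.generate_key()
cipher = Fernet(key)

# A made-up wearable reading; real devices and platforms define their own formats.
reading = b'{"steps": 8412, "resting_hr": 61, "device_id": "band-001"}'

# Encrypt before the reading ever touches the public Internet ...
token = cipher.encrypt(reading)

# ... and decrypt only on the clinician-side service that holds the key.
assert cipher.decrypt(token) == reading
```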

My hunch is that this exposure of data to potential thieves hasn’t generated a lot of discussion because the technology isn’t mature. And what’s more, few doctors actually work with wearables data or offer telemedicine services as a routine part of their practice.

But it won’t be long before these emerging channels for tracking and caring for patients become a standard part of medical practice.  For example, the use of wearable fitness bands is exploding, and middleware like Apple’s HealthKit is increasingly making it possible to collect and mine the data that they produce. (And the fact that Apple is working with Epic on HealthKit has lured a hefty percentage of the nation’s leading hospitals to give it a try.)

Telemedicine is growing at a monster pace as well.  One study from last year by Deloitte concluded that the market for virtual consults in 2014 would hit 70 million, and that the market for overall telemedical visits could climb to 300 million over time.

Given that the data generated by these technologies is medical, private and presumably protected by HIPAA, where’s the hue and cry over protecting this form of patient data?

After all, though a patient’s HIV or mental health status won’t be revealed by a health band’s activity data, telemedicine consults certainly can betray those concerns. And while a telemedicine consult won’t provide data on a patient’s current cardiovascular health, wearables can, and that data might be of interest to payers or even life insurers.

I admit that when the data being broadcast isn’t a clear-text summary of a patient’s condition, possibly accompanied by their personal identity, credit card and health plan information, it doesn’t seem as likely that patients’ well-being can be compromised by medical data theft.

But all you have to do is look at human nature to see the flaw in this logic. I’d argue that if medical information can be intercepted and stolen, someone can find a way to make money at it. It’d be a good idea to prepare for this eventuality before a patient’s privacy is betrayed.

An Important Look at HIPAA Policies For BYOD

Posted on May 11, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Today I stumbled across an article which I thought readers of this blog would find noteworthy. In the article, Art Gross, president and CEO at HIPAA Secure Now!, made an important point about BYOD policies. He notes that while much of today’s corporate computing is done on mobile devices such as smartphones, laptops and tablets — most of which access their enterprise’s e-mail, network and data — HIPAA offers no advice as to how to bring those devices into compliance.

Given that most of the spectacular HIPAA breaches in recent years have arisen from the theft of laptops, and that thefts of tablet and smartphone data are likely to follow, it seems strange that HHS has done nothing to update the rule, drafted in 2003, to address the increasing use of mobile devices.  As Gross rightly asks, “If the HIPAA Security Rule doesn’t mention mobile devices, laptops, smartphones, email or texting how do organizations know what is required to protect these devices?”

Well, Gross’ peers have given the issue some thought, and here are some suggestions from law firm DLA Piper on how to dissect the issues involved. BYOD challenges under HIPAA, notes author Peter McLaughlin, include:

*  Control:  To maintain protection of PHI, providers need to control many layers of computing technology, including network configuration, operating systems, device security and transmissions outside the firewall. McLaughlin notes that Android OS-based devices pose a particular challenge, as the system is often modified to meet hardware needs. And in both iOS and Android environments, IT administrators must also manage users’ tendency to connect to their preferred cloud services and download their own apps. Otherwise, a large volume of protected health data can end up outside the firewall.

*  Compliance:  Healthcare organizations and their business associates must take care to meet HIPAA mandates regardless of the technology they use.  But securing even basic information, much less regulated data, is far more difficult on employee-owned devices than it is when the company sets restrictive rules for its own devices.

*  Privacy:  When enterprises let employees use their own devices to do company business, it’s highly likely that employees will feel entitled to use those devices as they see fit. However, in reality, McLaughlin suggests, employees don’t really have full, private control of their devices, in part because company policy usually requires a remote wipe of all data when the device gets lost. Also, employees might find that their device’s data becomes discoverable if it is relevant to litigation. (A rough sketch of how these conditions might be checked in code follows this list.)
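To make the Control and Privacy points above a bit more concrete, here is a minimal, hypothetical sketch of the kind of compliance gate an organization might put in front of PHI access on a personal device. The attributes and conditions are assumptions for illustration, not a description of any real MDM product:

```python
from dataclasses import dataclass

@dataclass
class Device:
    owner: str
    os_version: str
    encrypted: bool
    remote_wipe_enrolled: bool
    unapproved_cloud_sync: bool

def may_access_phi(device: Device) -> bool:
    """Gate PHI access on a few of the BYOD concerns described above."""
    return (
        device.encrypted
        and device.remote_wipe_enrolled
        and not device.unapproved_cloud_sync
    )

byod_phone = Device("j.smith", "Android 5.1", encrypted=True,
                    remote_wipe_enrolled=True, unapproved_cloud_sync=True)
print(may_access_phi(byod_phone))  # False: personal cloud sync is enabled
```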

So, readers, tell us how you’re walking the tightrope between giving employees who BYOD some autonomy, and protecting private, HIPAA-protected information.  Are you comfortable with the policies you have in place?

Full Disclosure: HIPAA Secure Now! is an advertiser on this website.

Human Error Healthcare Data Breach Infographic

Posted on March 26, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

You all know I’m a sucker for an infographic and this one illustrates a topic we’ve known for a long time: humans are one of the biggest breach challenges. All the encryption and firewalls in the world can’t solve for a human who already has access. This infographic really illustrates that point well.

Human Error and Healthcare Data Breaches
Infographic based on ICO FOI request data by Egress Software Technologies, providers of email security as well as large file transfer and encryption software.

There’s More to HIPAA Compliance Than Encryption

Posted on March 24, 2015 | Written By

The following is a guest blog post by Asaf Cidon, CEO and Co-Founder of Sookasa.
The news that home care provider Amedisys had a HIPAA breach involving more than 100 lost laptops—even though they contained encrypted PHI—might have served as a wake-up call to many healthcare providers.  Most know by now that they need to encrypt their files to comply with HIPAA and prevent a breach. While it’s heartening to see increased focus on encryption, it’s not enough to simply encrypt data. To ensure compliance and real security, it’s critical to also manage and monitor access to protected health information.

Here’s what you should look for from any cloud-based solution to help you remain compliant.

  1. Centralized, administrative dashboard: The underlying goal of HIPAA compliance is to ensure that organizations have meaningful control over their sensitive information. In that sense, a centralized dashboard is essential to provide a way for the practice to get a lens into the activities of the entire organization. HIPAA also stipulates that providers be able to get emergency access to necessary electronic protected health information in urgent situations, and a centralized, administrative dashboard that’s available on the web can provide just that.
  2. Audit trails: A healthcare organization should be able to track every encrypted file across the entire organization. That means logging every modification, copy, access, or share operation made to encrypted files—and associating each with a particular user. (A minimal sketch of such a log appears after this list.)
  3. Integrity control: HIPAA rules mandate that providers be able to ensure that ePHI security hasn’t been compromised. Often, that’s an element of the audit trails. But it also means that providers should be able to preserve a complete history of confidential files to help track and recover any changes made to those files over time. This is where encryption can play a helpful role too: encryption can render it impossible to modify files without access to the private encryption keys.
  4. Device loss / theft protection: The Amedisys situation illustrates the real risk posed by lost and stolen devices. Amedisys took the important first step of encrypting sensitive files. But it isn’t the only one to take. When a device is lost or stolen, it might seem like there’s little to be done. But steps can and should be taken to decrease the impact of a breach in progress. Certain cloud security solutions provide a device block feature, which administrators can use to remotely wipe the keys associated with certain devices and users so that the sensitive information can no longer be accessed. Automatic logoff also helps, because terminating a session after a period of inactivity can help prevent unauthorized access.
  5. Employee termination help: Procedures should be implemented to prevent terminated employees from accessing ePHI. But the ability to physically block a user from accessing information takes it a step further. Technical tools such as a button that revokes or changes access permission in real-time can make a big impact.
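Here is a minimal sketch of what an audit-trail entry combining points 2 and 3 might look like: who touched which encrypted file, when, plus a hash that lets you detect later tampering. The log format and file names are assumptions for illustration, not how any particular product (Sookasa included) actually implements it:

```python
import csv
import hashlib
from datetime import datetime, timezone

AUDIT_LOG = "phi_audit_log.csv"  # hypothetical log location

def file_fingerprint(path):
    """SHA-256 of the (already encrypted) file, used to detect later tampering."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def record_event(user, action, path):
    """Append one audit-trail entry: who did what to which file, and when."""
    with open(AUDIT_LOG, "a", newline="") as log:
        csv.writer(log).writerow([
            datetime.now(timezone.utc).isoformat(),
            user,
            action,  # e.g. "open", "modify", "share", "copy"
            path,
            file_fingerprint(path),
        ])

# Example usage: record_event("dr.lee", "open", "charts/MRN-1001.enc")
```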

Of course encryption is still fundamental to HIPAA compliance. In fact, it should be at the center of any sound security policy—but it’s not the only step to be taken. The right solution for your practice will integrate each of these security measures to help ensure HIPAA compliance—and overall cyber security.

About Asaf Cidon
Asaf Cidon is CEO and co-founder of cloud security company Sookasa, which encrypts, audits and controls access to files on Dropbox and connected devices, and complies with HIPAA and other regulations. Cidon holds a Ph.D. from Stanford University, where he specialized in mobile and cloud computing.

The Future Of…Healthcare Security

Posted on March 13, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This post is part of the #HIMSS15 Blog Carnival which explores “The Future of…” across 5 different healthcare IT topics.

Security is top of mind for most healthcare boards. I think the instruction from these boards to CIOs is simple: Keep Us Out of the News!

That’s an order that’s much easier said than done. If Google and Anthem can’t stay out of the news because of a breach, then a hospital or doctor’s office is fighting an uphill battle. Still don’t believe me? Check out this visualization of internet attacks. It’s pretty scary stuff.

The reality is that you don’t really win a security battle. You can just defend against attacks as well as possible with the limited resources you have available. What is clear is that, while resources are still limited, healthcare organizations will be investing more in security and privacy than they ever have before.

The future of effective security in healthcare belongs to organizations that bake security into everything they do. Instead of hiring a chief security officer who worries about and advocates for security, we need a culture of security in healthcare organizations. This starts at the top, where the leader is always asking how we’re addressing security. That leadership will then trickle down into the culture of a company.

Let’s also be clear that security doesn’t have to be at odds with innovation and technology. In fact, technology can take our approach to security and privacy to the next level. Tell me, how did you know who read the chart in a paper chart world? Oh yes, that sign-out sheet that people always forgot to sign. Oh wait, the fingerprints on the chart were checked. It’s almost ludicrous to think about. Let’s be real: in the paper chart world we put in processes to try to keep the wrong people from getting their hands on the chart, but we really had no idea who saw it. The opposite is true in an EHR world. We know exactly who saw what, who changed what, and when and where (Note: some EHRs are better than others at this, but a few lawsuits will get them all up to par).

The reality is that technology can take security and privacy to a level we could never have dreamed of. We can implement granular access controls that are hard and fast, monitored and audited. That’s a powerful part of the future of security and privacy in healthcare. Remember that many healthcare breaches come from people who have a username and password, not from some outside hacker.
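Here’s a toy sketch of what “granular, monitored and audited” access control can look like in code. The roles, permissions and log format are invented for illustration; real EHRs implement far richer models:

```python
from datetime import datetime, timezone

# Toy role model; real EHRs tie roles to far more granular privileges.
ROLE_PERMISSIONS = {
    "physician":  {"view_chart", "edit_chart", "view_labs"},
    "nurse":      {"view_chart", "view_labs"},
    "front_desk": {"view_demographics"},
}

audit_log = []  # in practice this would be durable, append-only storage

def access_record(user, role, action, patient_id):
    """Allow or deny an action, and log the attempt either way."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "role": role,
        "action": action,
        "patient": patient_id,
        "allowed": allowed,
    })
    return allowed

access_record("r.jones", "front_desk", "view_chart", "MRN-1001")  # denied, and logged
```

The point isn’t the specific code; it’s that every decision, allowed or denied, leaves a record someone can review, which is exactly what a paper chart could never give you.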

A culture of security and privacy embraces the ability to track when and what happens to every piece of PHI in their organization. Plus, this culture has to be built into the procurement process, the implementation process, the training process, etc. Gone are the days of the chief security officer scapegoat. Technology is going to show very clearly who is responsible.

While I’ve described a rosy future built around a culture of privacy and security, I’m not naive. The future of healthcare security also includes a large number of organizations that continue to live a security life of “ignorance is bliss.” These people will pay lip service to privacy and security, but won’t actually make the culture change that’s needed to address them. They’ll continue the “Just Enough Culture of HIPAA Compliance.”

In the future we’ll have to be careful to not include one organization’s ignorance in a broad description of healthcare in general. A great example of this can be learned from the Sutter Health breach. In this incident, Sutter Health CPMC found the breach during a proactive audit of their EHR. Here’s the lesson learned from that breach:

The other lesson we need to take from this HIPAA breach notification is that we shouldn’t be so quick to judge an organization that proactively discovers a breach. If we’re too punitive with healthcare organizations that find and effectively address a breach like this, then organizations will stop finding and reporting these issues. We should want healthcare organizations that have a culture of privacy and security. Part of that culture is that they’re going to sometimes catch bad actors, which they need to correct.

Healthcare IT software like EHRs have a great ability to track everything that’s done and they’re only going to get better at doing it. That’s a good thing and healthcare information security and privacy will benefit from it. We should encourage rather than ridicule organizations like CPMC for their proactive efforts to take care of the privacy of their patients’ information. I hope we see more organizations like Sutter Health who take a proactive approach to the security and privacy of healthcare information.

In fact the title of the blog post linked above is a warning for the future of healthcare IT: “Will Hospitals Be At Risk for HIPAA Audits If They Don’t Have HIPAA Violations?”

Security and privacy will be part of the fabric of everything we do in healthcare IT. We can’t ignore them. In order for patients to trust these healthcare apps, security will have to be a feature. Those in healthcare IT that don’t include security as a feature will be on shaky ground.