
Ashley Madison Data Breach – A Lesson for Health IT

Posted on July 28, 2015 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat one of the most popular and active healthcare social media communities on Twitter. Colin is a true believer in #HealthIT, social media and empowered patients. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He currently leads the marketing efforts for @PatientPrompt, a Stericycle product. Colin’s Twitter handle is: @Colin_Hung

The recent hack of the Ashley Madison, Cougar Life and Established Men infidelity/hookup websites has been front-page news. Overnight, the personal data of 50 million site members (pun intended) was potentially stolen by a hacker group calling itself “The Impact Team”. The Washington Post and CNBC have great articles on the details of the hack.

As the story unfolded I became more and more fascinated, not because of the scandalous nature of the data, but because I believe this hack is a lesson for all of us that work in #HealthIT.

The value of the data that is held in EHRs and other health apps is somewhat debatable. There have been claims that a single health record is worth 10-200 times more than credit card data on the black market. The higher value is due to the potential access to prescription medications and/or the potential to use health data to commit Medicare fraud. A recent NPR post indicates that the value of a single patient’s record is approximately $470 but there is not a lot of strong evidence to support this valuation (see John Lynn’s post on this topic here).

While $470 may seem like a lot, I believe that for many patients, the reputational value of their health data is far higher. Suppose, for example, you were a patient at a behavioral health clinic. You have kept your treatment secret; no one in your family knows about it, and neither does your employer. Now suppose that your clinic’s EHR was breached and a hacker asked you for $470 to keep your data from being posted to the Internet. I think many would seriously consider forking over the cash.

To me this hypothetical healthcare situation is analogous to what happened with Ashley Madison. The membership data itself likely has little intrinsic value (even credit card data is only worth a few dollars). HOWEVER, the reputational value of this data is extremely high. The disruption and damage to the lives of Ashley Madison customers is enormous (though some say well deserved).

The fall-out for the company behind Ashley Madison (Avid Life Media – a Canadian company) will also be severe. They have completely lost the trust of their customers and I do not believe that any amount of market spin or heart-felt apology will be enough to save them from financial ruin.

I believe what Avid Life Media is going through is what most small to medium-sized clinics and #HealthIT vendors would face if all their patient data was exposed. Patients would utterly lose faith and take their business elsewhere (though admittedly that might be a little harder if other clinic choices were not covered by your insurance). Even if the organization could afford the HHS Office for Civil Rights fines for the data breach, the impact of lost patients and lost trust would be more devastating.

With the number of health data breaches increasing, how long before healthcare has its own version of Ashley Madison? We need to do more to protect patient data, it can no longer be an after-thought. Data security and privacy need to be part of the design process of software and of healthcare organizations.

Life’s short. Secure your data!

Patient Data Breach at UCLA Hospital System Possibly Impacting 4.5 Million Patients

Posted on July 17, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The LA Times is reporting that UCLA Health System has had a data breach possibly affecting 4.5 million patients. It’s the usual story of a HIPAA breach of this size. They saw some abnormal activity on one of their systems that contained a large amount of patient records. They don’t have any evidence that such data was taken, but hackers are usually really good about not leaving a trail when they take records.

Here are some comments from UCLA Health as quoted in the LA Times article linked above:

“We take this attack on our systems extremely seriously,” said Dr. James Atkinson, interim associate vice chancellor and president of the UCLA Hospital System.

In an interview, Atkinson said the hospital saw unusual activity in one of its computer servers in October. An investigation confirmed in May that the hackers had gained access to patient information.

“They are a highly sophisticated group likely to be offshore,” he said. “We really don’t know. It’s an ongoing investigation.”

I have yet to see a hospital say they don’t take a breach seriously. I’ve also never seen a hospital say that they were hacked by unsophisticated hackers that exploited their poor security (although, you can be sure that happens in every industry). Of course it had to be a sophisticated attack for them to breach their amazing security, right?

What’s not clear to me is why it took them so long to confirm they’d been hacked. The LA Times article says that they saw the unusual activity in October and it took until May to confirm that “the hackers had gained access to patient information.” Now we’re just getting the public notification in July? All of that seems long, but maybe the attack was just that sophisticated.

What’s scary for me is that these types of breaches have become so commonplace that I’m not surprised and it’s not shocking. In fact, they’ve almost become standard. Next up will be UCLA Health System setting up some type of credit protection service for their patients, assuming there was some financial data there as well. I don’t think we should treat these breaches as normal. They should be a wake-up call to everyone in the industry, but I’m sorry to say that it feels more like the norm than the exception.

Does Federal Health Data Warehouse Pose Privacy Risk?

Posted on June 23, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Not too long ago, few consumers were aware of the threat data thieves posed to their privacy, and far fewer had even an inkling of how vulnerable many large commercial databases would turn out to be.

But as consumer health data has gone digital — and average people have become more aware of the extent to which data breaches can affect their lives — they’ve grown more worried, and for good reason. As a series of spectacular data breaches within health plans has illustrated, both their medical and personal data might be at risk, with potentially devastating consequences if that data gets into the wrong hands.

Considering that these concerns are not only common, but pretty valid, federal authorities who have collected information on millions of HealthCare.gov insurance customers need to be sure that they’re above reproach. Unfortunately, this doesn’t seem to be the case.

According to an Associated Press story, the administration is storing all of the HealthCare.gov data in a perpetual central repository known as MIDAS. MIDAS data includes a lot of sensitive information, including Social Security numbers, birth dates, addresses and financial accounts. If stolen, this data could provide a springboard for countless cases of identity theft or even medical identity theft, both of which have emerged as perhaps the iconic crimes of 21st century life.

Both the immensity of the database and a failure to plan for destruction of old records are raising the hackles of privacy advocates. They definitely aren’t comfortable with the ten-year storage period recommended by the National Archives.

An Obama Administration rep told the AP that MIDAS meets or exceeds federal security and privacy standards, by which I assume he largely meant HIPAA regs. But it’s reasonable to wonder how long the federal government can protect its massive data store, particularly if commercial entities like Anthem — who arguably have more to lose — can’t protect their beneficiaries’ data from break-ins. True, MIDAS is also operated by a private concern, government technology contractor CACI, but the workflow has to be impacted by the fact that CMS owns the data.

Meanwhile, growing privacy breach questions are driven by reasonable concerns, especially those outlined by the GAO, which noted last year that MIDAS went live without an in-depth assessment of privacy risks posed by the system.

Another key point made by the AP report (which did a very good job on this topic, by the way, somewhat to my surprise) is that MIDAS’ mission has evolved from a facility for running analytics on the data to a central clearinghouse for data sharing between CMS and health insurance companies and state Medicaid organizations. And we all know that with mission creep can come feature creep; with feature creep comes greater and greater potential for security holes that are passed over and left to be found by intruders.

Now, private healthcare organizations will still be managing the bulk of consumer medical data for the near future. And they have many vulnerabilities that are left unpatched, as recent events have emphasized. But in the near term, it seems like a good idea to hold the federal government’s feet to the fire. The last thing we need is a giant loss of consumer confidence generated by a giant government data exposure.

Patients Demand the Best Care … for Their Data

Posted on June 22, 2015 | Written By

The following is a guest blog post by Art Gross, Founder of HIPAA Secure Now!.
Art Gross Headshot
Whether it’s a senior’s first fitting for a hearing aid, or a baby boomer in for a collagen injection, both are closely scrutinizing new patient forms handed to them by the office clerk.  With 100 million medical records breached and stolen to date, patients have every reason to be reluctant when they’re asked to fill out forms that require their social security number, driver’s license, insurance card and date of birth — all the ingredients for identity fraud.  Patients are so squeamish about disclosing their personal information, even Medicare has plans to remove social security numbers on patients’ benefits cards.

Now patients have as much concern about protecting their medical records as they do about receiving quality care, and they’re getting savvy about data protection.  They have every right to be assured by their physician that his practice is as concerned about their privacy as he is about their health.

But despite ongoing reports of HIPAA violations and continuous breaking news about the latest widespread patient data breach, medical practices continue to treat ePHI security as a lesser priority. And they neglect to train front office staff, so the patient who asks a receptionist where the practice stores her records either gets a quizzical look, or is told they’re protected in an EHR (but not how), or that they’re filed in a bank box in “the back room” (but not why).

In some cases, the practice may hide the fact that office staff is throwing old paper records in a dumpster.  Surprisingly this happens over and over.  Or, on the dark side, the receptionist accesses the EHR, steals patients’ social security numbers and other personal information and texts them to her criminal boyfriend for medical identity theft.

Another cybercrime threatening medical practices comes from hackers who attack a server through malware and encrypt all the medical files.  They hold the records hostage and ask for ransoms.  Medical records can vanish and the inability to access critical information about a patient’s medical condition could end up being life threatening.

Physicians should not only encrypt all mobile devices, servers and desktops, regularly review system activity, back up their servers and have a disaster recovery plan in place; they should also share their security practices and policies with the patient who asks how the office is protecting her records.
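A couple of the safeguards in that list can even be spot-checked with a few lines of code. Below is a minimal, hypothetical sketch of two such checks, backup freshness and after-hours chart access; the log format, office hours and one-day backup threshold are my own illustrative assumptions, not anything prescribed by HIPAA:

```python
from datetime import datetime, timedelta

def backup_is_stale(last_backup: datetime, now: datetime,
                    max_age: timedelta = timedelta(days=1)) -> bool:
    """Flag a backup older than the allowed window (here, one day)."""
    return now - last_backup > max_age

def after_hours_access(log, open_hour=7, close_hour=19):
    """Return log entries recorded outside office hours.
    log: iterable of (user, patient_id, timestamp) tuples."""
    return [entry for entry in log
            if not open_hour <= entry[2].hour < close_hour]

now = datetime(2015, 6, 22, 9, 0)
assert backup_is_stale(datetime(2015, 6, 19, 9, 0), now)       # three days old
assert not backup_is_stale(datetime(2015, 6, 21, 22, 0), now)  # last night

log = [("mary", "pt-1", datetime(2015, 6, 21, 14, 30)),
       ("mary", "pt-2", datetime(2015, 6, 22, 2, 15))]  # a 2 a.m. lookup
assert after_hours_access(log) == [log[1]]               # only the 2 a.m. entry
```

Real EHRs log far richer audit trails, but even this level of automated review catches the obvious lapses a patient might ask about.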

Otherwise, the disgruntled patient whose question about security is dismissed won’t only complain to her friends over coffee, she’ll spread the word on Facebook.  Next time a friend on Facebook asks for a referral the patient tells her not to go to her doctor — not because he’s an incompetent surgeon but because he doesn’t know the answer when she asks specifically if the receptionist has unlimited access to her records.

And word gets out through social media that the practice is ‘behind the times.’ The doctor earns a reputation for not taking the patient’s question seriously, and for not putting the proper measures in place to secure the patient’s data. This is the cockroach running through the restaurant that ends up on Yelp.

It’s time to pull back the curtain and tell patients how you’re protecting their valuable data.  Hand them a HIPAA security fact sheet with key measures you’ve put in place to gain their confidence.  For example, our practice:

  • Performs annual risk assessments, with additional security implemented, including encryption and physical security of systems that contain patient information
  • Shows patients that the organization has policies and procedures in place
  • Trains employees to watch for breach risks
  • Gives employees limited access to medical records
  • Backs up systems daily
  • Reviews system activity regularly
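The “limited access” item above is essentially role-based, minimum-necessary access: each role sees only the record fields it needs. Here is a toy sketch of the idea; the roles and field names are illustrative assumptions, not a regulatory list:

```python
# Minimum-necessary access: map each role to the fields it may view.
ROLE_FIELDS = {
    "receptionist": {"name", "appointment", "insurance_id"},
    "nurse":        {"name", "appointment", "vitals", "allergies"},
    "physician":    {"name", "appointment", "vitals", "allergies",
                     "diagnosis", "prescriptions", "ssn"},
}

def visible_fields(role: str, record: dict) -> dict:
    """Return only the parts of the record this role is allowed to see."""
    allowed = ROLE_FIELDS.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "A. Patient", "ssn": "123-45-6789",
          "diagnosis": "...", "appointment": "2015-06-22"}
assert "ssn" not in visible_fields("receptionist", record)
assert "diagnosis" in visible_fields("physician", record)
assert visible_fields("janitor", record) == {}
```

With a scheme like this in place, the receptionist in the earlier scenario simply never has the Social Security numbers to steal.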

Practices that communicate to patients how they are protecting their information, whether it’s provided by the front office staff, stated in a fact sheet or displayed on their websites, not only instill confidence and maintain their reputations, they actually differentiate themselves in the marketplace and attract new patients away from competitors.

About Art Gross
Art Gross co-founded Entegration, Inc. in 2000 and serves as President and CEO. As Entegration’s medical clients adopted EHR technology Gross recognized the need to help them protect patient data and comply with complex HIPAA security regulations. Leveraging his experience supporting medical practices, in-depth knowledge of HIPAA compliance and security, and IT technology, Gross started HIPAA Secure Now! to focus on the unique IT requirements of medical practices. Email Art at artg@hippasecurenow.com.

Full Disclosure: HIPAA Secure Now! is an advertiser on EMR and HIPAA.

Windows Server 2003 Support Ends July 14, 2015 – No Longer HIPAA Compliant

Posted on June 16, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

If this post feels like Groundhog Day, then you are probably remembering our previous post about Windows XP being retired and therefore no longer HIPAA compliant, and our follow-up article about a case where “unpatched and unsupported software” was penalized by OCR as a HIPAA violation.

With those posts as background, the same thing applies to Microsoft ending support for Windows Server 2003 on July 14, 2015. Many of you are probably wondering why I’m talking about software from 2003 that’s only now being sunset. Could people really still be using it in healthcare? The simple answer is yes, they are still using Windows Server 2003.

Mike Semel has a really great post about how to deal with the change to ensure you avoid any breaches or HIPAA penalties. In his post he highlights how replacing Windows Server 2003 is a much larger change than it was to replace Windows XP.

In the latter case, you were disrupting one user; in the former case, you’re likely disrupting a whole group of users. Plus, the process of moving a server to a new server and operating system is much harder than moving a desktop user to a new desktop. In fact, in most cases the only reason organizations hadn’t moved off Windows XP was budget. My guess is that many that are still on Windows Server 2003 are on it because the migration path to a newer server is hard or even impossible. This is why you had better start planning now to move off Windows Server 2003.
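The first step of that planning is simply knowing where the old servers are. If you can export a list of hostnames and OS versions from your asset inventory, even a trivial script can flag what is past end of support; the sketch below makes that assumption, and the inventory entries are hypothetical:

```python
from datetime import date

# Microsoft end-of-support dates for the two retirements discussed above.
END_OF_SUPPORT = {
    "Windows XP":          date(2014, 4, 8),
    "Windows Server 2003": date(2015, 7, 14),
}

def unsupported(inventory, today):
    """Return hosts whose OS is at or past its end-of-support date.
    inventory: iterable of (hostname, os_name) pairs."""
    return [host for host, os_name in inventory
            if END_OF_SUPPORT.get(os_name, date.max) <= today]

inventory = [("billing-srv", "Windows Server 2003"),
             ("front-desk",  "Windows 7"),
             ("imaging-srv", "Windows Server 2003")]
assert unsupported(inventory, date(2015, 7, 14)) == ["billing-srv", "imaging-srv"]
assert unsupported(inventory, date(2015, 7, 13)) == []
```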

I also love this section of Mike Semel’s post linked above which talks about the costs of a breach (which is likely to happen if you continue using unsupported and unpatched software):

The 2015 IBM Cost of a Data Breach Report was just released and the Ponemon Institute determined that a data breach of healthcare records averages $398 per record. You are thinking that it would never cost that much to notify patients, hire attorneys, and plug the holes in your network. You’re right. The report goes on to say that almost ¾ of the cost of a breach is in loss of business and other consequences of the breach. If you are a non-profit that means fewer donations. If you are a doctor or a hospital it could mean your patients lose trust and go somewhere else.
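To put those figures in perspective, here is the arithmetic for a hypothetical practice with 5,000 patient records (the record count is my example; the per-record cost and the roughly three-quarters indirect share come from the quoted report):

```python
COST_PER_RECORD = 398   # average cost per breached healthcare record (USD)
INDIRECT_SHARE = 0.75   # ~3/4 of the cost is lost business and other fallout

records_breached = 5_000
total_cost = records_breached * COST_PER_RECORD
indirect_cost = total_cost * INDIRECT_SHARE  # lost patients, lost trust
direct_cost = total_cost - indirect_cost     # notification, legal, remediation

print(f"Total:    ${total_cost:,.0f}")     # Total:    $1,990,000
print(f"Indirect: ${indirect_cost:,.0f}")  # Indirect: $1,492,500
print(f"Direct:   ${direct_cost:,.0f}")    # Direct:   $497,500
```

Nearly $1.5 million of that hypothetical $2 million breach is reputational, which is exactly the part no cyber-insurance policy gives back.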

I’m sure that some will come on here like they did on the Windows XP post and suggest that you can keep using Windows Server 2003 in a HIPAA compliant manner. This penalty tells me otherwise. I believe it’s a very risky proposition to continue using unsupported and unpatched software. Might there be some edge case where a specific application requires Windows Server 2003, and you could set up some mix of private networks, firewalls, access lists and other security to mitigate the risk of a breach of the unsupported software? In theory that’s possible, but it’s unlikely most of you reading this are in that position. So you had better get to work updating from Windows Server 2003.

Phase 2 HIPAA Audits Kick Off With Random Surveys

Posted on June 9, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

Ideally, the only reason you would know about the following is due to scribes such as myself — but for the record, the HHS Office for Civil Rights has sent out a bunch of pre-audit screening surveys to covered entities. Once it gets responses, it will do a Phase 2 audit not only of covered entities but also business associates, so things should get heated.

While these take the form of Meaningful Use audits, covering incentives paid from January 1, 2011 through June 30, 2014, it’s really more about checking how well you protect ePHI.

This effort is a drive to be sure that providers and BAs are complying with the HIPAA privacy, security and breach notification requirements. Apparently OCR found, during Phase 1 pilot audits in 2011 and 2012, that there was “pervasive non-compliance” with regs designed to safeguard protected health information, the National Law Review reports.

However, these audits aren’t targeting the “bad guys.” Selection for the audits is random, according to the HHS Office of Inspector General.

So if you get one of the dreaded pre-screening letters, how should you respond? According to a thoughtful blog post by Maryanne Lambert for CureMD, auditors will be focused on the following areas:

  • Risk Assessment audits and reports
  • EHR security plan
  • Organizational chart
  • Network diagram
  • EHR web sites and patient portals
  • Policies and procedures
  • System inventory
  • Tools to perform vulnerability scans
  • Central log and event reports
  • EHR system users list
  • Contractors supporting the EHR and network perimeter devices.

According to Lambert, the feds will want to talk to the person primarily responsible for each of these areas, a process which could quickly devolve into a disaster if those people aren’t prepared. She recommends that if you’re selected for an audit, you run through a mock audit ahead of time to make sure these staff members can answer questions about how well policies and processes are followed.

Not that anyone would take the presence of HHS on their premises lightly, but it’s worth bearing in mind that a stumble in one corner of your operation could have widespread consequences. Lambert notes that in addition to defending your security precautions, you have to make sure that all parts of your organization are in line:

Be mindful while planning for this audit as deficiencies identified for one physician in a physician group or one hospital within a multi-hospital system, may apply to the other physicians and hospitals using the same EHR system and/or implementing meaningful use in the same way.  Thus, the incentive payments at risk in this audit may be greater than the payments to the particular provider being audited.

But as she points out, there is one possible benefit to being audited. If you prepare well, it might save you not only trouble with HHS but possibly lawsuits for breaches of information. Hey, everything has some kind of silver lining, right?

Breaking Bad And HIT: Some Thoughts for Healthcare

Posted on June 2, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

Recently, I’ve been re-watching the blockbuster TV series hit “Breaking Bad” courtesy of Netflix. For those who haven’t seen it, the show traces the descent of a seemingly honest plain-Joe suburbanite from high school chemistry teacher to murderous king of a multi-state crystal meth business, all kicked off by his diagnosis of terminal lung cancer.

As the show clearly intends, it has me musing once again on how an educated guy with a family and a previously crime-free life can compromise everything that once mattered to him and ultimately, destroy nearly everything he loves.

And that, given that I write for this audience, had me thinking just as deeply about what turns ordinary healthcare workers into cybercriminals who ruthlessly exploit people’s privacy and put their financial survival at risk by selling the data under their control.

Sure, some of the data stealing is done by black-hat hackers who crack healthcare networks and mine them for data at the behest of organized crime groups. But then there are the surprises. Like the show’s central character, Walter White, some healthcare cybercriminals seem to come out of the blue, relative “nobodies” with no history as gangsters or thieves who suddenly find a way to rationalize stealing data.

I’d bet that if you dug into the histories of those healthcare employees who “break bad” you’d find that they have a few of the following characteristics in common:

* Feeling underappreciated: Like Walter White, whose lowly chemistry-teacher job was far below his abilities, data-stealing employees may feel that their talents aren’t appreciated and that they’ll never “make it” via a legitimate path.

* Having a palatable excuse: Breaking Bad’s dying anti-hero was able to rationalize his behavior by telling himself that he was doing what he did to protect his family’s future well-being. Rogue employees who sell data to the highest bidder may believe that they’re committing a victimless crime, or that they deserve the extra income to make up for a below-market salary.

* Willful ignorance: Not once, during the entire run of BB, does White stop and wonder (out loud at least) what harm his flood of crystal meth is doing to its users. While it doesn’t take much imagination to figure out how people could be harmed by having their medical privacy violated — or especially, having their financial data abused — some healthcare workers will just choose not to think about it.

* Greed: No need to explain this one — though people may restrain naturally greedy impulses if the other factors listed above aren’t present. You can’t really screen for it, sadly, despite the damage it can do.

So do you have employees in your facilities on the verge of breaking bad and betraying the trust their stewardship of healthcare data conveys? Taking a look around for bitter, dissatisfied types might be worth a try.

Healthcare Providers and Patients Deserve Better Security

Posted on June 1, 2015 | Written By

The following is a guest blog post by Anna Drachenberg, Founder and CEO of HIPAA Risk Management.
Anna Drachenberg

Our firm has been helping dentists and other healthcare providers with their HIPAA security compliance for several years. Based on our customers’ experience, many dentists lack healthcare IT partners who are committed to data security and HIPAA compliance.  Unfortunately, this lack of commitment appears to be an epidemic across healthcare IT, and healthcare providers and patients need to demand a change.

In our recent alert, Dentrix Vulnerabilities and Mitigation for HIPAA Compliance, we described two major vulnerabilities we’ve had to assist our clients in mitigating in order to protect their patients’ data and comply with our clients’ HIPAA security policies. Our regulatory and data security experts were concerned, on behalf of our clients, with the way Henry Schein handled these two issues. More concerning, this seems to be a trend with many healthcare IT companies.

From the article, “In October 2012, it was reported to the Computer Emergency Response Team (CERT) that all Dentrix G5 software was installed with hard-coded credentials to access the back-end database.” Pretty serious, right? The National Vulnerability Database gave this a severity score of 5.0 and an exploitability score of 10.0. In the CERT notification you can see that the vulnerability was credited to Justin Shafer, not the vendor, Henry Schein, and several months elapsed between the time the exploit was reported (11/22/2012) and the time Henry Schein released a fix for the issue (2/13/2013). Read the linked article for more details on the fix Henry Schein provided.
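For readers who haven't run into this class of bug: hard-coded credentials mean every installation ships with the same secret, so extracting it once exposes every customer's back-end database. Below is a minimal sketch of the anti-pattern and of one common mitigation, supplying per-installation secrets at deploy time. All names and values here are hypothetical illustrations, not taken from Dentrix:

```python
import os

# Anti-pattern: credentials baked into the shipped product. Anyone who
# pulls them out of one install can connect to every customer's database.
DB_USER = "dentapp"          # hypothetical example values
DB_PASSWORD = "hardcoded123"

def connect_insecure():
    # A real app would open a DB connection here; we just return the creds.
    return (DB_USER, DB_PASSWORD)

# Mitigation sketch: per-installation secrets supplied by the environment
# at deploy time, never stored in source control or the installer.
def connect_from_env():
    user = os.environ["APP_DB_USER"]
    password = os.environ["APP_DB_PASSWORD"]
    return (user, password)
```

Environment variables are only one option; the point is that the secret differs per site and is rotatable without shipping a new binary.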

In a time when most industries are embracing security and offering “bug bounties,” many in the healthcare IT industry are trying to ignore the problem and hope that their customers are ignoring it, too. Take the recent panic over hackers controlling airplanes. What did United Airlines do? Offer a bug bounty that pays out in airline miles that can be redeemed for free tickets. Most software and IT companies offer similar bug bounty programs and actively cooperate with independent security professionals. These companies know that every bug that is found before it is exploited can save millions of dollars and improve their product.

I’d like to challenge all of the blog readers today to find a healthcare IT vendor who has the same approach to security. For that matter, do a search on the CERT vulnerability database or the National Vulnerability Database for any healthcare software or product you know, or general terms like medical, hospital, healthcare. Surprised at the lack of issues reported and fixed? Are we really supposed to believe that healthcare IT developers are superior to other industries?

Note: The only results in a search I did on 5/30/2015 of the National Vulnerability Database for “Epic” returns vulnerabilities in the Epic Games Unreal Tournament Engine. It is good to know that my video game company cares about my data security.

Everyone who purchases, administers, and uses healthcare IT systems and software deserves vendors who are committed to security. Consider for a moment: the customers of these products are the responsible parties for ensuring the security of the data they put into these systems. Although the change to business associates under the HIPAA Omnibus Rule puts more liability on some of these vendors, the covered entity is still ultimately responsible and takes the hit to its reputation. Patients, the ones who experience harm when these systems are breached, have to rely on their doctors and other healthcare providers to ensure that the healthcare IT software and products are secure. I don’t know about you, but I really hope that my physician spent more time in medical school learning about medicine than he did about encryption.

It’s time for all of us in the healthcare industry to demand that our vendors have the same level of commitment to security as the healthcare providers who are their customers. It’s time for all of us as patients to demand that these vendors improve the security of the products used by our healthcare providers.

One last note. In our alert, we link to Dentrix’s notice on the type of “encryption” they offer on one of their products. From Dentrix’s article:

“Henry Schein introduced cryptographic technology in Dentrix version G5 to supplement a practice’s employee policies, physical safeguards and data security. Available only in Dentrix G5, we previously referred to this feature as encryption. Based on further review, we believe that referring to it as a data masking technique using cryptographic technology would be more appropriate. Regardless of what you call it…”

To your clients, it matters what the federal government “calls” it, and they don’t call it encryption.
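The distinction matters in practice: masking or encoding can be reversed by anyone who knows the scheme, while encryption requires a secret key. The toy sketch below illustrates only that distinction. It is not Dentrix's actual scheme, and real ePHI should be protected with a vetted cipher (such as AES from a maintained crypto library), never a homemade construction like this one:

```python
import base64
import hashlib

record = b"SSN: 123-45-6789"

# "Masking"/encoding: an obfuscating transform that needs no secret to
# reverse. Anyone who recognizes the scheme recovers the data.
masked = base64.b64encode(record)
assert base64.b64decode(masked) == record  # reversible with no key at all

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with a keystream derived from the key.
    A teaching toy only; real systems must use a vetted cipher."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

ciphertext = toy_encrypt(b"secret-key", record)
assert toy_encrypt(b"secret-key", ciphertext) == record  # right key recovers it
assert toy_encrypt(b"wrong-key", ciphertext) != record   # wrong key does not
```

That last pair of assertions is the whole point: without the key, the ciphertext is useless, which is not a property base64 or any other "masking" gives you.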

About Anna Drachenberg
Anna Drachenberg has more than 20 years in the software development and healthcare regulatory fields, having held management positions at Pacificare Secure Horizons, Apex Learning and the Food and Drug Administration. Anna co-founded HRM Services, Inc., (hipaarisk.com) a data security and compliance company for healthcare. HRM offers online risk management software for HIPAA compliance and provides consulting services for covered entities and business associates. HRM has clients nationwide and also partners with IT providers, medical associations and insurance companies.

Knotty Problems Surround Substance Abuse Data Sharing via EMRs

Posted on May 27, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

As I see it, rules giving mental health and substance abuse data extra protection are critical. Maybe someday, there will be little enough stigma around these illnesses that special privacy precautions aren’t necessary, but that day is far in the future.

That’s why a new bill filed by Reps. Tim Murphy (R-Pa.) and Paul Tonko (D-N.Y.), aimed at simplifying sharing of substance misuse data between EMRs, deserves a close look from those of us who track EMR data privacy. Tonko and Murphy propose to loosen federal rules on such data sharing, such that a single filled-out consent form from a patient would allow data sharing throughout a hospital or health system.

As things currently stand, federal law bars federally-assisted substance abuse programs, in most cases, from sharing personally-identifiable patient information with other entities unless they have a disclosure consent. What’s more, each entity that receives the data must obtain a fresh consent from the patient before the data can be shared again.

At a recent hearing on the 21st Century Cures Act, Rep. Tonko argued that the federal requirements, which became law before EMRs were in wide use, were making it more difficult for individuals fighting a substance abuse problem to get the coordinated care they needed.  While the rules might have been effective privacy protections at one point, today the need for patients to repeatedly approve data sharing merely interferes with providers’ ability to offer value-based care, he suggested. (And it’s hard to see how running into such walls could be anything but a problem for ACOs.)

Clearly, Tonko’s goals can be met in some form.  In fact, other areas of the clinical world are making great progress in sharing mental health data while avoiding data privacy entanglements. For example, a couple of months ago the National Institute of Mental Health announced that its NIMH Limited Datasets project, including data from 23 large NIMH-supported clinical trials, just sent out its 300th dataset.

Rather than offering broad access to the data with individual identifiers stripped out, the datasets contain private human study participant information but are shared only with qualified researchers. Those researchers must be approved under a Data Use Certification agreement that specifies how the data may be used, including what data confidentiality and security measures must be taken.

Of course, practicing clinicians don’t have time to get special approval to see the data for every patient they treat, so this NIMH model doesn’t resolve the issues hospitals and providers face in providing coordinated substance abuse care on the fly.

But until a more flexible system is put in place, perhaps some middle ground exists in which clinicians outside the originating institution can be granted temporary, role-based “passes” offering limited access to patient-identifiable substance abuse data. That is something EMRs should be well equipped to support. And if they’re not, this would be a great time to ask why!
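To make the idea concrete, here is a minimal sketch of what such a temporary, role-based pass might look like. All of the names and roles here are hypothetical illustrations, not any real EMR's API; a production system would also need audit logging and revocation.

```python
# Hypothetical sketch: a temporary, role-based "pass" granting an outside
# clinician limited access to a patient's substance abuse records.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Roles permitted to view records under a pass (illustrative only).
ALLOWED_ROLES = {"treating_physician", "care_coordinator"}

@dataclass(frozen=True)
class AccessPass:
    clinician_id: str
    patient_id: str
    role: str                 # e.g. "treating_physician"
    expires_at: datetime      # timezone-aware expiry

def may_view(p: AccessPass, clinician_id: str, patient_id: str,
             now: Optional[datetime] = None) -> bool:
    """True only if the pass matches the requesting clinician and patient,
    carries an allowed role, and has not yet expired."""
    now = now or datetime.now(timezone.utc)
    return (p.clinician_id == clinician_id
            and p.patient_id == patient_id
            and p.role in ALLOWED_ROLES
            and now < p.expires_at)
```

The key design point is the expiry: access is scoped to a care episode rather than granted indefinitely, which is closer to the spirit of the existing consent rules than a blanket, system-wide authorization.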

Emerging Health Apps Pose Major Security Risk

Posted on May 18, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

As new technologies like fitness bands, telemedicine and smartphone apps have become more important to healthcare, the issue of how to protect the privacy of the data they generate has become more important, too.

After all, all of these devices use the public Internet to transmit data at some point along the way. Typically, telemedicine involves a direct connection to a remote server over an unsecured Internet connection (though vendors generally apply some form of encryption to the data in transit). When used clinically, monitoring technologies such as fitness bands hop from the band over a wireless link to a smartphone, which in turn uses the public Internet to send data to clinicians. And the public Internet is just one pathway among the myriad ways hackers could get access to this health data.
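Protecting a reading on that path is not exotic. Here's a minimal sketch of encrypting a device reading before it crosses an untrusted link, using the widely used third-party `cryptography` package (Fernet, an authenticated symmetric scheme). This is an illustration of the general technique, not any vendor's actual implementation.

```python
# Sketch: encrypting a fitness-band reading before it travels over the
# public Internet. Illustrative only; real deployments would pair this
# with TLS and proper key management.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, provisioned out-of-band
cipher = Fernet(key)

reading = b'{"patient_id": "12345", "heart_rate": 72}'
token = cipher.encrypt(reading)    # ciphertext safe to send over an untrusted link

# Only a holder of the key can recover (and verify) the reading.
assert cipher.decrypt(token) == reading
```

Because Fernet authenticates as well as encrypts, tampering with the token in transit causes decryption to fail rather than yield silently corrupted vitals.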

My hunch is that this exposure of data to potential thieves hasn’t generated a lot of discussion because the technology isn’t mature. And what’s more, few doctors actually work with wearables data or offer telemedicine services as a routine part of their practice.

But it won’t be long before these emerging channels for tracking and caring for patients become a standard part of medical practice.  For example, the use of wearable fitness bands is exploding, and middleware like Apple’s HealthKit is increasingly making it possible to collect and mine the data that they produce. (And the fact that Apple is working with Epic on HealthKit has lured a hefty percentage of the nation’s leading hospitals to give it a try.)

Telemedicine is growing at a monster pace as well.  One study from last year by Deloitte concluded that the market for virtual consults in 2014 would hit 70 million, and that the market for overall telemedical visits could climb to 300 million over time.

Given that the data generated by these technologies is medical, private and presumably protected by HIPAA, where’s the hue and cry over protecting this form of patient data?

After all, though a patient’s HIV or mental health status won’t be revealed by a health band’s activity data, telemedicine consults certainly can betray those conditions. And while a telemedicine consult won’t provide data on a patient’s current cardiovascular health, wearables can, and that data might be of interest to payers or even life insurers.

I admit that when the data being broadcast isn’t a clear-text summary of a patient’s condition, possibly bundled with their identity, credit card and health plan information, it doesn’t seem as likely that patients’ well-being can be compromised by medical data theft.

But all you have to do is look at human nature to see the flaw in this logic. I’d argue that if medical information can be intercepted and stolen, someone will find a way to make money from it. It’d be a good idea to prepare for that eventuality before a patient’s privacy is betrayed.