
Will Data Aggregation For Precision Medicine Compromise Patient Privacy?

Posted on April 10, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Like anyone else who follows medical research, I’m fascinated by the progress of precision medicine initiatives. I often find myself explaining to relatives that in the (perhaps far distant) future, their doctor may be able to offer treatments customized specifically for them. The prospect is awe-inspiring even for me, someone who’s been researching and writing about health data for decades.

Be that as it may, there are problems with bringing so much personal information together into a giant database, suggests Jennifer Kulynych in an article for OUPblog, which is published by Oxford University Press. In particular, assembling a massive trove of individual medical histories and genomes may have serious privacy implications, she says.

In arguing her point, she makes a sobering observation that rings true for me:

“A growing number of experts, particularly re-identification scientists, believe it simply isn’t possible to de-identify the genomic data and medical information needed for precision medicine. To be useful, such information can’t be modified or stripped of identifiers to the point where there’s no real risk that the data could be linked back to a patient.”

As she points out, norms in the research community make it even more likely that patients could be individually identified. For example, while a doctor might need your permission to test your blood for care, in some states it’s quite legal for a researcher to take possession of blood not needed for that care, she says. Those researchers can then sequence your genome and place that data in a research database, and you may never have consented to this, or even know that it happened.

And there are other, perhaps even more troubling, ways in which existing laws fail to protect the privacy of patients in researchers’ data stores. For example, current research and medical regulations let review boards waive patient consent, or even allow researchers to call DNA sequences “de-identified” data, a label that flies in the face of the growing expert consensus that such data can, in fact, be re-identified, she writes.

On top of all of this, the technology to leverage this information for personal identification already exists. For example, genome sequences can potentially be re-identified through comparison to a database of identified genomes, and law enforcement organizations have already used genomic data to predict key aspects of an individual’s appearance, such as eye color and race.
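
To make that comparison attack concrete, here is a minimal, purely illustrative Python sketch of matching a supposedly de-identified genotype profile against a database of identified genomes. The names, SNPs and genotypes are invented, and real re-identification pipelines are far more sophisticated.

```python
# Illustrative only: matching a "de-identified" SNP profile against a
# database of identified genomes. All names and genotypes are hypothetical.
identified_db = {
    "Jane Doe": {"rs4680": "AA", "rs1805007": "CT", "rs12913832": "GG"},
    "John Roe": {"rs4680": "AG", "rs1805007": "CC", "rs12913832": "AG"},
}

unknown_sample = {"rs4680": "AA", "rs1805007": "CT", "rs12913832": "GG"}

def match_score(a, b):
    """Fraction of shared SNPs whose genotypes agree."""
    shared = set(a) & set(b)
    return sum(a[s] == b[s] for s in shared) / len(shared) if shared else 0.0

best = max(identified_db, key=lambda name: match_score(unknown_sample, identified_db[name]))
print(best, match_score(unknown_sample, identified_db[best]))  # Jane Doe 1.0
```

With a realistic number of SNPs, even a handful of markers narrows a match down dramatically, which is exactly why the experts quoted above doubt that genomic data can be meaningfully de-identified.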

Then there’s the issue of what happens with EMR data storage. As the author notes, healthcare organizations are increasingly adding genomic data to their stores, and sharing it widely with individuals on their network. While such practices are largely confined to academic research institutions today, this type of data use is growing, and could also expose patients to involuntary identification.

Not everyone is as concerned as Kulynych about these issues. For example, a group of researchers recently concluded that a single patient anonymization algorithm could offer a “standard” level of privacy protection to patients, even when the organizations involved are sharing clinical data. They argue that larger clinical datasets that use this approach could protect patient privacy without generalizing or suppressing data in a manner that would undermine its usefulness.
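
The article doesn’t spell out the researchers’ algorithm, but a common yardstick behind such anonymization schemes is k-anonymity: every released record should be indistinguishable from at least k-1 others on its quasi-identifiers. Purely as background, here is a generic Python sketch of a k-anonymity check, with invented field names; it is not the researchers’ actual method.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k=5):
    """True if every combination of quasi-identifier values occurs in >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical clinical rows; a coarse age band and 3-digit ZIP act as quasi-identifiers.
records = [
    {"age_band": "40-49", "zip3": "021", "diagnosis": "I10"},
    {"age_band": "40-49", "zip3": "021", "diagnosis": "E11"},
]
print(is_k_anonymous(records, ["age_band", "zip3"], k=2))  # True
```

The tension Kulynych highlights is that the coarser you make the quasi-identifiers to pass a check like this, the less useful the data becomes for precision medicine.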

But if nothing else, it’s hard to argue with Kulynych’s central concern: that too few rules have been updated to reflect the realities of big genomic and medical data stores. Clearly, state and federal rules need to address the emerging problems associated with big data and privacy. Otherwise, by the time a major privacy breach occurs, neither patients nor researchers will have any recourse.

Consumers Fear Theft Of Personal Health Information

Posted on February 15, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Probably fueled by constant news about breaches – duh! – consumers continue to worry that their personal health information isn’t safe, according to a new survey.

As the press release for the 2017 Xerox eHealth Survey notes, last year more than one data breach was reported each day. So it’s little wonder that the survey – which was conducted online by Harris Poll in January 2017 among more than 3,000 U.S. adults – found that 44% of Americans are worried about having their PHI stolen.

According to the survey, 76% of respondents believe that it’s more secure to share PHI between providers through a secure electronic channel than to fax paper documents. This belief is certainly a plus for providers. After all, they’re already committed to sharing information as effectively as possible, and it doesn’t hurt to have consumers behind them.

Another positive finding from the study is that Americans also believe better information sharing across providers can help improve patient care. Xerox/Harris found that 87% of respondents believe that wait times to get test results and diagnoses would drop if providers securely shared and accessed patient information from varied providers. Not only that, 87% of consumers also said that they felt that quality of service would improve if information sharing and coordination among different providers was more common.

Looked at one way, these stats offer providers an opportunity. If you’re already spending tens or hundreds of millions of dollars on interoperability, it doesn’t hurt to let consumers know that you’re doing it. For example, hospitals and medical practices can put signs in their lobby spelling out what they’re doing by way of sharing data and coordinating care, have their doctors discuss what information they’re sharing and hand out sheets telling consumers how they can leverage interoperable data. (Some organizations have already taken some of these steps, but I’d argue that virtually any of them could do more.)

On the other hand, if nearly half of consumers are afraid that their PHI is insecure, providers have to do more to reassure them. Though few would understand how your security program works, letting them know how seriously you take the matter is a step forward. Also, it’s good to educate them on what they can do to keep their health information secure, as people tend to be less fearful when they focus on what they can control.

That being said, the truth is that healthcare data security is a mixed bag. According to a study conducted last year by HIMSS, while most organizations conduct IT security risk assessments, many IT execs have only occasional interactions with top-level leaders. Also, many are still planning out their medical device security strategy. Worse, provider security spending is often minimal. HIMSS notes that few organizations spend more than 6% of their IT budgets on data security, and 72% have five or fewer employees allocated to security.

Ultimately, it’s great to see that consumers are getting behind the idea of health data interoperability, and see how it will benefit them. But until health organizations do more to protect PHI, they’re at risk of losing that support overnight.

Healthcare Robots! – #HITsm Chat Topic

Posted on January 31, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 2/3 at Noon ET (9 AM PT). This week’s chat will be hosted by Mr RIMP (@MrRimp, Robot-In-My-Pocket), mascot of the first ever #HIMSS17 Innovation Makerspace! (Booth 7785) (with assistance from @wareflo) We’ll be discussing the topic “Healthcare Robots!” and so it seems appropriate to have a robot hosting the chat.

In a first, #HIMSS17 has a #makerspace (Booth 7785), in the HIMSS17 Innovation Zone. It has robots! They are rudimentary, but educational and fun. One of those robots is @MrRIMP, for Robot-In-My-Pocket. Here is a YouTube interview with @MrRIMP. As you can tell, little Mr. R. has a bit of an attitude. He also wrote the questions below and will moderate tweets about them during the #HITsm tweetchat.

According to the recent “How medical robots will change healthcare” (@PeterBNichol), there are three main areas of robotic health:

1. Direct patient care robots: surgical robots (used for performing clinical procedures), exoskeletons (for bionic extensions of self like the Ekso suit), and prosthetics (replacing lost limbs). Over 500 people a day lose a limb in America, and 2 million Americans are living with limb loss, according to the CDC.

2. Indirect patient care robots: pharmacy robots (streamlining automation, autonomous robots for inventory control reducing labor costs), delivery robots (providing medical goods throughout a hospital autonomously), and disinfection robots (interacting with people with known infectious diseases such as healthcare-associated infections or HAIs).

3. Home healthcare robots: robotic telepresence solutions (addressing the aging population with robotic assistance).

Before the #HITsm tweetchat, I hope you’ll watch Robot & Frank, about a household robot and an increasingly infirm retiree (86% on Rotten Tomatoes, available on YouTube, Amazon, iTunes, Vudu, and Google for $2.99). I’ll also note a subcategory to the direct care robots: pediatric therapy robots. Consider, for example, New Friends 2016, The Second International Conference on Social Robots in Therapy and Education. I, Mr. RIMP, have a special interest in this area.

Join us as we discuss Healthcare Robots during the February 3rd #HITsm chat. Here are the questions we’ll discuss:

T1: What is your favorite robot movie? Why? How many years in the future would you guess it will take to achieve similar robots? #HITsm

T2: Robots promise to replace a lot of human labor. Cost-wise, humanity-wise, will this be more good than bad, or more bad than good? #HITsm

T3: Have you played with, or observed, any “toy” robots? Impressed? Not impressed? Why? #HITsm

T4: IMO, “someday” normal, everyday people will be able to design and program their own robots. What kind of robot would you design for healthcare? #HITsm

T5: Robots and workflow? Connections? Think about healthcare robots working *together* with healthcare workers. What are potential implications? #HITsm

Bonus: Isn’t @MrRIMP (Robot-In-My-Pocket) the cutest, funniest little robot you’ve ever seen? Any suggestions for the next version (V.4) of me? #HITsm

Here’s a look at the upcoming #HITsm chat schedule:
2/10 – Maximizing Your HIMSS17 Experience – Whether Attending Physically or Virtually
Hosted by Steve Sisko (@HITConfGuy and @shimcode)

2/17 – Enough talk, let’s #GSD (Get Stuff Done)
Hosted by Burt Rosen (@burtrosen) from @healthsparq

2/24 – HIMSSanity Recovery Chat
With #HIMSS17 happening the week of this chat, we’ll take the week off from a formal chat. However, we encourage people that attended HIMSS or watched HIMSS remotely to share a “Tweetstorm” that tells a #HIMSS17 story, shares insights about a topic, rants on a topic of interest, or shows gratitude. Plus, it will be fun to test out a new form of tweetstorm Twitter chat. We’ll post more details as we get closer.

We look forward to learning from the #HITsm community! As always let us know if you have ideas for how to make #HITsm better.

If you’re searching for the latest #HITsm chat, you can always find the latest #HITsm chat and schedule of chats here.

Key Components of #HealthIT Strategy and Disaster Recovery – #HITsm Chat Topic

Posted on January 24, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 1/27 at Noon ET (9 AM PT). This week’s chat will be hosted by Bill Esslinger (@billesslinger) from @FogoDataCenters on the topic of “Key Components of Health IT Strategy and Disaster Recovery”.

Medical records are worth more on the black market than credit cards. They are more valuable because a medical record contains multiple credentials that hackers can exploit more than once or twice. A medical record contains not only a Social Security number but additional qualifying information, allowing thieves to penetrate layers of data and commit multiple acts of fraud before the data is even known to be missing.

As healthcare organizations expand their use of data sets, from analytics to precision medicine and value-based care, cybersecurity rises to the number one concern for CIOs.

How secure is your cloud-based data strategy?

Consideration must be given to the different models of service.

With each delivery model – Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) – comes a new set of requirements and responsibilities. The key considerations for deployment and ongoing data management include on-demand 24/7 access to critical healthcare information, support for big data and small data sets, traceability, HIPAA compliance, and a thorough understanding of the healthcare environment from both a security and a legal perspective.

Join us as we discuss Key Components of #HealthIT Strategy and Disaster Recovery during the January 27th #HITsm chat.

T1: How can we prepare for the unexpected in data security? #HITsm

T2: Are we making Cybersecurity a priority in risk management? #HITsm

T3: Is Your Prevention Strategy Scalable for a Ransomware Attack? #HITsm

T4: What are the top threats regarding healthcare data today? #HITsm

T5: What Service Levels are Necessary for Redundancy in Data, Power, Cooling, and Connectivity? #HITsm

Bonus: Do you worry about the security of your health information? Why or why not? #HITsm

About Fogo Data Centers
Fogo Data Centers are SSAE 16, SOC 2, and HIPAA compliant, as well as PCI compliant. Each site provides redundancies across all support systems. Our centers of excellence provide flexible and scalable solutions to protect your critical data and applications. Colocation at a Fogo Data Center can ease the cost of building your own facility and maintaining your own on-site dedicated servers. Properties feature full perimeter fencing with an electric gate requiring keycard access and audio/video check-in.

Our hashtag is #KnowYourCloud. We stand ready 24/7, with years of experience, integrity and legal know-how, to protect data and securely manage your cloud strategy. In the event of a disaster or incident, the Fogo team can have your facility back up and running within hours. Call us today or take a look at our facility page to learn more.

Here’s a look at the upcoming #HITsm chat schedule:

2/3 – Healthcare Robots!
Hosted by Mr RIMP (@MrRimp, Robot-In-My-Pocket), mascot of the first ever #HIMSS17 Innovation Makerspace! (Booth 7785) (with assistance from @wareflo)

2/10 – Maximizing Your HIMSS17 Experience – Whether Attending Physically or Virtually
Hosted by Steve Sisko (@HITConfGuy and @shimcode)

2/17 – Enough talk, let’s #GSD (Get Stuff Done)
Hosted by Burt Rosen (@burtrosen) from @healthsparq

2/24 – HIMSSanity Recovery Chat
With #HIMSS17 happening the week of this chat, we’ll take the week off from a formal chat. However, we encourage people that attended HIMSS or watched HIMSS remotely to share a “Tweetstorm” that tells a #HIMSS17 story, shares insights about a topic, rants on a topic of interest, or shows gratitude. Plus, it will be fun to test out a new form of tweetstorm Twitter chat. We’ll post more details as we get closer.

We look forward to learning from the #HITsm community! As always let us know if you have ideas for how to make #HITsm better.

If you’re searching for the latest #HITsm chat, you can always find the latest #HITsm chat and schedule of chats here.

E-Patient Update:  You Need Our Help

Posted on January 20, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

I just read the results of a survey by Black Book Research suggesting that many typical consumers don’t trust, like or understand health IT.

The survey, which reached out to 12,090 adult consumers in September 2016, found that 57% of those interacting with health IT at hospitals or medical practices were skeptical of its benefit. Worse, 87% said they weren’t willing to share all of their information.

Up to 70% of consumers reported that they distrusted patient portals, medical apps and EMRs. Meanwhile, while many respondents said they were interested in using health trackers, 94% said that their physicians weren’t willing or able to synch wearables data with their EMR.

On the surface, these stats are discouraging. At a minimum, they suggest that getting patients and doctors on the same page about health IT continues to be an uphill battle. But there’s a powerful tactic providers can use which – to my knowledge – hasn’t been tried with consumers.

Introducing the consumer health IT champion

As you probably know, many providers have recruited physician or nurse “champions” to help their peers understand and adjust to EMRs. I’m sure this tactic hasn’t worked perfectly for everyone who’s tried it, but it seems to have an impact. And why not? Most people are far more comfortable learning something new from someone who understands their work and shares their concerns.

The thing is, few if any providers are taking the same approach in rolling out consumer health IT. But they certainly could. I’d bet that there are at least a few patients in every population who like, use and understand consumer health technologies, as well as having at least a sense of why providers are adopting back-end technology like EMRs. And we know how to get Great-Aunt Mildred to consider wearing a Fitbit or entering data into a portal.

So why not make us your health IT champions? After all, if you asked me to, say, hold a patient workshop explaining how I use these tools in my life, and why they matter, I’d jump at the chance. E-patients like myself are by our nature evangelists, and we’re happy to share our excitement if you give us a chance. Maybe you’d need to offer some HIT power users a stipend or a gift card, but I doubt it would take much to get one of us to share our interests.

It’s worth the effort

Of course, most people who read this will probably flinch a bit, as taking this on might seem like a big hassle. But consider the following:

  • Finding such people shouldn’t be too tough. For example, I talk about wearables, mobile health options and connected health often with my PCP, and my enthusiasm for them is a little hard to miss. I doubt I’m alone in this respect.
  • All it would take to get started is to get a few of us on board. Yes, providers may have to market such events to patients, offer them coffee and snacks when they attend, and perhaps spend time evaluating the results on the back end. But we’re not talking major investments here.
  • You can’t afford to have patients fear or reject IT categorically. As value-based care becomes the standard, you’ll need their cooperation to meet your goals, and that will almost certainly include access to patient-generated data from mobile apps and wearables. People like me can address their fears and demonstrate the benefits of these technologies without making them defensive.

I hope hospitals and medical practices take advantage of people like me soon. We’re waiting in the wings, and we truly want to see the public support health IT. Let’s work together!

Connected Wearables Pose Growing Privacy, Security Risks

Posted on December 26, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

In the past, the healthcare industry treated wearables as irrelevant, distracting or worse. But over the last year or two, things have changed, with most health IT leaders concluding that wearables data has a place in their data strategies, at least in the aggregate.

The problem is, we’re making the transition to wearable data collection so quickly that some important privacy and security issues aren’t being addressed, according to a new report by American University and the Center for Digital Democracy. The report, Health Wearable Devices in the Big Data Era: Ensuring Privacy, Security, and Consumer Protection, concludes that the “weak and fragmented” patchwork of state and federal health privacy regulations doesn’t really address the problems created by wearables.

The researchers note that as smart watches, wearable health trackers, sensor-laden clothing and other monitoring technology get connected and sucked into the health data pool, the data is going places the users might not have expected. And they see this as a bit sinister. From the accompanying press release:

“Many of these devices are already being integrated into a growing Big Data digital health and marketing ecosystem, which is focused on gathering and monetizing personal and health data in order to influence consumer behavior.”

According to the authors, it’s high time to develop a comprehensive approach to health privacy and consumer protection, given the increasing importance of Big Data and the Internet of Things. If safeguards aren’t put in place, patients could face serious privacy and security risks, including “discrimination and other harms,” according to American University professor Kathryn Montgomery.

If regulators don’t act quickly, they could miss a critical window of opportunity, she suggested. “The connected health system is still in an early, fluid stage of development,” Montgomery said in a prepared statement. “There is an urgent need to build meaningful, effective, and enforceable safeguards into its foundation.”

The researchers also offer guidance for policymakers who are ready to take up this challenge. Their recommendations include creating clear, enforceable standards for both collection and use of information; formal processes for assessing the benefits and risks of data use; and stronger regulation of direct-to-consumer marketing by pharmas.

Now readers, I imagine some of you are feeling that I’m pointing all of this out to the wrong audience. And yes, there’s little doubt that the researchers are most worried about consumer marketing practices that fall far outside of your scope.

That being said, just because providers have different motives than the pharmas when they collect data – largely to better treat health problems or improve health behavior – doesn’t mean that you aren’t going to make mistakes here. If nothing else, the line between leveraging data to help people and using it to get your way is clearer in theory than in practice.

You may think that you’d never do anything unethical or violate anyone’s privacy, and maybe that’s true, but it doesn’t hurt to consider the possible harms that can come from collecting a massive pool of data. Nobody can afford to get complacent about the privacy and security risks involved. Plus, don’t think that nefarious and not-so-nefarious healthcare data aggregators aren’t coming after provider-stored health data as well.

Mobile Health App Makers Still Shaky On Privacy Policies

Posted on September 16, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

A new study has concluded that while mobile health app developers are developing better privacy practices, these developers vary widely in how they share those policies with consumers. The research, part of a program launched in 2011 by the Future of Privacy Forum, concludes that while mHealth app makers have improved their practices, too many are still not as clear as they could be with users as to how they handle private health information.

This year’s FPF Mobile App Study notes that mHealth players are working to make privacy policies available to users before purchase or download, by posting links on the app listing page. It probably has helped that the two major mobile health app distribution sites require apps that collect personal info to have a privacy policy in place, but consumer and government pressure has played a role as well, the report said. According to FPF researchers, mHealth app makers are beginning to explain how personal data is collected, used and shared, a step privacy advocates see as the bare minimum standard.

Researchers found that this year, 76% of top overall apps on the iOS App Store and Google Play had a privacy policy, up from 68% noted in the previous iteration of the study. In contrast, only 61% of health and fitness apps surveyed this year included a link to their privacy policies in their app store listing, 10% less than among top apps cutting across all categories.  “Given that some health and fitness apps can access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device, their below-average performance is both unexpected and troubling,” the report noted.

This disquieting lack of thorough privacy protections extended even to apps collecting some of the most intimate data, the FPF report pointed out. In particular, a subset of mHealth developers aren’t doing anything much to make their policies accessible.

For example, researchers found that while 80% of apps helping women track periods and fertility across Google Play and the iOS App Store had privacy policies, just 63% of those apps had posted links to the policies. In another niche, sleep tracking apps, only 66% even had a privacy policy in place, and just 54% of these apps linked to the policy on their store page. (FPF terms this level of performance “dismal,” and it’s hard to disagree.)

Underlying this analysis is the unfortunate truth that there’s still no gold standard for mHealth privacy policies. To be sure, this may be due more to the complexity of the still-maturing mobile health ecosystem than to resistance to creating robust policies. But either way, this issue won’t go away on its own, so mHealth app developers will need to give their privacy strategy more thought.

NFL Players’ Medical Records Stolen

Posted on June 21, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’d been meaning to write about this story for a while now, but finally got around to it. In case you missed it, thousands of NFL players’ medical records were stolen. Here’s a piece of the Deadspin summary of the incident:

In late April, the NFL recently informed its players, a Skins athletic trainer’s car was broken into. The thief took a backpack, and inside that backpack was a cache of electronic and paper medical records for thousands of players, including NFL Combine attendees from the last 13 years. That would encompass the vast majority of NFL players.

The Redskins later issued this statement:

The Washington Redskins can confirm that a theft occurred mid-morning on April 15 in downtown Indianapolis, where a thief broke through the window of an athletic trainer’s locked car. No social security numbers, Protected Health Information (PHI) under HIPAA, or financial information were stolen or are at risk of exposure.

The laptop was password-protected but unencrypted, but we have no reason to believe the laptop password was compromised. The NFL’s electronic medical records system was not impacted.

It’s interesting that the Redskins said the records didn’t include any PHI that would be covered by HIPAA rules and regulations. I was interested in how HIPAA would apply to an NFL team, so I reached out to David Harlow, writer of Health Blawg, for the answer. He offered these insights into whether NFL records are required to comply with HIPAA or not:

These records fall in a gray zone between employment records and health records. Clearly the NFL understands what’s at stake if, as reported, they’ve proactively reached out to the HIPAA police. At least one federal court is on record in a similar case saying, essentially, C’mon, you know you’re a covered entity; get with the program.

Michael Magrath, current Chairman of the HIMSS Identity Management Task Force and Director of Healthcare Business at VASCO Data Security, offered this insight into the breach:

This is a clear example that healthcare breaches are not isolated to healthcare organizations. They apply to employers, including the National Football League. Teams secure and protect their playbooks and need to apply that philosophy to securing their players’ medical information.

Laptop thefts are commonplace and one of the most common entries (310 incidents) on the HHS Office for Civil Rights portal listing Breaches Affecting 500 or More Individuals. Encryption is one of the basic requirements to secure a laptop, yet organizations continue to gamble without it, and innocent victims can face a lifetime of identity theft and medical identity theft.

Assuming the laptop was Windows-based, security can be enhanced by replacing the static Windows password with two-factor authentication in the form of a one-time password. Without the authenticator to generate the one-time password, gaining entry to the laptop will be extremely difficult. Had encryption and strong authentication been combined to control entry into the laptop, the players’ and prospects’ protected health information would not be at risk; too often that protection is skipped because organizations and their members wish to avoid a few moments of inconvenience.
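
As a rough illustration of the second factor Magrath describes (and not his, VASCO’s, or the NFL’s actual implementation), here is a minimal Python sketch of the standard time-based one-time password algorithm (TOTP, RFC 6238) that an authenticator app or hardware token would generate; the secret shown is just a placeholder.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, interval=30, digits=6):
    """Compute a time-based one-time password per RFC 6238 (sketch only)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# The login screen would prompt for this code in addition to the password.
print(totp("JBSWY3DPEHPK3PXP"))  # placeholder secret, e.g. provisioned via QR code
```

Without the shared secret, an attacker who guesses or extracts the static password still can’t produce a valid code, which is the point Magrath is making.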

This story brings up some important points. First, healthcare is far from the only industry that has issues with breaches and things like stolen or lost laptops. Second, healthcare isn’t the only industry that sees the importance of encrypting mobile devices; despite that importance, however, many organizations still aren’t doing so. Third, HIPAA is an interesting law since it only covers PHI and covered entities. HIPAA omnibus expanded that to business associates. However, there are still a bunch of grey areas where it isn’t clear whether HIPAA applies. Plus, there are a lot of white areas where your health information is stored and HIPAA doesn’t apply at all.

Long story short, be smart and encrypt your health data no matter where it’s stored. Be careful where you share your health data. Anyone could be breached and HIPAA will only protect you so much (covered entity or not).
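
If you want to see how little effort field-level encryption takes, here is a minimal sketch using the widely used third-party cryptography package (pip install cryptography). It is illustrative only, with a made-up record, and is no substitute for full-disk encryption or a proper key management strategy.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep the key separate from the data it protects
f = Fernet(key)

record = b'{"patient": "example", "dx": "hypertension"}'   # hypothetical record
token = f.encrypt(record)          # this ciphertext is what gets written to disk
print(f.decrypt(token) == record)  # True, but only for someone holding the key
```

A stolen laptop or backup drive holding only ciphertext is a nuisance; one holding plaintext records is a breach.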

Steps In Integrating Patient-Generated Health Data

Posted on May 24, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

As the number of connected health devices in use has expanded, healthcare leaders have grappled with how to best leverage the data they generate. However, aside from a few largely experimental attempts, few providers are making active use of such data.

Part of the reason is that the connected health market is still maturing. With health tracking wearables, remote monitoring set-ups, mobile apps and more joining the chorus, it might be too soon to try and normalize all this data, much less harvest it for clinical use. Also, few healthcare organizations seem to have a mature strategy in place for digital health.

But technical issues may be the least of our problems. It’s important to note that providers have serious concerns around patient-generated health data (PGHD), ranging from questions about its validity to fears that such data will overwhelm them.

However, it’s possible to calm these fears, argues Christina Caraballo, senior healthcare strategist at Get Real Health.  Here’s her list of the top five concerns she’s heard from providers, with responses that may help put providers at ease:

  • Fear they’ll miss something in the flood of data: Add disclaimers, consent forms, video clips or easy-to-digest graphics clarifying what consumers can and can’t expect, explicitly limiting provider liability.
  • Worries over data privacy and security: Give consumers back some of the risk, by emphasizing that no medium is perfectly secure, including paper health records, and that they must determine whether the benefits of using digital health devices outweigh the risks.
  • Questions about data integrity and standardization: Emphasize that while the industry has made great progress on standardization, interoperability, authentication, data provenance, reliability, validity, clinical value and even workflow, the bottom line is that the data still comes from patients, who don’t always report everything regardless of how you collect the data.
  • Concerns about impact on workflow: Underscore that if the data is presented in the right framework, it will be digestible in much the same way as other electronic medical data.
  • Resistance to pressure from consumers: Don’t demand that providers leverage PGHD out of the gate; instead, move incrementally into PGHD management by letting patients collect data electronically, and then incorporate that data into clinical systems once all stakeholders are on board.

Now, I’m not totally uncritical of Ms. Caraballo’s article. In particular, I take issue with her assertion that providers who balk at using PGHD are “naysayers” who “simply don’t want to change.” While there are always a few folks fitting this description in any profession, the concerns she outlines aren’t trivial, and brushing them off with vague reassurances won’t work.

Truthfully, if I were a provider I doubt I would be comfortable relying on PGHD, especially biometric data. As Ingrid Oakley-Girvan of Medable notes, wearables giant Fitbit was hit with a lawsuit earlier this year alleging that its heart rate monitoring technology is inaccurate, and I wouldn’t be surprised if other such suits arise. Digital health trackers and apps have transitioned from novelty to quasi-official medical device very quickly (some might say too quickly), and being cautious about their output just makes sense.

Nonetheless, PGHD will play a role in patient care and management at some point in the future, and it makes sense to keep providers in the loop as these technologies progress. But rushing them into using such data would not be wise. Let’s make sure such technologies are vetted before they assume a routine role in care.

Are Ransomware Attacks A HIPAA Issue, Or Just Our Fault?

Posted on April 18, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

With ransomware attacks hitting hospitals in growing numbers, it’s becoming more urgent for healthcare organizations to have a routine and effective response to such attacks. While over the short term providers are focused mostly on survival, eventually they’ll have to consider big-picture implications, and one of the biggest is whether a ransomware intrusion can be called a “breach” under federal law.

As readers know, providers must report any sizable breach to the HHS Office for Civil Rights. So far, though, it seems that the feds haven’t issued any guidance as to how they see this issue. However, people in the know have been talking about this, and here’s what they have to say.

David Holtzman, a former OCR official who now serves as vice president of compliance strategies at security firm CynergisTek, told Health Data Management that as long as the data was never compromised, a provider may be in the clear. If an organization can show OCR proof that no data was accessed, it may be able to avoid having the incident classed as a breach.

And some legal experts agree. Attorney David Harlow, who focuses on healthcare issues, told Forbes: “We need to remember that HIPAA is narrowly drawn and data breaches defined as the unauthorized ‘access, acquisition, use or disclosure’ of PHI. [And] in many cases, ransomware ‘wraps’ PHI rather than breaches it.”

But as I see it, ransomware attacks should give health IT security pros pause even if they don’t have to report a breach to the federal government. After all, as Holtzman notes, the HIPAA security rule requires that providers put appropriate safeguards in place to ensure the confidentiality, integrity and availability of ePHI. And fairly or not, any form of malware intrusion that succeeds raises questions about providers’ security policies and approaches.

What’s more, ransomware attacks may point to underlying weaknesses in the organization’s overall systems architecture. “Why is the operating system allowing this application to access this data?” asked one reader in comments on a related EMR and HIPAA post. “There should be no possible way for a database that is only read/write for specified applications to be modified by a foreign encryption application,” the reader noted. “The database should refuse the instruction, the OS should deny access, and the security system should lock the encryption application out.”
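
One small way to approximate what that reader describes, short of full OS-level mandatory access controls, is to make sure an application’s handle to the data is read-only in the first place. Here is a minimal SQLite sketch in Python (illustrative only; the file name and schema are hypothetical):

```python
import sqlite3

# Create a tiny demo database; in a real deployment it would already exist.
setup = sqlite3.connect("patients.db")
setup.execute("CREATE TABLE IF NOT EXISTS patients (name TEXT)")
setup.commit()
setup.close()

# Re-open it read-only at the connection level: anything running through this
# handle, including code that hijacks the process, cannot rewrite rows.
conn = sqlite3.connect("file:patients.db?mode=ro", uri=True)
try:
    conn.execute("UPDATE patients SET name = 'encrypted!'")
except sqlite3.OperationalError as exc:
    print("write refused:", exc)  # "attempt to write a readonly database"
```

Of course, this does nothing to stop ransomware that encrypts the underlying file directly on disk, which is why the reader’s broader point about OS permissions and application allow-listing matters too.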

To be fair, not all intrusions are someone’s “fault.” Ransomware creators are innovating rapidly, and are arguably equipped to find new vectors of infection more quickly than security experts can track them. In fact, easy-to-deploy ransomware as a service is emerging, making it comparatively simple for less-skilled criminals to use. And they have a substantial incentive to do so. According to one report, one particularly sophisticated ransomware strain has brought $325 million in profits to groups deploying it.

Besides, downloading actual data is so five years ago. If you’re attacking a provider, extorting payment through ransomware is much easier than attempting to resell stolen healthcare data. Why go to all that trouble when you can get your cash up front?

Still, the reality is that healthcare organizations must be particularly careful when it comes to protecting patient privacy, both for ethical and regulatory reasons. Perhaps ransomware will be the jolt that pushes lagging players to step up and invest in security, as it creates a unique form of havoc that could easily put patient care at risk. I certainly hope so.