
Thoughts on Privacy in Health Care in the Wake of Facebook Scrutiny

Posted on April 13, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

A lot of health IT experts are taking a fresh look at the field’s (abysmal) record in protecting patient data, following the shocking Cambridge Analytica revelations, which cast a new and disturbing light on privacy practices throughout the computer field. Both Facebook and the many companies that would love to emulate its financial success are searching for general lessons that go beyond the oddities of the Cambridge Analytica mess. (Among other things, the mess involved a loose Facebook sharing policy that was tightened up a couple of years ago, and a purported “academic researcher” who apparently violated Facebook’s terms of service.)

I will devote this article to four lessons from the Facebook scandal that apply especially to health care data–or, more correctly, four ways in which Cambridge Analytica reinforces principles that privacy advocates have known for years. Everybody recognizes that the problems modern data sharing practices pose to public life are hard, even intractable, and I will have to content myself with helping to define the issues rather than presenting solutions. The lessons are:

  • There is no such thing as health data.

  • Consent is a meaningless concept.

  • The risks of disclosure go beyond individuals to affect the whole population.

  • Discrimination doesn’t have to be explicit or conscious.

The article will now lay out each concept, how the Facebook events reinforce it, and what it means for health care.

There is no such thing as health data

To be more precise, I should say that there is no hard-and-fast distinction between health data, financial data, voting data, consumer data, or any other category you choose to define. Health care providers are enjoined by HIPAA and other laws to fiercely protect information about diagnoses, medications, and other aspects of their patients’ lives. But a Facebook posting or a receipt from the supermarket can disclose that a person has a certain condition. The compute-intensive analytics that data brokers, marketers, and insurers apply with ever-growing sophistication are aimed at revealing these things. If the greatest impact on your life is that a pop-up ad for some product appears on your browser, count yourself lucky. You don’t know what else someone is doing with the information.
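
To make that concrete, here is a deliberately crude sketch (the card IDs, purchases, and keyword rules are all invented for illustration) of how ordinary retail records become de facto health data the moment someone applies even trivial analytics to them:

```python
# Toy illustration: inferring health conditions from supermarket receipts.
# Every ID, item, and rule below is made up for the sake of the example.

purchases = {
    "card-1001": ["glucose test strips", "sugar-free cookies", "lancets"],
    "card-1002": ["diapers", "prenatal vitamins", "folic acid"],
    "card-1003": ["bread", "milk", "coffee"],
}

# Keyword rules of the kind a data broker's model might learn.
condition_signals = {
    "diabetes": {"glucose test strips", "lancets", "sugar-free cookies"},
    "pregnancy": {"prenatal vitamins", "folic acid"},
}

for card, items in purchases.items():
    for condition, signals in condition_signals.items():
        hits = signals.intersection(items)
        if len(hits) >= 2:  # two independent signals -> confident inference
            print(f"{card}: likely {condition} (matched {sorted(hits)})")
```

Nothing in those receipts is “health data” under HIPAA, yet the output is a list of presumptive diagnoses.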

I feel a bit of sympathy for Facebook’s management, because few people anticipated that routine postings could identify ripe targets for fake news and inflammatory political messaging (except for the brilliant operatives who did that messaging). On the other hand, neither Facebook nor the US government acted fast enough to shut down the behavior and tell the public about it, once it was discovered.

HIPAA itself is notoriously limited. If someone can escape being classified as a health care provider or a provider’s business associate, they can collect data with abandon and do whatever they like (except in places such as the European Union, where laws in principle require them to use the data only for the purpose they cited while collecting it). App developers consciously strive to define their products in such a way that they sidestep the dreaded HIPAA coverage. (I won’t even go into the weaknesses of HIPAA and subsequent laws, which fail to take modern data analysis into account.)

Consent is a meaningless concept

Even the European Union’s new regulation (the much-publicized General Data Protection Regulation, or GDPR) allows data collection to proceed after user consent. Of course, data must be collected for many purposes, such as payment and shipping at retail web sites. And the GDPR–following a long-established principle of consumer rights–requires further consent if the site collecting the data wants to use it beyond its original purpose. But it’s hard to imagine, at the moment of consent, what uses the data will be put to, especially a couple of years in the future.

Privacy advocates have known, ever since “terms of service” became ubiquitous, that few people read them before they press the Accept button. And this is a rational ignorance. Even if you read the tiresome and legalistic terms of service (I always do), you are unlikely to understand their implications. So the problem lies deeper than tedious verbiage: even the most sophisticated user cannot predict what’s going to happen to the data she consented to share.

The health care field has advanced further than most by installing legal and regulatory barriers to sharing. We could do even better by storing all health data in a Personal Health Record (PHR) for each individual, instead of at the various doctors, pharmacies, and other institutions where it can be used for dubious purposes. But all use requires consent, and consent is always on shaky ground. There is also a risk (although I think it is exaggerated) that patients can be re-identified from de-identified data. But both data sharing and the uses of data must be more strictly regulated.
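
One standard way to reason about that re-identification risk is k-anonymity: if some combination of ordinary attributes (ZIP code, birth year, sex) is unique in a “de-identified” dataset, anyone who knows those attributes from another source, such as a voter roll, can pick the record out. A minimal sketch, with invented records:

```python
from collections import Counter

# A "de-identified" dataset: names removed, quasi-identifiers kept.
# All records are invented for this illustration.
records = [
    {"zip": "02139", "birth_year": 1954, "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "birth_year": 1954, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "02139", "birth_year": 1971, "sex": "M", "diagnosis": "flu"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size sharing one combination of quasi-identifiers.
    k == 1 means at least one person is unique in the dataset and can be
    re-identified by anyone who knows those attributes from elsewhere."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

print(k_anonymity(records, ["zip", "birth_year", "sex"]))  # -> 1 (unsafe)
```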

The risks of disclosure go beyond individuals to affect the whole population

The illusion that an individual can offer informed consent is matched by an even more dangerous illusion that the harm caused by a breach is limited to the individual affected, or even to his family. In fact, data collected legally and pervasively is used daily to make decisions about demographic groups, as I explained back in 1998. Democracy itself took a bullet when Russian political agents used data to influence the British EU referendum and the US presidential election.

Thus, privacy is not the concern of individuals making supposedly rational decisions about how much to protect their own data. It is a social issue, requiring a coordinated regulatory response.

Discrimination doesn’t have to be explicit or conscious

We have seen that data can be used to draw virtual red lines around entire groups of people. Data analytics, unless strictly monitored, reproduce society’s prejudices in software. This has a particular meaning in health care.

Discrimination against many demographic groups (African-Americans, immigrants, LGBTQ people) has been repeatedly documented. Very few doctors would consciously aver that they wish harm to people in these groups, or even that they dismiss their concerns. Yet it happens over and over. The same unconscious or systemic discrimination will affect analytics and the application of its findings in health care.
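
What might the “strict monitoring” mentioned above look like in practice? One rough screen, borrowed from the four-fifths rule used in US employment law, is to compare a model’s recommendation rate across demographic groups rather than trusting the model to be neutral. A minimal sketch, with invented numbers:

```python
# Audit sketch: flag groups whose rate of recommended follow-up care falls
# below 80% of the highest group's rate (the "four-fifths" screen).
# The counts are invented for this illustration.
referrals_by_group = {
    # group: (patients flagged for follow-up care, total patients)
    "group_a": (420, 1000),
    "group_b": (240, 1000),
}

rates = {g: flagged / total for g, (flagged, total) in referrals_by_group.items()}
baseline = max(rates.values())

for group, rate in rates.items():
    ratio = rate / baseline
    status = "OK" if ratio >= 0.8 else "POSSIBLE DISPARATE IMPACT"
    print(f"{group}: follow-up rate {rate:.0%} ({ratio:.0%} of highest) -> {status}")
```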

A final dilemma

Much has been made of Facebook’s policy of collecting data about “friends of friends,” which draws a wide circle around the person giving consent and infringes on the privacy of people who never consented. Facebook did end the practice that allowed Global Science Research to collect data on an estimated 87 million people. But the dilemma behind the “friends of friends” policy is how inextricably it is tied to the premise behind social media.

Lots of people like to condemn today’s web sites (not just social media, but news sites and many others–even health sites) for collecting data for marketing purposes. But as I understand it, the “friends of friends” phenomenon lies deeper. Finding connections and building weak networks out of extended relationships is the underpinning of social networking. It’s not just how networks such as Facebook can display to you the names of people they think you should connect with. It underlies everything about bringing you in contact with information about people you care about, or might care about. Take away “friends of friends” and you take away social networking, which has been the most powerful force for connecting people around mutual interests the world has ever developed.

The health care field is currently struggling with a similar demonic trade-off. We desperately hope to cut costs and tame chronic illness through data collection. The more data we scoop up and the more zealously we subject it to analysis, the more we can draw useful conclusions that create better care. But bad actors can use the same techniques to deny insurance, withhold needed care, or exploit trusting patients and sell them bogus treatments. The ethics of data analysis and data sharing in health care require an open, and open-eyed, debate before we go further.

Access to Encrypted iPhones – The Apple Encryption Debate

Posted on February 19, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network, which currently consists of 10 blogs containing over 8000 articles, with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading Health IT career job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The tech world is in a frenzy over the letter Apple’s CEO Tim Cook sent to the FBI in response to a request that Apple essentially create a backdoor to access the San Bernardino terrorist’s iPhone. It’s a messy and complex situation, one that pits government against industry and privacy advocates against security advocates. Tim Cook in his letter is right that “this moment calls for public discussion.”

My favorite venture capitalist blogger, Fred Wilson, summed it up best for me when he said this in response to Tim Cook’s assertion that the contents of your iPhone are none of Apple’s business:

That is not an open and shut case to me.

Of course I’d like the contents of my iPhone to be out of reach of everyone other than me. But if that means the contents of the iPhones of child pornographers, sex slave runners, narco gangsters, terrorists, and a host of other bad people are “none of our business” then that gives me pause.

I don’t think we can have it both ways. We have to choose one way or the other.

I think this is also complicated by the fact that Apple had unlocked phones previously. Albert Wenger expresses my fears around this subject:

We cannot and should not be living in digital fortresses any more than we are living in physical fortresses at home. Our homes are safe from thieves and from government not because they couldn’t get in if they wanted to but because the law and its enforcement prevents them from doing so. All we have to do is minimal physical security (lock the doors when you are out).

Please repeat after me: Surveillance is a political and legal problem, not a technical problem.

This quote is particularly interesting to me because this past weekend, while my family and I were away on a Presidents’ Day trip, someone broke into our house (side note: we’re all fine, and they realized once they got in that we didn’t have anything valuable to take; we mostly just had to deal with a broken door).

I feel similar to my favorite VC who said “I am struggling with this issue this morning, and I imagine many others are too.”

Turning to the healthcare perspective, privacy and security of health information is so important. It’s literally the intimate details of your life. I’ve heard some argue that Apple creating a way for the FBI to access this one phone would mean that all of our health information on iPhones would be at great risk of being compromised. I think that’s an exaggeration of what’s happening, but I understand the slippery slope argument.

What’s interesting is that none of us want our healthcare data to be compromised. However, if we were in a coma and the life-saving information was on our iPhone, we’d love for someone to have a way to access that information. I’ve seen startup companies that have built that ability into the iPhone home screen for just this purpose.

I guess I’m torn on the issue. Privacy is important, but so is security. This weekend I’m going to be chewing on “We cannot and should not be living in digital fortresses any more than we are living in physical fortresses at home.” The problem with this concept is that fortresses are something we can plan and build. The other solutions are much more complex.

Could the Drive to Value-Based Healthcare Undermine Security?

Posted on November 27, 2015 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

As we all know, the healthcare industry’s move toward value-based healthcare is forcing providers to make some big changes. In fact, a recent peer60 survey of 320 hospital leaders found that 64% of responding hospitals cited oncoming value-based reimbursement as their top challenge, while only 30% said the same of improving information security.

Now, the difference in concern over the two issues can be chalked up, at least in part, to the design of the survey. Obviously, there’s a good chance that a survey of CIOs would generate different results. But as the report’s authors noted, the survey might also have exposed a troublesome gap in priorities between health IT and the rest of the hospital C-suite.

It’s hardly surprising hospital leaders are focused on the life-and-death effects of a major change in payment policy. Ultimately, if a hospital can’t stay in business, protecting data won’t be an issue anymore. But if a hospital keeps its doors open, protecting patient data must be given a great deal of attention.

If there is a substantial gap between CIOs and their colleagues on security, my guess is that the reasons include the following:

  • Assuming CIOs can handle things:  Lamentable though it may be, less-savvy healthcare leaders may think of security as a tech-heavy problem that doesn’t concern them on a day-to-day level.
  • Managing by emergency:  Though they might not admit it publicly, reactive health executives may see security problems as only worth addressing when something needs fixing.
  • Fear of knowing what needs to be done:  Any intelligent, educated health exec knows that they can’t afford to let security be compromised, but they don’t want to face up to the time, money and energy it takes to do infosec right.
  • Overconfidence in existing security measures:  After approving the investment of tens or even hundreds of millions on health IT, non-tech health leaders may find it hard to believe that perfect security isn’t “built in” and complete.

I guess the upshot of all of this is that even sophisticated healthcare executives may have dysfunctional beliefs about health data security. And it’s not surprising that health leaders with limited technical backgrounds may prefer to attack problems they do understand.

Ultimately, this suggests to me that CIOs and other HIT leaders still have a lot of ‘splaining to do. To do their best with security challenges, health IT execs need support from the entire leadership team, and that will mean educating their peers on some painful realities of the trade.

After all, if security is to be an organization-wide process — not just a few patches and HIPAA training sessions — it has to be ingrained in everything employees do. And that may mean some vigorous exchanges of views on how security fosters value.

The Shifting Health Care IT Markets

Posted on November 5, 2015 | Written By John Lynn

I’m at the end of my Fall Healthcare IT Conference season (although I’m still considering attending RSNA for the first time) and, besides being thankful to be done with all the travel, I’m also taking a second to think about what I’ve learned over the past couple of months as I’ve traveled to a wide variety of conferences.

While the EHR market has been hot for many years, I’m seeing a big shift in purchasing toward three areas: Analytics/Population Health, Revenue Cycle Management, and Privacy/Security. This isn’t a big surprise; the EHR market has basically matured, and now even EHR vendors are looking for new ways to position their products. Here’s how I see the market evolving in each of these areas.

Analytics and Population Health
I could have easily added the other buzzword “patient engagement” to this category as well. There’s a whole mixture of technologies and approaches in this category of healthcare IT. In fact, it’s where I see some of the most exciting innovations in healthcare. Most of it is driven by some form of value-based reimbursement, or by organizations’ efforts to prepare for the shift to value-based reimbursement. However, many organizations are also eager to extract value from their EHR investment, and many are betting on these tools to help them realize value from their EHR data.

Revenue Cycle Management
We’re seeing a whole suite of revenue cycle solutions. For many years we’ve seen solutions that optimize an organization’s relationships with payers. Those are still popular, since it seems like most organizations never really fix the problem, so their need for revenue cycle management is cyclical. Along with these payer solutions, we’re seeing a whole suite of products and companies focused on patient payment solutions. This shift has been riding the wave of high-deductible plans in healthcare. As an organization’s patient-pay share increases, it looks for better ways to collect the patient portion of the bill.

Privacy and Security
There have been so many health care breaches, it’s hard to even keep up. Are we becoming numb to them? Maybe, but I still see many organizations investing in various privacy and security programs and tools whenever they hear about another breach. Plus, the meaningful use requirement to do a HIPAA Risk Assessment has built an entire industry focused on those risk assessments. You can be sure the coming HIPAA audits will accelerate those businesses even more.

What other areas are you seeing become popular in health care IT?

Dishonesty Ruins So Many Things

Posted on September 5, 2014 | Written By John Lynn

I’m always struck by this simple concept: dishonesty makes so many things more difficult than they should be.

We see this all over healthcare. Look, for example, at patient privacy and security. If people were just honest and thoughtful with patient data, our privacy and security challenges would be so much simpler. Imagine how much time and heartache we’d save if people were just honest when it comes to privacy and security. Yes, I’m looking at the millions of hackers who are trying to steal people’s personal information. Imagine if we could take all the money and time we spend securing applications and apply it to improving healthcare. What a difference that would make.

The same could be said for reimbursement. Our reimbursement system would look drastically different if people were just honest. Yes, I’m talking about the billions of dollars of Medicare and other insurance fraud that’s out there. What a sad drain on our current healthcare system as dishonest people try to make a quick buck. While that expense is large, the even larger cost to our healthcare system is the toll that fraud takes on the honest actors.

Look at our current model of reimbursement for healthcare. So much of our insane documentation effort is tied to the fact that insurance companies are trying to combat fraud. They don’t and can’t trust providers’ billing levels, so they’ve created layer upon layer of requirements that make the healthcare documentation process miserable. If you don’t agree with me, then you aren’t someone who’s involved in healthcare reimbursement.

This expense gets passed on to employers and patients as well. Have you ever tried to make sense of the bill or statement of benefits coming from your doctor or insurance company? It’s like trying to read a language you were never taught. Are they screwing you over in what they’re billing you, or not? You don’t know either way, and good luck trying to find out the answer. The person on the other end of the phone likely isn’t sure either, because it’s so complex.

I first learned this principle in the credit card world. Why on earth do we pay 3+% on every transaction we make with our credit card? The answer is simple: credit card fraud (otherwise known as dishonesty) is rampant, and that’s why credit card transactions cost so much. Imagine a world where the doctor wasn’t giving up 3% of their revenue to process a credit card transaction, since the marginal cost of moving digital bits should be close to nothing.
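
A quick back-of-the-envelope calculation (the payment volume is an assumed figure; the ~3% rate is the one discussed above) shows what that fee means for a single practice:

```python
# Back-of-the-envelope: what the ~3% card fee costs a practice each year,
# versus the near-zero marginal cost of moving digital bits.
annual_card_payments = 500_000  # assumed: dollars of patient card payments/year
fee_rate = 0.03                 # the ~3% processing fee discussed above

annual_fee = annual_card_payments * fee_rate
print(f"Processing fees: ${annual_fee:,.0f} per year")  # -> $15,000 per year
```

That $15,000 a year, on transactions whose marginal cost is effectively zero, is the fraud tax in miniature.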

Unfortunately, the reality is we do live in a world with a lot of dishonest people who try and game anything and everything. We have to pay attention to security and privacy with these dishonest people in mind. We have to deal with insane reimbursement requirements as these payers try and combat fraud. We have to deal with credit card fraud and pay for it in the process.

It’s unfortunate, because dishonesty almost always catches up with people. Even when we think it doesn’t, dishonesty pays its own toll on a person, who can never be comfortable. Having a clear, honest conscience is one of the most beautiful things in life.

Going Beyond EHR Data Collection to EHR Data Use with Dr. Dan Riskin

Posted on December 5, 2013 | Written By John Lynn

We had a chance to sit down and do a Google Plus hangout with Dan Riskin, MD, CEO and co-founder of Health Fidelity, to discuss the challenges of EHRs today and how we can reach the real benefits of EHR adoption. We had a great discussion about how the industry is so caught up in just getting the data into the EHR software that we’re missing the opportunity to benefit from actually using the EHR data.

For some reason the Google hangout audio and video didn’t sync right (welcome to the cutting edge of technology), but the audio is good. Just start up the video below and enjoy listening to it like a podcast or radio show. I expect that’s what most of you do with our videos anyway.

I hope you’ll enjoy my interview with Dr. Riskin.

Fitbit Privacy or Lack Thereof – Exposing Sexual Activity of Its Users

Posted on September 13, 2011 | Written By John Lynn

Well, privacy rears its ugly head in healthcare again. I don’t want to treat a person’s privacy lightly, but I must admit that I kind of had to laugh at the breach I’m about to tell you about. I think you’ll see why.

I first read about this privacy breach in this TechCrunch article (they originally found it on The Next Web). Here’s a quote from the TechCrunch article:

Yikes. Users of fitness and calorie tracker Fitbit may need to be more careful when creating a profile on the site. The sexual activity of many of the users of the company’s tracker and online platform can be found in Google Search results, meaning that these users’ profiles are public and searchable.

I’ve been a big fan of Fitbit and other devices like it that are trying to track a person’s health and fitness. I think there’s a real market for these devices, but this is a pretty ugly misstep for Fitbit. A search for sexual activity and Fitbit isn’t returning results anymore, though. Here’s the Fitbit blog post which details the steps they’ve taken to secure their users’ profiles. It seems like a reasonable and smart response to the privacy issue.

Before I go any further, we should be clear that this isn’t a HIPAA violation. The users put their information online and agreed to have that information out there. We could argue about how much they really agreed to have their profiles public, but I’m quite sure that Fitbit would be fine in a HIPAA lawsuit. However, that doesn’t mean they’re not taking the hit for poor decisions.

What can future healthcare app and device companies learn from the privacy issues at Fitbit?

1. Default healthcare profiles to private. Allow the user to opt in to make it public. Some might want it public, but no company should assume it should be public. This isn’t Facebook.

2. Consider more granular privacy controls. I may want part of my profile public, but part private (e.g., sexual activity in a fitness application).

3. Be aware of what you allow search engines to index. There’s a whole category of attackers known as Google hackers, who use Google to find sensitive information like the story above. It’s amazing how powerful Google hacking can be. (A minimal sketch of points 1 and 3 follows below.)
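
Here is a minimal sketch of what lessons 1 and 3 could look like in code. It is a hypothetical app, not Fitbit’s actual implementation: profiles default to private, and even a profile the user has made public stays out of search results until the user separately opts in to indexing.

```python
# Hypothetical privacy-by-default profile service (Flask).
from flask import Flask, abort, make_response

app = Flask(__name__)

# Toy in-memory store; "public" and "indexable" both default to False.
profiles = {
    "alice": {"activity_log": "...", "public": False, "indexable": False},
}

@app.route("/profile/<username>")
def profile(username):
    user = profiles.get(username)
    if user is None or not user["public"]:
        abort(404)  # private profiles are simply never served
    resp = make_response(user["activity_log"])
    if not user["indexable"]:
        # Being public and being searchable are separate opt-ins.
        resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp
```

The design point is that “public” and “searchable” are two separate switches, both off until the user flips them.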

Some suggestions for e-patients who put their health data online:

1. Be careful about what information you’re putting online.

2. Check out where the information you put online will be available. Is it private? Is it public? Is it partially public? Can search engines see it?

There’s little doubt that more and more healthcare information is going to be put online by patients. We’re going to see more and more privacy issues like the one mentioned above. This incident will do little to deter this trend. However, hopefully it can serve as a learning experience for Fitbit and other healthcare companies that are entering this new world of online health information.