
Exchange Value: A Review of Our Bodies, Our Data by Adam Tanner (Part 3 of 3)

Posted on January 27, 2017 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous part of this article raised the question of whether data brokering in health care is responsible for raising or lowering costs. My argument that it increases costs looks at three common targets for marketing:

  • Patients, who are targeted by clinicians for treatments they may not need or may not have thought of

  • Doctors, who are directed by pharma companies toward expensive drugs that might not pay off in effectiveness

  • Payers, who pay more for diagnoses and procedures because analytics help doctors maximize charges

Tanner flags the pharma industry for selling drugs that perform no better than cheaper alternatives (Chapter 13, page 146), and even drugs that are barely effective at all despite having undergone clinical trials. Moreover, Tanner cites Hong Kong and Europe as places far more protective of personal data than the United States (Chapter 14, page 152), and they don't suffer higher health care costs; quite the contrary.

Strangely, there is no real evidence so far that data sales have produced either harm to patients or treatment breakthroughs (Conclusion, 163). But the supermarket analogy does open up the possibility that patients could be induced to share anonymized data voluntarily by being reimbursed for it (Chapter 14, page 157). I have heard this idea aired many times, and it fits with the larger movement called Vendor Relationship Management. The problem with such ideas is the close horizon limiting our vision in a fast-moving technological world. People can probably understand and agree to share data for particular research projects, with or without financial reimbursement. But many researchers keep data for decades and recombine it with other data sets for unanticipated projects. If patients are to sign open-ended, long-term agreements, how can they judge the potential benefits and potential risks of releasing their data?

Data for sale, but not for treatment

In Chapter 11, Tanner takes up the perennial question of patient activists: why can drug companies get detailed reports on patient conditions and medications, but my specialist has to repeat a test on me because she can't get my records from the doctor who referred me to her? Tanner mercifully shields us here from the technical arguments behind this question–sparing us, for instance, a detailed discussion of vagaries in HL7 specifications or workflow issues in the use of Health Information Exchanges–but strongly suggests that the problem lies with the motivations of health care providers, not with technical interoperability.

And this makes sense. Doctors do not have to engage in explicit “blocking” (a slippery term) to keep data away from fellow practitioners. For a long time they were used to just saying “no” to requests for data, even after that was made illegal by HIPAA. But their obstruction is facilitated by vendors equally uninterested in data exchange. Here Tanner discards his usual pugilistic journalism and gives Judy Faulkner an easy time of it (perhaps because she was a rare CEO polite enough to talk to him, and also because she expressed an ethical aversion to sharing patient data) and doesn’t air such facts as the incompatibilities between different Epic installations, Epic’s tendency to exchange records only with other Epic installations, and the obstacles it puts in the way of companies that want to interconnect.

Tanner does not address a revolution in data storage that many patient advocates have called for, which would at one stroke address both the Chapter 11 problem of patient access to data and the book’s larger critique of data selling: storing the data at a site controlled by the patient. If the patient determined who got access to data, she would simply open it to each new specialist or team she encounters. She could also grant access to researchers and even, if she chooses, to marketers.
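The patient-controlled model described above can be sketched in a few lines of code. This is a purely illustrative toy, not any real PHR product's API; the class and method names are invented for the example.

```python
# Hypothetical sketch: a patient-controlled record store in which the
# patient, not the provider, decides who may read her data.

class PatientRecordStore:
    """A toy store of health records gated by patient-issued grants."""

    def __init__(self):
        self._records = {}   # record_id -> record data
        self._grants = set() # parties the patient currently allows to read

    def add_record(self, record_id, data):
        self._records[record_id] = data

    def grant_access(self, party):
        """The patient opens her data to a new specialist, researcher,
        or even, if she chooses, a marketer."""
        self._grants.add(party)

    def revoke_access(self, party):
        self._grants.discard(party)

    def read(self, party, record_id):
        if party not in self._grants:
            raise PermissionError(f"{party} has no grant from the patient")
        return self._records[record_id]


# The patient grants each new specialist access as she encounters them.
store = PatientRecordStore()
store.add_record("lab-2017-01", {"test": "HbA1c", "value": 6.1})
store.grant_access("dr-specialist")
result = store.read("dr-specialist", "lab-2017-01")
```

The point of the sketch is the inversion of control: access decisions live with the patient's store, so a new specialist needs a grant from the patient, not a record transfer from the referring doctor.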

What we can learn from Chapter 9 (although Tanner does not tell us this) is that health care organizations are poorly prepared to protect data. In this woeful weakness they are just like TJX (owner of the T.J. Maxx stores), major financial institutions, and the Democratic National Committee. All of these leading institutions have suffered breaches enabled by weak computer security. Patients and doctors may feel reluctant to put data online in the current environment of vulnerability, but there is nothing special about the health care field that makes it more vulnerable than other institutions. Here again, storing the data with the individual patient may break it into smaller components and therefore make it harder for attackers to find.

Patient health records present new challenges, but the technology is in place and the industry can develop consent mechanisms to smooth out the processes for data exchange. Furthermore, some data will still remain with the labs and pharmacies that have to collect it for financial reasons, and the Supreme Court has given them the right to market that data.

So we are left with ambiguities throughout the area of health data collection. There are few clear paths forward and many trade-offs to make. In this I agree ultimately with Tanner. He said that his book was meant to open a discussion. Among many of us, the discussion has already started, and Tanner provides valuable input.

Are We Waiting For An Interoperability Miracle?

Posted on December 12, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Today, in reading over some industry news, my eyes settled on an advertising headline that gave me pause: “Is Middleware The Next Interoperability Miracle?”  Now, I have to admit a couple of things: 1) that vendors have to pitch the latest white paper with all the fervor they can command, and 2) that it never hurts to provoke conversation with a strong assertion. But seeing a professional advertisement include the word “miracle” — an expostulatory term which you might use to sell dishwashers — still took me aback a bit.

And then I began to think about what I had seen. I wondered whether it will really take a miracle to achieve health data interoperability sometime in our lifetime. I asked myself whether health IT insiders like you, dear readers, are actually that discouraged. And I wondered if any vendor truly believes that they can produce such a miracle, if indeed one is needed.

First, let’s ask ourselves about whether we need a Hail Mary pass or even a miracle to salvage industry hopes for data interoperability. I’m afraid that in my view, the answer is quite possibly yes. In saying this, I’m assuming that interoperability must arrive soon to meet our current needs, at least within the next several years.

Unfortunately, nothing I’ve seen suggests that we can realistically achieve robust interoperability within the next, say, 5 to 10 years, whatever the industry’s claims to the contrary. I know some readers may disagree with me, but as I see it the combination of technical and behavioral obstacles to interoperability is just too profound to be addressed in a timely manner.

Okay, then, on to whether health IT rank and file are so burned out on interoperability efforts that they just want the problem taken off of their hands. If they did, I would certainly sympathize, as the forces in play here are beyond the control of any individual IT staffer, consultant, hospital or health system. The forces holding back interoperability are interwoven with technical, financial, policy and operational issues which can’t be addressed without a high level of cooperation between competing entities — and perhaps not even then.

So, back to where we started. Headline aside, does the vendor in question or any other truly believe that they can engineer a solution to such an intractable problem, conquer world interoperability issues and grow richer than Scrooge McDuck? Probably not. Interoperability is a set of behaviors as much as a technology, and I doubt even the cockiest startup thinks it can capture that many hearts and minds.

Ultimately, though, whoever wrote that headline is probably keying into something real. While the people doing the hard work of attempting health data sharing aren’t exactly desperate, I think there’s a growing sense that we’re running out of time to get this thing done. Obviously, other than artificial ones imposed by laws and regulations, we aren’t facing any actual deadline, but things can’t go on like this forever.

In fact, I’d argue that if we don’t create a useful interoperability model soon, a window of opportunity for doing so will be lost for quite some time. After all, we can’t keep spending on this infrastructure if it’s never going to offer a payback.

The cold reality is that eventually, the data sharing system we have — such as it is — will fall apart of its own weight, as organizations simply stop paying for their part of it. So while we might not need a miracle as such, being granted one wouldn’t hurt. If this effort fails us, who knows when we’ll have the time and money to try again.

Sansoro Hopes Its Health Record API Will Unite Them All

Posted on June 20, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider.

After some seven years of watching the US government push interoperability among health records, and hearing how far we are from achieving it, I assumed that fundamental divergences among electronic health records at different sites posed problems of staggering complexity. I pricked up my ears, therefore, when John Orosco, CTO of Sansoro Health, said that they could get EHRs to expose real-time web services in a few hours, or at most a couple days.

What does Sansoro do? Its goal, like the FHIR standard, is to give health care providers and third-party developers a single go-to API where they can run their apps on any supported EHR. Done right, this service cuts down development costs and saves the developers from having to distribute a different version of their app for different customers. Note that the SMART project tries to achieve a similar goal by providing an API layer on top of EHRs for producing user interfaces, whereas Sansoro offers an API at a lower level on particular data items, like FHIR.

Sansoro was formed in the summer of 2014. Researching EHRs, its founders recognized that even though the vendors differed in many superficial ways (including the purportedly standard CCDs they create), all EHRs dealt at bottom with the same fields. Diagnoses, lab orders, allergies, medications, etc. are the same throughout the industry, so familiar items turn up beneath the varying terminology.
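That insight — the same underlying fields hiding behind different vendor vocabularies — can be illustrated with a small mapping layer. This is a sketch only; the vendor names and field names below are invented, and Sansoro's actual implementation is certainly more involved.

```python
# Illustrative sketch (all vendor and field names invented): different EHRs
# expose the same underlying data -- diagnoses, allergies, medications --
# under different names. An EHR-agnostic layer maps each vendor's
# vocabulary onto one canonical set of fields.

VENDOR_FIELD_MAPS = {
    "vendor_a": {
        "dx_code": "diagnosis",
        "allergy_list": "allergies",
        "rx": "medications",
    },
    "vendor_b": {
        "problem": "diagnosis",
        "adverse_rxn": "allergies",
        "med_orders": "medications",
    },
}

def normalize(vendor, raw_record):
    """Translate one vendor's record into the canonical field names,
    dropping fields the canonical model doesn't cover."""
    mapping = VENDOR_FIELD_MAPS[vendor]
    return {mapping[k]: v for k, v in raw_record.items() if k in mapping}

# Two superficially different vendor records normalize to the same shape,
# so an app written against the canonical fields runs on either EHR.
rec_a = normalize("vendor_a", {"dx_code": "E11.9", "rx": ["metformin"]})
rec_b = normalize("vendor_b", {"problem": "E11.9", "med_orders": ["metformin"]})
```

An app developer codes against the canonical names once, and the adapter absorbs each vendor's quirks — the same economics that make a single go-to API attractive.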

FHIR was just starting at that time, and is still maturing. Therefore, while planning to support FHIR as it becomes ready, Sansoro designed their own data model and API to meet the industry’s needs right now. They are gradually adding FHIR interfaces that they consider mature to their Emissary application.

Sansoro aimed first at the acute care market, and is expanding to support ambulatory EHR platforms. At the beginning, based on market share, Sansoro chose to focus on the Cerner and Epic EHRs, both of which offer limited web services modules to their customers. Then, listening to customer needs, Sansoro added MEDITECH and Allscripts; it will continue to follow customer priorities.

Although Orosco acknowledged that EHR vendors are already moving toward interoperability, their services are currently limited and focus on their own platforms. For various reasons, they may implement the FHIR specification differently. (Health IT experts hope that the Argonaut Project will ensure semantic interoperability for at least the most common FHIR items.) Sansoro, in contrast, can expose any field in the EHR using its APIs, thus serving the health care community’s immediate needs in an EHR-agnostic manner. Emissary may keep the field from going the way of the CCD, where each vendor can implement a different API and claim to be compliant.

This kind of fragmented interface is a constant risk in markets in which proprietary companies are rapidly entering and competing. There is also a risk, therefore, that many competitors will enter the API market as Sansoro has done, reproducing the minor and annoying differences between EHR vendors at a higher level.

But Orosco reminded me that Google, Facebook, and Microsoft all have competing APIs for messaging, identity management, and other services. The benefits of competition, even when people have to use different interfaces, drive a field forward, and the same can happen in healthcare. Two directions face us: to allow rapid entry of multiple vendors and learn from experience, or to spend a long time trying to develop a robust standard in an open manner for all to use. Luckily, given Sansoro and FHIR, we have both options.

Mobile PHRs On The Way — Slowly

Posted on October 24, 2013 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

On-demand mobile PHRs are likely to emerge over time, but not until the healthcare industry does something to mend its interoperability problems, according to a new report from research firm Frost & Sullivan.

As the paper notes, mobile application development is moving at a brisk clip, driven by consumer and governmental demands for better quality care, lower healthcare costs and improved access to information.

The problem is, it’s hard to create mobile products — especially a mobile PHR — when the various sectors of the healthcare industry don’t share data effectively. According to Frost & Sullivan, it will be necessary to connect up providers, hospitals, physician specialty groups, imaging centers, laboratories, payers and government entities, each of which has operated within its own informational silos and deployed its own unique infrastructure.

The healthcare industry will also need to resolve still-undecided questions as to who owns patient information, Frost & Sullivan suggests. As things stand, “the patient does not own his or her health information, as this data is stored within the IT protocols of the EHR system, proprietary to providers, hospitals and health systems,” said Frost & Sullivan Connected Health Senior Industry Analyst Patrick Riley in a press statement.

While patient ownership of medical data sounds like a problem worth addressing, the industry hasn’t shown the will to address it. To date, efforts to address the issue of who owns digital files have been met with a “tepid” response, the release notes.

However, it’s clear that outside vendors can solve the problem if they see a need. For example, consider the recent deal in which Allscripts agreed to supply clinical data to health plans.  Allscripts plans to funnel data from participating users of its ambulatory EMR to vendor Inovalon, which aggregates claims, lab, pharmacy, durable medical equipment, functional status and patient demographics for payers. Providers are getting patient-level analyses of the data in return for their participation.

Deals like this one suggest that rather than wait for interoperability, bringing together the data for a robust mobile PHR should be done by a third party. Which party, what it will cost to work with them, and how the data collection would work are just some of the big problems that would have to be solved — but it might be that or nothing for the foreseeable future.

Great Response to Blumenthal Interview

Posted on February 1, 2011 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The other day I came across an interview with David Blumenthal. I didn’t find anything all that meaningful in the interview itself. However, in the comments, someone provided some really interesting commentary on what Blumenthal said in the interview.

Dr. Blumenthal says we need operability before we move to interoperability. Yet if you don’t design your systems from the start to interoperate, you’ll inevitably wind up with operable systems that do not interoperate – at all. Having accomplished this, we’ll then have to develop and impose an after-the-fact standard to which all systems must comply. This will mean redesign, retrofit, and plastering all kinds of middleware layers between disparate systems. It may even result in retraining tomorrow all those providers you hope will learn new ways of working today.

Dr. Blumenthal also says that new and better technology is coming out every day. Yet the current incentive and certification programs heavily favor the older technology which he himself says frightens many providers away from this migration. Many of the older vendors have huge installed bases and old technology. No doubt they influence advisory boards to lean toward what is rather than what might be, all assurances to the contrary.

The cost of fixing practically anything is much higher than doing it correctly the first time. I realize you can’t design perfection, and anything we build will need adaptation and improvement. But we’re following a path that ensures that we will have to do much more fixing than we would if we’d just stop and think a bit more.

The inevitability of this evolution is not a justification for doing it carelessly.

Talk about bringing up some valid issues. The second one really hits home for me: the incentive money favors older technology. I’m afraid this is very much the case, and that 5 years from now the major topic we’re covering on EMR and HIPAA will be switching EMRs.