
Direct, Sequoia Interoperability Projects Continue To Grow

Posted on May 15, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

While its fate may still be uncertain – as with any interoperability approach in this day and age – the Direct exchange network does, at least, seem to be growing. At the same time, the Sequoia Project’s interoperability efforts, including the Carequality Interoperability Framework and its eHealth Exchange network, are also expanding rapidly.

According to a new announcement from DirectTrust, the number of healthcare organizations served by Direct health information service providers (HISPs) grew 63 percent during the first quarter of 2017, to almost 95,000, compared with the same period in 2016. To put this growth in perspective, there were just 5,627 such organizations involved in Q1 of 2014.

Meanwhile, the number of trusted Direct addresses which could share PHI grew 21 percent, to 1.4 million, as compared with the same quarter of 2016. Again, for perspective, consider that there were only 182,279 such addresses available three years ago.

In addition, DirectTrust noted, there were 35.6 million Direct exchange transactions during the quarter, up 76 percent over the same period last year. The organization expects transaction levels to hit 140 million by the end of this year.

Also, six organizations joined DirectTrust during the first quarter of 2017, including Sutter Health, the Health Record Banking Alliance, Timmaron Group, Moxe Health, Uticorp and Anne Arundel Medical Center. This brings the total number of members to 124.

Of course, DirectTrust isn’t the only interoperability group throwing numbers around. In fact, Sequoia recently issued a statement touting its growth numbers as well (on the same day as the Direct announcement, natch).

On that day, the Project announced that the Carequality Interoperability Framework had been implemented by more than 19,000 clinics, 800 hospitals and 250,000 providers.

It also noted that the eHealth Exchange, its healthcare data sharing network, had grown 35 percent over the past year, connecting participants in 65 percent of all US hospitals, along with 46 regional and state HIEs, 50,000 medical groups, more than 3,400 dialysis centers and 8,300 pharmacies. This links together more than 109 million patients, Sequoia reported.

So what does all of this mean? At the moment, it’s still hard to tell:

  • While Direct and Sequoia are expanding pretty quickly, there are few phenomena to which we can compare their growth.
  • Carequality and CommonWell agreed late last year to share data across each other’s networks, so comparing their transaction levels to other entities would probably be misleading.
  • Though the groups’ lists of participating providers may be accurate, many of those providers could be participating in other efforts and therefore be counted multiple times.
  • We still aren’t sure what metrics really matter when it comes to measuring interoperability success. Is it the number of transactions initiated by a provider? The number of data flows received? The number of docs and facilities who do both and/or incorporate the data into their EMR?

As I see it, the real work going forward will be for industry leaders to decide what kinds of performance stats actually equate to interoperability success. Otherwise, we may not just be missing health data sharing bullseyes, we may be firing at different targets.

The Case For Accidental Interoperability

Posted on December 22, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Many of us who follow #HITsm on Twitter have encountered the estimable standards guru Keith Boone (known there as @motorcycle_guy). Keith always has something interesting to share, and his recent article, on “accidental” interoperability, is no exception.

In his article, he describes an aha moment: “I had a recent experience where I saw a feature one team was demonstrating in which I could actually integrate one of my interop components to supply additional functionality,” he writes. “When you build interop components right, this kind of accidental interop shows up all the time.”

In his piece, he goes on to argue that this should happen a lot more often, because by doing so, “you can create a lot of value through it with very little engineering investment.”

In an ideal world, such unplanned instances of interoperability would happen often, allowing developers and engineers to connect solutions with far less trouble and effort. And the more often that happened, the more resources everyone involved would have to invest in solving other types of problems.

But in his experience, it can be tough to get dev teams into the “component-based” mindset that would allow for accidental interoperability. “All too often I’ve been told those more generalized solutions are ‘scope expansions,’ because they don’t fit the use case,” and any talk of future concerns is dropped, he says.

While focusing on a particular use case can save time, as it allows developers to take shortcuts which optimize their work for that use case, this approach also limits the value of their work, he argues. Unfortunately, this intense focus prevents developers from creating more general solutions that might have broader use.

Instead of focusing solely on their short-term goals, he suggests, health IT leaders may want to look at the bigger picture. “My own experience tells me that the value I get out of more general solutions is well worth the additional engineering attention,” he writes. “It may not help THIS use case, but when I can supply the same solution to the next use case that comes along, then I’ve got a clear win.”
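To make the distinction concrete, here is a minimal sketch, in Python, of the component-based mindset Keith describes. The names and interfaces are my own illustration (his article contains no code); the point is simply that logic written once against a general interface picks up new data sources “accidentally,” with no extra engineering.

    # A hypothetical sketch of component-based interop, not code from Keith's article.
    from abc import ABC, abstractmethod

    class DocumentSource(ABC):
        """A general interop component: anything that can supply clinical documents."""
        @abstractmethod
        def fetch_documents(self, patient_id: str) -> list:
            ...

    class DirectInboxSource(DocumentSource):
        """The original use case: pulling CCDs from a Direct inbox."""
        def fetch_documents(self, patient_id: str) -> list:
            return []  # ...query the Direct HISP inbox here...

    class HieQuerySource(DocumentSource):
        """A later use case: querying an HIE. Nothing upstream has to change."""
        def fetch_documents(self, patient_id: str) -> list:
            return []  # ...issue a document query to the exchange here...

    def gather_records(patient_id: str, sources: list) -> list:
        """Written once against the general interface; new sources plug in for free."""
        records = []
        for source in sources:
            records.extend(source.fetch_documents(patient_id))
        return records

Because gather_records depends only on the DocumentSource abstraction, the HIE source plugs in “for free” – exactly the kind of unplanned win Keith describes.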

Keith’s article points to an obstacle to interoperability that we don’t think much about right now. While most of what I read about interoperability — including on this blog — focuses on creating overarching standards that can tie all providers together, we seldom discuss the smaller, day-to-day engineering decisions that stand in the way of health data sharing.

If he’s right (and I have little doubt that he is), health IT interoperability will become a lot more feasible, a lot more quickly, if health organizations look at the bigger purposes an individual development project can serve. Otherwise, the next project may just be another silo in the making.

Can Interoperability Drive Value-Based Care?

Posted on December 14, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

As the drive to interoperability has evolved over the last few decades — and those of you who are HIT veterans know that these concerns go at least that far back — open data sharing has gone from being a “nice to have” to a presumed necessity for providing appropriate care.

And along the way, backers of interoperability efforts have expanded their goals. While the need to support coordinated care has always been a basis for the discussion, today the assumption is that value-based care simply isn’t possible without data interoperability between providers.

I don’t disagree with the premise. However, I believe that many providers, health systems and ACOs have significant work to do before they can truly benefit from interoperability. In fact, we may be putting the cart before the horse in this case.

A fragmented system

At present, our health system is straining to meet the demand for care coordination among the populations it serves. That may be in part because the level of chronic illness in the US is particularly high. According to one Health Affairs study, two out of three Americans will have a chronic condition by the year 2030. Add that to the need to care for patients with episodic care needs and the problem becomes staggering.

While some health organizations, particularly integrated systems like the Cleveland Clinic and staff-model managed care plans like Kaiser Permanente, plan for and execute well on care coordination, most others have too many siloes in place to do the job correctly. Though many health systems have installed enterprise EMRs like Epic and Cerner, and share data effectively while the patient remains within their system, they may do very little to integrate information from community providers, pharmacies, laboratories or diagnostic imaging centers.

I have no doubt that when needed, individual providers collect records from these community organizations. But collecting records on the fly is no substitute for following patients in a comprehensive way.

New models required

Given this history, I’d argue that many health systems simply aren’t ready to take full advantage of freely shared health data today, much less under value-based care payment models of the future.

Before they can use interoperable data effectively, provider organizations will need to integrate outside data into their workflow. They’ll need to put procedures in place governing how care coordination works in their environment. This will include not only deciding who integrates outside data and how, but also how the organization will respond as a whole.

For example, hospitals and clinics will need to figure out who handles care coordination tasks, how many resources to pour into this effort, how this care coordination effort fits into the larger population health strategy and how to measure whether they are succeeding or failing in their care coordination efforts. None of these are trivial tasks, and the questions they raise won’t be answered overnight.

In other words, even if we achieved full interoperability across our health system tomorrow, providers wouldn’t necessarily be able to leverage it right away, and unfettered health data sharing won’t automatically help them win at value-based care. In fact, I’d argue that it’s dangerous to act as though interoperability can magically make this happen. Even if full interoperability is necessary, it’s not sufficient. (And of course, even getting there seems like a quixotic goal to some, including myself.)

Planning ahead

That being said, health organizations probably do have time to get their act together on this front. The move to value-based care is happening quickly, but not at light speed, so there is still room to plan for leveraging interoperable health data.

But unless they acknowledge the weaknesses of their current system, which in many cases is myopic, siloed and rigid, interoperability may do little to advance their long-term goals. They’ll have to admit that their current systems are far too inward-looking, and that the problem will only go away if they take responsibility for fixing it.

Otherwise, even full interoperability may do little to advance value-based care. After all, all the data in the world won’t change anything on its own.

Are We Waiting For An Interoperability Miracle?

Posted on December 12, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Today, in reading over some industry news, my eyes settled on an advertising headline that gave me pause: “Is Middleware The Next Interoperability Miracle?” Now, I have to admit a couple of things: 1) vendors have to pitch the latest white paper with all the fervor they can command, and 2) it never hurts to provoke conversation with a strong assertion. But seeing a professional advertisement include the word “miracle” — the kind of breathless term you might use to sell dishwashers — still took me aback a bit.

And then I began to think about what I had seen. I wondered whether it will really take a miracle to achieve health data interoperability sometime in our lifetime. I asked myself whether health IT insiders like you, dear readers, are actually that discouraged. And I wondered if any vendor truly believes that they can produce such a miracle, if indeed one is needed.

First, let’s ask ourselves about whether we need a Hail Mary pass or even a miracle to salvage industry hopes for data interoperability. I’m afraid that in my view, the answer is quite possibly yes. In saying this, I’m assuming that interoperability must arrive soon to meet our current needs, at least within the next several years.

Unfortunately, nothing I’ve seen suggests that we can realistically achieve robust interoperability within the next, say, 5 to 10 years. I know some readers may disagree with me, but as I see it the combination of technical and behavioral obstacles to interoperability is just too profound to be addressed in a timely manner.

Okay, then, on to whether the health IT rank and file are so burned out on interoperability efforts that they just want the problem taken off their hands. If they did, I would certainly sympathize, as the forces in play here are beyond the control of any individual IT staffer, consultant, hospital or health system. The forces holding back interoperability are interwoven with technical, financial, policy and operational issues which can’t be addressed without a high level of cooperation between competing entities — and perhaps not even then.

So, back to where we started. Headline aside, does the vendor in question or any other truly believe that they can engineer a solution to such an intractable problem, conquer world interoperability issues and grow richer than Scrooge McDuck? Probably not. Interoperability is a set of behaviors as much as a technology, and I doubt even the cockiest startup thinks it can capture that many hearts and minds.

Ultimately, though, whoever wrote that headline is probably keying into something real. While the people doing the hard work of attempting health data sharing aren’t exactly desperate, I think there’s a growing sense that we’re running out of time to get this thing done. Obviously, other than artificial ones imposed by laws and regulations, we aren’t facing any actual deadline, but things can’t go on like this forever.

In fact, I’d argue that if we don’t create a useful interoperability model soon, a window of opportunity for doing so will be lost for quite some time. After all, we can’t keep spending on this infrastructure if it’s never going to offer a payback.

The cold reality is that eventually, the data sharing system we have — such as it is — will fall apart of its own weight, as organizations simply stop paying for their part of it. So while we might not need a miracle as such, being granted one wouldn’t hurt. If this effort fails us, who knows when we’ll have the time and money to try again.

ONC Offers Two Interoperability Measures

Posted on July 14, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

For a while now, it’s been unclear how federal regulators would measure whether the U.S. healthcare system was moving toward the “widespread interoperability” MACRA requires. But the wait is over, and after reviewing a bunch of comments, ONC has come through with some proposals that seem fairly reasonable at first glance.

According to a new blog entry from ONC, the agency has gotten almost 100 comments on how to address interoperability. These recommendations, the agency concluded, fell into four broad categories:

  • Don’t create any significant new reporting burdens for providers
  • Broaden the scope of interoperability measurements to include providers and individuals that are not eligible for Medicare and Medicaid EHR incentives
  • Create measures that examine usage and usefulness of exchanged information, as well as the impact on health outcomes, in addition to measuring the exchange itself
  • Recognize that given the complexity of measuring interoperability, it will take multiple data sources, and that more discussions will be necessary to create an effective model for such measurements

In response, ONC has come up with two core measures which address not only the comments, but also its own analysis and MACRA’s specific definitions of “widespread interoperability.”

  • Measure #1: Proportion of healthcare providers electronically engaging in the following core domains of interoperable exchange of health information: sending; receiving; finding (querying); and integrating information received from outside sources.
  • Measure #2: Proportion of healthcare providers who report using information electronically received from outside providers and sources for clinical decision-making.
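In arithmetic terms, each measure is simply a proportion over survey respondents. Below is a rough Python sketch of how Measure #1 might be tallied. The data format is invented for illustration, and whether “engaging” means all four domains or each domain separately is exactly the sort of question ONC will still have to pin down; this sketch assumes all four.

    # Hypothetical illustration of Measure #1; ONC specifies no data format.
    DOMAINS = ("sends", "receives", "finds", "integrates")

    respondents = [
        {"sends": True, "receives": True, "finds": True, "integrates": True},
        {"sends": True, "receives": True, "finds": False, "integrates": False},
        {"sends": False, "receives": True, "finds": False, "integrates": False},
    ]

    # Count providers engaging in all four core domains of exchange.
    engaged = sum(all(r[d] for d in DOMAINS) for r in respondents)
    print(f"Measure #1: {engaged / len(respondents):.0%}")  # -> 33%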

To measure these activities, ONC expects to be able to draw on existing national surveys of hospitals and office-based physicians. These include the American Hospital Association’s AHA Information Technology Supplement Survey and the CDC National Center for Health Statistics’ annual National Electronic Health Record Survey of office-based physicians.

ONC favors these data sources because they are not limited to Medicare and Medicaid EHR incentive program participants, and because both surveys have relatively high response rates.

I don’t know about you, but I was afraid things would be much worse. Measuring interoperability is quite difficult, given that just about everyone in the healthcare industry seems to have a slightly different take on what true interoperability actually is.

For example, there’s a fairly big gulf between those who feel interoperability only happens when all data flows from provider to provider, and those who feel that sharing a well-defined subset (such as that found in the Continuity of Care Document) would do the trick just fine. There is no way to address both of these models at the same time, much less the thousand shades of gray between the two extremes.

While its measures may not provide the final word on the subject, ONC has done a good job with the problem it was given, creating a model which is likely to be palatable to most of the parties involved. And that’s pretty unusual in the contentious world of health data interoperability. I hope the rollout goes equally well.

The Downside of Interoperability

Posted on May 2, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

It’s hard to argue that achieving health data interoperability is not important — but it comes with risks. And I’ve seen little discussion of the fact that interoperability may actually increase the chance that a major attack could hit a wide swath of healthcare providers. It might be extreme to suggest that we put off such efforts until we step up the industry’s security status, but the problem shouldn’t be ignored either.

Sure, data interoperability is a critical goal for healthcare providers of all stripes. While there’s room to argue about how it should be accomplished, particularly over whether providers or patients should drive health data management, there’s no question it needs to get done. There’s little doubt that most efforts to coordinate care will fall flat if providers are operating with incomplete information.

And what’s more, with the demand for interoperability baked into MACRA, we pretty much have no choice but to make it happen anyway. To my knowledge, HHS has proposed neither carrot nor stick to convince providers to come on board – nor has it defined “widespread” interoperability – but the agency has to achieve something by 2018, and that means change will come.

That being said, I’m struck by how little industry concern there seems to be about the extent to which interoperability can multiply the possibility of a breach occurring. Unfortunately, security is only as good as the weakest link in the chain, and every new data sharing connection adds links to that chain. Of course, the risk varies a great deal depending on who or what the data-sharing intermediary is, but the fact remains that a connected network is a connected network.
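A back-of-the-envelope model makes the point. Suppose, purely for illustration, that each participant in a sharing network has an independent 1 percent chance of being breached in a given year. The chance that at least one node is compromised is then 1 - (1 - p)^n, which climbs quickly as the network grows:

    # Toy model only: assumes independent, identical breach risk per participant.
    p = 0.01  # assumed annual breach probability for one organization

    for n in (10, 100, 1000):
        at_least_one = 1 - (1 - p) ** n
        print(f"{n:>4} participants: {at_least_one:.1%} chance of at least one breach")

    #   10 participants: 9.6% chance of at least one breach
    #  100 participants: 63.4% chance of at least one breach
    # 1000 participants: 100.0% chance of at least one breach

Real-world risks are neither independent nor identical across organizations, but the direction of the effect is what matters: every added participant erodes the network’s collective odds.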

The problem only gets worse if interoperability is achieved by integrating applications. I’m no software engineer, but I’m pretty sure that the more integrated providers’ infrastructure is, the more vulnerabilities they share. To be fair, hospitals can vet the partners they integrate with, but limiting connections to vetted partners defeats the purpose of universal data sharing, doesn’t it?

And even if every provider in a universal data sharing network practices good security hygiene, they can still get attacked. So this isn’t something that can be solved simply by requiring participants to comply with some network security standard or meet certification criteria. Given the massive incentives attackers have to steal health data (and lock it up with ransomware), nobody can hold out forever.

The bottom line is that I believe we should discuss the matter of security in a fully-connected health data sharing network more often.

Yes, we almost certainly need to press ahead and simply find a way to contain the risks. We simply can’t afford our fragmented healthcare system, and data interoperability offers perhaps the best possible chance of pulling it back together.

But before we plunge into the fray, it only makes sense to stop and consider all of the risks involved and how they should be addressed. After all, universal interconnection exposes a virtually infinite number of potential points of failure to cybercrooks. Let’s put some solutions on the table before it’s too late.