
Consumers Fear Theft Of Personal Health Information

Posted on February 15, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Probably fueled by constant news about breaches – duh! – consumers continue to worry that their personal health information isn’t safe, according to a new survey.

As the press release for the 2017 Xerox eHealth Survey notes, last year more than one data breach was reported each day. So it’s little wonder that the survey – which was conducted online by Harris Poll in January 2017 among more than 3,000 U.S. adults – found that 44% of Americans are worried about having their PHI stolen.

According to the survey, 76% of respondents believe that it’s more secure to share PHI between providers through a secure electronic channel than to fax paper documents. This belief is certainly a plus for providers. After all, they’re already committed to sharing information as effectively as possible, and it doesn’t hurt to have consumers behind them.

Another positive finding from the study is that Americans also believe better information sharing across providers can help improve patient care. Xerox/Harris found that 87% of respondents believe that wait times to get test results and diagnoses would drop if providers securely shared and accessed patient information from varied providers. Not only that, 87% of consumers also said that they felt that quality of service would improve if information sharing and coordination among different providers was more common.

Looked at one way, these stats offer providers an opportunity. If you’re already spending tens or hundreds of millions of dollars on interoperability, it doesn’t hurt to let consumers know that you’re doing it. For example, hospitals and medical practices can put signs in their lobbies spelling out what they’re doing by way of sharing data and coordinating care, have their doctors discuss what information they’re sharing, and hand out sheets telling consumers how they can leverage interoperable data. (Some organizations have already taken some of these steps, but I’d argue that virtually any of them could do more.)

On the other hand, if nearly half of consumers are afraid that their PHI is insecure, providers have to do more to reassure them. Though few would understand how your security program works, letting them know how seriously you take the matter is a step forward. Also, it’s good to educate them on what they can do to keep their health information secure, as people tend to be less fearful when they focus on what they can control.

That being said, the truth is that healthcare data security is a mixed bag. According to a study conducted last year by HIMSS, while most organizations conduct IT security risk assessments, many IT execs have only occasional interactions with top-level leaders. Also, many are still planning out their medical device security strategy. Worse, provider security spending is often minimal. HIMSS notes that few organizations spend more than 6% of their IT budgets on data security, and 72% have five or fewer employees allocated to security.

Ultimately, it’s great to see that consumers are getting behind the idea of health data interoperability, and see how it will benefit them. But until health organizations do more to protect PHI, they’re at risk of losing that support overnight.

ONC Takes Another Futile Whack At Interoperability

Posted on January 2, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

With the New Year on its way, ONC has issued its latest missive on how to move the healthcare industry towards interoperability. Its Interoperability Standards Advisory for 2017, an update from last year’s version, offers a collection of standards and implementation specs the agency has identified as important to health data sharing.

I want to say at the outset that this seems a bit, well, strange to me. It really does seem like a waste of time to create a book of curated standards when the industry’s interoperability take changes every five minutes. In fact, it seems like an exercise in futility.

But I digress. Let’s talk about this.

About the ISA

The Advisory includes four technical sections, covering a) vocabulary/code sets/terminology, b) content/structure standards and implementation specs, c) standards and implementation specs for services, and d) models and profiles, plus a fifth section listing ONC’s questions and requesting feedback. This year’s version takes into account the detailed feedback ONC received on last year’s edition.

According to ONC leader Vindell Washington, releasing the ISA is an important step toward achieving the goals the agency has set out in the Shared Nationwide Interoperability Roadmap, as well as the Interoperability Pledge announced earlier this year. There’s little doubt, at minimum, that it represents the consensus thinking of some very smart and thoughtful people.

In theory, ONC would appear to be steaming ahead toward meeting its interoperability goals. And one can hardly disagree that its overarching goal set forth in the Roadmap, creating a “learning health system” by 2024, sounds attractive and perhaps doable.

Not only that, at first glance it might seem that providers are getting on board. As ONC notes, companies which provide 90% of EHRs used by hospitals nationwide, as well as the top five healthcare systems in the country, have agreed to the Pledge. Its three core requirements are that participants make it easy for consumers to access their health information, refrain from interfering with health data sharing, and implement federally recognized national interoperability standards.

Misplaced confidence

But if you look at the situation more closely, ONC’s confidence seems a bit misplaced. While there’s much more to its efforts, let’s consider the Pledge as an example of how slippery the road ahead is.

So let’s look at element one, consumer access to data. While agreeing to give patients access is a nice sentiment, to me it seems inevitable that there will be as many forms of data access as there are providers. Sure, ONC or other agencies could attempt to regulate this, but it’s like trying to nail down jello given the circumstances. And what’s more, as soon as we define what adequate consumer access is, some new technology, care model or consumer desire will change everything overnight.

What about information blocking? Will those who took the Pledge be able to avoid interfering with data flows? I’d guess that if nothing else, they won’t be able to support the kind of transparency and sharing ONC would like to see. And then when you throw in those who just don’t think full interoperability is in their interests – but want to seem as though they play well with others – you’ve pretty much got a handful o’ nothing.

And consider the third point of the Pledge, which asks providers to implement “federally recognized” standards. OK, maybe the ISA’s curated specs meet this standard, but as the Advisory is considered “non-binding” perhaps they don’t. OK, so what if there were a set of agreed-upon federal standards? Would the feds be able to keep up with changes in the marketplace (and technology) that would quickly make their chosen models obsolete? I doubt it. So we have another swing and a miss.

Given how easy the Pledge is to challenge, how much weight can we assign to efforts like the ISA or even ONC’s long-term interoperability roadmap? I’d argue that the answer is “not much.” The truth is that at least in its current form, there’s little chance the ONC can do much to foster a long-term, structural change in how health organizations share data. It’d be nice to think that, but thinking doesn’t make it so.

The Case For Accidental Interoperability

Posted on December 22, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Many of us who follow #HITsm on Twitter have encountered the estimable standards guru Keith Boone (known there as @motorcycle_guy). Keith always has something interesting to share, and his recent article, on “accidental” interoperability, is no exception.

In his article, he describes an aha moment: “I had a recent experience where I saw a feature one team was demonstrating in which I could actually integrate one of my interop components to supply additional functionality,” he writes. “When you build interop components right, this kind of accidental interop shows up all the time.”

In his piece, he goes on to argue that this should happen a lot more often, because by doing so, “you can create [a] lot of value through it with very little engineering investment.”

In an ideal world, such unplanned instances of interoperability would happen often, allowing developers and engineers to connect solutions with far less trouble and effort. And the more often that happened, the more resources everyone involved would have to invest in solving other types of problems.

But in his experience, it can be tough to get dev teams into the “component-based” mindset that would allow for accidental interoperability. “All too often I’ve been told those more generalized solutions are ‘scope expansions,’ because they don’t fit the use case,” and any talk of future concerns is dropped, he says.

While focusing on a particular use case can save time, as it allows developers to take shortcuts which optimize their work for that use case, this approach also limits the value of their work, he argues. Unfortunately, this intense focus prevents developers from creating more general solutions that might have broader use.

Instead of focusing solely on their short-term goals, he suggests, health IT leaders may want to look at the bigger picture. “My own experience tells me that the value I get out of more general solutions is well worth the additional engineering attention,” he writes. “It may not help THIS use case, but when I can supply the same solution to the next use case that comes along, then I’ve got a clear win.”

Keith’s article points up an obstacle to interoperability that we don’t think much about right now. While most of what I read about interoperability options — including on this blog — focuses on creating overarching standards that can tie all providers together, we seldom discuss the smaller, day-to-day decisions that stand in the way of health data sharing.

If he’s right (and I have little doubt that he is) health IT interoperability will become a lot more feasible, a lot more quickly, if health organizations take a look at the bigger purposes an individual development project can meet. Otherwise, the next project may just be another silo in the making.

Don’t Yell FHIR in a Hospital … Yet

Posted on November 30, 2016 | Written By

The following is a guest blog post by Richard Bagdonas, CTO and Chief Healthcare Architect at MI7.
The Fast Healthcare Interoperability Resources standard, commonly referred to as FHIR (pronounced “fire”), has a lot of people in the healthcare industry hopeful for interoperability between electronic health record (EHR) systems and external systems — enabling greater information sharing.

As we move into value-based healthcare and away from fee-for-service healthcare, one thing becomes clear: care is no longer siloed to one doctor and most certainly not to one facility. Think of the numerous locations a patient must visit when getting a knee replaced. They start at their general practitioner’s office, then go to the orthopedic surgeon, followed by the radiology center, then to the hospital, often back to the ortho’s office, and finally to one or more physical therapists.

Currently the doctor’s incentives are not aligned with the patient’s. If the surgery needs to be repeated, the insurance company and patient pay for it again. In the future, the doctor will be judged, and rewarded or penalized, based on their performance in what is called the patient’s “episode of care.” All of this coordination between providers requires that the parties involved become intimately aware of everything happening at each step in the process.

This all took off back in 2011 when Medicare began an EHR incentive program providing $27B in incentives to doctors at the 5,700 hospitals and 235,000 medical practices to adopt EHR systems. Hospitals would receive $2M and doctors would receive $63,750 when they put in the EHR system and performed some basic functions proving they were using it under what has been termed “Meaningful Use” or MU.

EHR manufacturers made a lot of money selling systems leveraging the MU incentives. The problem most hospitals ran into is that their EHR didn’t come with integrations to external systems. Integration is typically done using a 30-year-old standard called Health Level 7, or HL7. The EHR can talk to outside systems using HL7, but only if the interface is turned on and both systems use the same version. EHR vendors typically charge thousands of dollars, and sometimes tens of thousands, to turn on each interface. This is why interface engines have been all the rage: they turn one interface into multiple.

The great part of HL7 is that it is a standard. The bad parts of HL7 are that a) there are 11 standards, b) not all vendors use all standards, c) most EHRs are still using version 2.3, which was released in 1997, and d) each EHR vendor messes up the HL7 standard in its own unique way, causing untold headaches for integration project managers across the country. The joke in the industry is that if you’ve seen one EHR integration, you’ve seen “just one.”
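
To make that concrete, here is a rough sketch of what an HL7 v2.3 message looks like on the wire, and a naive way to pull fields out of it in Python. The message content is invented for illustration; real feeds differ by vendor in exactly the ways described above.

# A minimal sketch of parsing an invented HL7 v2.3 ADT message.
raw = (
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|20160930120000||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19700101|F\r"
)

# HL7 v2 is one segment per line (carriage returns), with pipe-delimited
# fields and ^ separating components within a field.
for segment in raw.strip("\r").split("\r"):
    fields = segment.split("|")
    if fields[0] == "PID":
        mrn = fields[3].split("^")[0]        # first component of PID-3
        name = fields[5].replace("^", ", ")  # PID-5 is family^given
        print(mrn, name)                     # -> 123456 DOE, JANE

Every vendor quirk, such as a component in the wrong slot or a nonstandard segment, breaks naive parsers like this one, which is why no two integration projects look alike.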

[Figure: HL7 versions over the years]

HL7 version 3.0, which was released in 2005, was supposed to clear up a lot of this integration mess. It used the Extensible Markup Language (XML) to make it easier for software developers to parse healthcare messages from the EHR, and it had places to stick just about all of the data a modern healthcare system needs for care coordination. Unfortunately, HL7 3.0 didn’t take off, and many EHRs never built support for it.

FHIR is the new instantiation of HL7 3.0, using JavaScript Object Notation (JSON), and optionally XML, to do similar things using more modern technology concepts such as Representational State Transfer (REST), with HTTP requests to GET, PUT, POST, and DELETE these resources. Developers love JSON.
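
For a sense of how much lighter this is than pipe-delimited HL7, here is a minimal sketch of a FHIR read over REST in Python. The server URL and patient ID are hypothetical, and the media type shown is the DSTU2-era one; this is an illustration, not any particular vendor’s endpoint.

# A minimal sketch of reading one FHIR resource over REST.
import requests

resp = requests.get(
    "https://fhir.example.com/Patient/123",       # hypothetical server
    headers={"Accept": "application/json+fhir"},  # DSTU2-era media type
)
resp.raise_for_status()
patient = resp.json()              # plain JSON instead of HL7 pipes

print(patient["resourceType"])     # -> "Patient"
print(patient.get("name"))         # list of name elements, if present

The same resource type can be created with POST or updated with PUT, which is exactly the REST style developers already know from other industries.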

FHIR is not ready for prime time, and based on how HL7 versions have been rolled out over the years, it will not be used in a very large percentage of medical facilities for several years. The problem the FHIR standard creates for vendors is a method by which a medical facility could port EHR data from one manufacturer to another. EHR manufacturers don’t want to let this happen, so it is doubtful they will completely implement FHIR — especially since it is not a requirement of MU.

And FHIR is still not hardened. There have been fifteen versions of FHIR released over the last two years, six of them incompatible with earlier versions. We are a year away at best from the standard going from draft to release, so plan on there being even more changes.

[Figure: 15 versions of FHIR since 2014, with 6 incompatible with earlier versions]

Another reason for questioning FHIR’s impact is that the standard has several ways to transmit and receive data besides HTTP requests. One EHR may use sockets, another file-folder delivery, and another HTTP requests. This means the need for integration engines still exists, and as such the value of moving to FHIR may be reduced.
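
As a rough illustration of why the engines persist, here is a sketch of the abstraction an integration engine provides: accept a message once, then deliver it over whatever transport each receiving system happens to speak. The class and function names are hypothetical, not any real engine’s API.

# A minimal sketch of an integration engine's transport abstraction.
from abc import ABC, abstractmethod

class Transport(ABC):
    @abstractmethod
    def deliver(self, message: str) -> None: ...

class SocketTransport(Transport):
    def deliver(self, message: str) -> None:
        print("send over socket:", message[:12], "...")

class FileDropTransport(Transport):
    def deliver(self, message: str) -> None:
        print("write to watched folder:", message[:12], "...")

class HttpTransport(Transport):
    def deliver(self, message: str) -> None:
        print("POST to endpoint:", message[:12], "...")

def route(message: str, transports: list[Transport]) -> None:
    # The engine's core job: accept once, deliver everywhere.
    for t in transports:
        t.deliver(message)

route("MSH|^~\\&|...", [SocketTransport(), FileDropTransport(), HttpTransport()])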

Lastly, the implementation of FHIR’s queryable interface means hospitals will have to decide whether to host all of their data in a cloud-based system for outside entities to use, or to become massive data centers themselves, running the numerous servers it will take to keep patients with mobile devices from taking down the EHR when physicians need it for mission-critical use.

While the data geek inside me loves the idea of FHIR, my decades of experience performing healthcare integrations with EHRs tell me there is more smoke than there is FHIR right now.

My best advice when it comes to FHIR is to keep using the technologies you have today and if you are not retired by the time FHIR hits its adoption curve, look at it with fresh eyes at that time. I will be eagerly awaiting its arrival, someday.

About Richard Bagdonas
Richard Bagdonas has over 12 years of experience integrating software with more than 40 electronic health record system brands. He is an expert witness on HL7 and EDI-based medical billing. Richard served as a technical consultant to the US Air Force and Pentagon in the mid-1990s and authored four books on telecom/data network design and engineering. Richard is currently the CTO and Chief Healthcare Architect at MI7, a healthcare integration software company based in Austin, TX.

What Would A Community Care Plan Look Like?

Posted on November 16, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Recently, I wrote an article about the benefits of a longitudinal patient record and community care plan to patient care. I picked up the idea from a piece by an Orion Health exec touting the benefits of these models. Interestingly, I couldn’t find a specific definition for a community care plan in the article — nor could I dig anything up after doing a Google search — but I think the idea is worth exploring nonetheless.

Presumably, if we had a community care plan in place for each patient, it would have interlocking patient-specific and population health-level elements to it. (To my knowledge, current population health models don’t do this.) Rather than simply handing patients off from one provider to another, in the hope that the rare patient-centered medical home could manage their care effectively on its own, it might set care goals for each patient as part of the larger community strategy.

With such a community care strategy, groups of providers would have a better idea of where to allocate resources. It would simultaneously meet the goals of traditional medical referral patterns, in which clinicians consult with one another on strategy, and help them decide whom to hire (such as a nurse practitioner to serve patient clusters with higher levels of need).

As I envision it, a community care plan would raise the stakes for everyone involved in the care process. Right now, for example, if a primary care doctor refers a patient to a podiatrist, on a practical level the issue of whether the patient can walk pain-free is not the PCP’s problem. But in a community-based care plan, which would help hold all of the individual actors accountable, that podiatrist couldn’t just examine the patient, do whatever they did and punt. They might even be held to quantitative goals, if those were appropriate to the situation.

I also envision a community care plan as involving a higher level of direct collaboration between providers. Sure, providers and specialists coordinate care across the community, minimally, but they rarely talk to each other, and unless they work for the same practice or health system they virtually never collaborate beyond sharing care documentation. And to be fair, why should they? As the system exists today, they have little practical or even clinical incentive to get in the weeds with complex individual patients and look at their future. But if they had the right kind of community care plan in place for the population, this would become more necessary.

Of course, I’ve left the trickiest part of this for last. The system I’ve outlined, basically a slight twist on existing population health models, won’t work unless we develop new methods for sharing data collaboratively — and for reasons I’d be glad to go into elsewhere, I’m not bullish about anything I’ve seen. But as our understanding of what we need to get done evolves, perhaps the technology will follow. A girl can hope.

A Look At Nursing Home Readiness For HIE Participation

Posted on October 12, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

A newly released study suggests that nursing homes have several steps to go through before they are ready to participate in health information exchanges. The study, which appeared in the AHIMA-backed Perspectives in Health Information Management, was designed to help researchers understand the challenges nursing homes faced in health information sharing, as well as what successes they had achieved to date.

As the study write-up notes, the U.S. nursing home population is large — nearly 1.7 million residents spread across 15,600 nursing homes as of 2014. But unlike other settings that care for a high volume of patients, nursing homes haven’t been eligible for EMR incentive programs that might have helped them participate in HIEs.

Not surprisingly, this has left the homes at something of a disadvantage, with very few participating in networked health data sharing. And this is a problem in caring for residents adequately, as their care is complex, involving nurses, physicians, physicians’ offices, pharmacists and diagnostic testing services. So understanding what potential these homes have to connect is a worthwhile topic of study. That’s particularly the case given that little is known about HIE implementation and the value of shared patient records across multiple community-based settings, the study notes.

To conduct the study, researchers conducted interviews with 15 nursing home leaders representing 14 homes in the midwestern United States that participated in the Missouri Quality Improvement Initiative (MOQI) national demonstration project.  Studying MOQI participants helped researchers to achieve their goals, as one of the key technology goals of the CMS-backed project is to develop knowledge of HIE implementations across nursing homes and hospital boundaries and determine the value of such systems to users.

The researchers concluded that incorporating HIE technology into existing work processes would boost use and overall adoption. They also found that participation inside and outside of the facility, providing employees with appropriate training and retraining, and getting others to use the HIE would have a positive effect on health data sharing projects. Meanwhile, getting the HIE operational and putting policies for technology use in place were the challenges on the table for these institutions.

Ultimately, the study concluded that nursing homes considering HIE adoption should look at three areas of concern before getting started.

  • One area was the home’s readiness to adopt technology. Without the right level of readiness to get started, any HIE project is likely to fail, and nursing home-based data exchanges are no exception. This would be particularly important for a project in a niche like this one, which never enjoyed the outside boost to technology culture that hospitals and doctors got under Meaningful Use.
  • Another area identified by researchers was the availability of technology resources. The researchers didn’t specify whether they meant access to technology itself or to the internal staff or consultants needed to execute the project, but both seem like important considerations in light of this study.
  • The final area researchers identified as critical for making a success of HIE adoption in nursing homes was the ability to match new clinical workflows to the work already getting done in the homes. This, of course, is important in any setting where leaders are considering major new technology initiatives.

Too often, discussions of health data sharing leave out major sectors of the healthcare economy like this one. It’s good to take a look at what full participation in health data sharing with nursing homes could mean for healthcare.

The Variables that Need Adjusting to Make Health Data Sharing a Reality

Posted on October 7, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

During today’s #HITsm chat, John Trader offered this fascinating quote from Susannah Fox, CTO at HHS:

I quickly replied with the following:

This concept is definitely worth exploring. There are a lot of things in life that we want. However, that doesn’t mean we want them enough to actually do them. I want to be skinny and muscular. I don’t want it enough to stop eating the way I do and start working out in a way that would help me lose weight and become a chiseled specimen of a man. The problem is that there are different levels of “want.”

This applies so aptly to data sharing in healthcare. Most of us want the data sharing to happen. I’ve gone so far as to say that I think most patients believe the data sharing is already happening; they probably don’t realize that it’s not. Most caregivers want the data shared as well. What doctor wants to see a patient with limited information? The more high-quality information a doctor has, the better they can do their job. So, yes, they want to share patient data so they can help others (i.e., their patients).

The problem is that most patients and caregivers don’t want it enough. They’re ok with data sharing. They think that data sharing is beneficial. They might even think that data sharing is the right thing to do. However, they don’t want it enough to make it a reality.

It’s worth acknowledging that there’s a second part of this equation: Difficulty. If something is really difficult to do, then your level of “want” needs to be extremely high to overcome those difficulties. If something is really easy to do, then your level of want can be much lower.

For the programmer geeks out there:

if difficulty > want:
    give_up()          # the want isn't strong enough; nothing changes
if want > difficulty:
    achieve_result()   # the want overcomes the difficulty

When we talk about healthcare data sharing, it’s really difficult to do and people’s “want” is generally low. There are a few exceptions. Chronically ill patients have a much bigger “want” to solve the problem of health data sharing. So, some of them overcome the difficulty and are able to share the data. Relatively healthy patients don’t have a big desire to get and share their health data, so they don’t do anything to overcome the challenge of getting and sharing that data.

If we want health data sharing, we have to change the variables. We can either make health data sharing easier (something many are working to accomplish) or we can provide (or create the perception of) more value to patients and caregivers so that they “want” it more. Until that happens, we’re unlikely to see things change.

CommonWell and Healthcare Interoperability

Posted on October 6, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Healthcare Scene sat down with Scott Stuewe, Director at Cerner Network and Daniel Cane, CEO & Co-Founder at Modernizing Medicine, where we talked about Cerner’s participation in CommonWell and Modernizing Medicine’s announcement to join CommonWell. This was a great opportunity to learn more about the progress CommonWell has made.

During our discussion, we talk about where CommonWell is today and where it’s heading in the future. Plus, we look at some of the use cases where CommonWell works today and where it hasn’t yet built out that capability. We also talk about how the CommonWell member companies are working together to make healthcare interoperability a reality, along with the array of interoperability solutions that will be needed beyond CommonWell. Finally, we look at where healthcare interoperability is headed.

In the “After Party” video we continued our conversation with Scott and Daniel where we talked about the governance structure for CommonWell and how it made decisions. We also talked about the various healthcare standards that are available and where we’re at in the development of those standards. Plus, we talk about the potential business model for EHR vendors involved in CommonWell. Scott and Daniel finish off by talking about what we really need to know about CommonWell and where it’s heading.

CommonWell is a big part of many large EHR vendors’ interoperability plans. Being familiar with what they’re doing is going to be important to understanding how healthcare data sharing will or won’t happen in the future.

Validic Survey Raises Hopes of Merging Big Data Into Clinical Trials

Posted on September 30, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Validic has been integrating medical device data with electronic health records, patient portals, remote patient monitoring platforms, wellness challenges, and other health databases for years. On Monday, they highlighted a particularly crucial and interesting segment of their clientele by releasing a short report based on a survey of clinical researchers. And this report, although it doesn’t go into depth about how pharmaceutical companies and other researchers are using devices, reveals great promise in their use. It also opens up discussions of whether researchers could achieve even more by sharing this data.

The survey broadly shows two trends toward the productive use of device data:

  • Devices can report changes in a subject’s condition more quickly and accurately than conventional subject reports (which involve marking observations down by hand or coming into the researcher’s office). Of course, this practice raises questions about the device’s own accuracy. Researchers will probably splurge on professional or “clinical-grade” devices that are more reliable than consumer health wearables.

  • Devices can keep the subject connected to the research for months or even years after the end of the clinical trial. This connection can turn up long-range side effects or other impacts from the treatment.

Together these advances address two of the most vexing problems of clinical trials: their cost (and length) and their tendency to miss subtle effects. The cost and length of trials form the backbone of the current publicity campaign by pharma companies to justify price hikes that have recently brought them public embarrassment and opprobrium. Regardless of the relationship between the cost of trials and the cost of the resulting drugs, everyone would benefit if trials could demonstrate results more quickly. Meanwhile, longitudinal research with massive amounts of data can reveal the kinds of problems that led to the Vioxx scandal–but also new off-label uses for established medications.

So I’m excited to hear that two-thirds of the respondents are using “digital health technologies” (which covers mobile apps, clinical-grade devices, and wearables) in their trials, and that nearly all respondents plan to do so over the next five years. Big data benefits are not the only ones they envision. Some of the benefits have more to do with communication and convenience–and these are certainly commendable as well. For instance, if a subject can transmit data from her home instead of having to come to the office for a test, the subject will be much more likely to participate and provide accurate data.

Another trend hinted at by the survey was a closer connection between researchers and patient communities. Validic announced the report in a press release that is quite informative in its own right.

So over the next few years we may enter the age that health IT reformers have envisioned for some time: a merger of big data and clinical trials in a way to reap the benefits of both. Now we must ask the researchers to multiply the value of the data by a whole new dimension by sharing it. This can be done in two ways: de-identifying results and uploading them to public or industry-maintained databases, or providing identifying information along with the data to organizations approved by the subject who is identified. Although researchers are legally permitted to share de-identified information without subjects’ consent (depending on the agreements they signed when they began the trials), I would urge patient consent for all releases.
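
As a rough sketch of what the first of those paths involves, here is a Safe Harbor-style de-identification pass in Python. The record layout and field names are hypothetical, and a real pipeline would need to cover all 18 HIPAA identifier categories, plus expert review, before any release.

# A minimal, hypothetical sketch of Safe Harbor-style de-identification.
DIRECT_IDENTIFIERS = {"name", "mrn", "ssn", "email", "phone", "address"}

def deidentify(record: dict) -> dict:
    # Drop direct identifiers outright.
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Generalize quasi-identifiers rather than discarding the science.
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"  # keep only the 3-digit ZIP prefix
    if "age" in out and out["age"] > 89:
        out["age"] = 90                     # ages over 89 get aggregated
    return out

sample = {"name": "Jane Doe", "zip": "78701", "age": 47, "hr_avg": 62}
print(deidentify(sample))  # -> {'zip': '78700', 'age': 47, 'hr_avg': 62}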

Pharma companies are already under intense pressure for hiding the results of trials–but even the new regulations cover only results, not the data that led to those results. Organizations such as Sage Bionetworks, which I have covered many times, are working closely with pharmaceutical companies and researchers to promote both the software tools and the organizational agreements that foster data sharing. Such efforts allow people in different research facilities and even on different continents to work on different aspects of a target and quickly share results. Even better, someone launching a new project can compare her data to a project run five years before by another company. Researchers will have millions of data points to work with instead of hundreds.

One disappointment in the Validic survey was that only a minority of respondents saw a return on investment in their use of devices. With responsible data sharing, the next Validic survey may raise this response rate considerably.

Please, No More HIE “Coming Of Age” Stories

Posted on September 29, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Today I read a Modern Healthcare story suggesting that health information exchanges are “coming of age,” and after reading it, I swear my eyes almost rolled back into my head. (An ordinary eye roll wouldn’t do.)

The story leads with the assertion that a new health data sharing deal, in which Texas Health Resources agreed to share data via a third-party HIE, suggests that such HIEs are becoming sustainable.

Author Joseph Conn writes that the 14-hospital system is coming together with 32 other providers sending data to Healthcare Access San Antonio, an entity which supports roughly 2,400 HIE users and handles almost 2.2 million patient records. He notes that the San Antonio exchange is one of about 150 nationwide, hardly a massive number for a country the size of the U.S.

In partial proof of his assertion that HIEs are finding their footing, he notes that from 2010 to 2015, the number of HIEs in the U.S. fluctuated but saw a net gain of 41%, according to federal stats. And he attributes this growth to pressure on providers to improve care, lower costs and strengthen medical research, or risk getting Medicare or Medicaid pay cuts.

I don’t dispute that there is increased pressure on providers to meet some tough goals. Nor am I arguing that many healthcare organizations believe that healthcare data sharing via an HIE can help them meet these goals.

But I would argue that even given the admittedly growing pressure from federal regulators to achieve certain results, history suggests that an HIE probably isn’t the way to get this done, as we don’t seem to have found a business model for them that works over the long term.

As Conn himself notes, seven recipients of federal, state-wide HIE grants issued by the ONC — awarded in Connecticut, Illinois, Montana, Nevada, New Hampshire, Puerto Rico and Wyoming — went out of business after the federal grants dried up. So we’re not just talking about HIEs’ ignoble history of sputtering out; we’re talking about fairly recent failures.

He also notes that a commercially-funded model, MetroChicago HIE, which connected more than 30 northeastern Illinois hospitals, went under earlier this year. This HIE failed because its most critical technology vendor suddenly went out of business with 2 million patient records in its hands.

As for HASA, the San Antonio exchange discussed above, it’s not just a traditional HIE. Conn’s piece notes that most of the hospitals in the Dallas-Fort Worth area have already implemented or plan to use an Epic EMR and share clinical messages using its information exchange capabilities. Depending on how robust the Epic data-sharing functions actually are, this might offer something of a solution.

But what seems apparent to me, after more than a decade of watching HIEs flounder, is that a data-sharing model relying on a third-party platform probably isn’t financially or competitively sustainable.

The truth is, a veteran editor like Mr. Conn (who apparently has 35 years of experience under his belt) must know that his reporting doesn’t sustain the assertion that HIEs are coming into some sort of golden era. A single deal undertaken by even a large player like Texas Health Resources doesn’t prove that HIEs are seeing a turnaround. It seems that some people think the broken clock that is the HIE model will be right at least once.

P.S. All of this being said, I admit that I’m intrigued by the notion of a “public utility” HIE. Are any of you associated with such a project?