
Designing for the Whole Patient Journey: Lumeon Enters the US Health Provider Market

Posted on April 23, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Lots of companies strive to unshackle health IT’s potential to make the health care industry more engaging, more adaptable, and more efficient. Lumeon intrigues me in this space because they have a holistic approach that seems to be producing good results in the UK and Europe–and recently they have entered the US market.

Superficially, the elements of the Lumeon platform echo advances made by many other health IT applications. Alerts and reminders? Check. Workflow automation? Check. Integration with a variety of EHRs? Of course! But there is something more to Lumeon’s approach to design that makes it a significant player. I had the opportunity to talk to Andrew Wyatt, Chief Operating Officer, to hear what he felt were Lumeon’s unique strengths.

Before discussing the platform itself, we have to understand Lumeon’s devotion to understanding the patient’s end-to-end experience, also sometimes known as the patient journey. Lumeon is not so idealistic as to ask providers to consider a patient’s needs from womb to tomb–although that would certainly help. But they ask such questions as: can the patient physically get to appointments? Can she navigate her apartment building’s stairs and her apartment after discharge from surgery? Can she get her medication?

Lumeon workflow view

Such questions are the beginning of good user experience (UX) design, and are critical to successful treatment. This is why I covered the HxRefactored conference in Boston in 2016 and 2017, where such questions were central.

It’s also intriguing that criminal justice reformers focus attention on the whole sequence of punishment and rehabilitation, including reentry into mainstream society.

Thinking about every step of the patient experience, before and after treatments as well as when she enters the office, is called a longitudinal view. Even in countries with national health care systems, less than half the institutions take such a view, and adoption of the view is growing only slowly.

Another trait of longitudinal thinking Wyatt looks for is coordinated care with strong involvement from the family. The main problem he ascribed to current health IT systems is that they serve the clinician. (I think many doctors would dispute this, saying that the systems serve only administrators and payers–not the clinician or the patient.)

Here are a couple success stories from Wyatt. After summarizing them, I’ll look at the platform that made them possible.

Alliance Medical, a major provider of MRI scans and other imaging services, used Lumeon to streamline the entire patient journey, from initial referral to delivery of the final image and report. For instance, during intake an online form asks whether the patient has metal in his body, which would rule out an MRI in favor of an alternative test. The next question then becomes what test would meet the current diagnostic needs and be reimbursed by the payer. Lumeon automates these logistical tasks. After the test, automation provided by the Lumeon platform can make sure that a clinician reviews the image within the required time and that the image gets to the people who need it.
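The intake logic described above is essentially a branching rule: the patient's answers determine which test is safe and appropriate. A minimal sketch of that kind of rule, with illustrative field names and routing (not Lumeon's actual logic), might look like this:

```python
# Hypothetical sketch of a branching intake rule: an answer on the online
# form determines whether an MRI is safe, and if not, which alternative
# test to propose. Names and the CT fallback are illustrative assumptions.

def route_imaging_request(answers: dict) -> str:
    """Return the suggested imaging test based on intake answers."""
    if answers.get("has_metal_implant"):
        # MRI is contraindicated; fall back to an alternative modality
        return "CT"
    return "MRI"

print(route_imaging_request({"has_metal_implant": True}))   # CT
print(route_imaging_request({"has_metal_implant": False}))  # MRI
```

The value of automating even a rule this simple is that it fires on every referral, rather than depending on staff remembering to ask.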

Another large provider in ophthalmology looked for a way to improve efficiency and outcomes in the common disease of glaucoma, by putting images of the eye in a cloud and providing a preliminary, automated diagnosis that the doctor would check. None of the existing cloud and telemedicine solutions covered ophthalmology, so the practice used the Lumeon platform to create one. The design process functioned as a discipline that let them put a robust process in place for moving patients through the service, leading to better outcomes. From the patient's point of view, the change was even more dramatic: they could come in to the office just once instead of four times to get their diagnosis.

An imaging provider found that they wasted 5 to 10 minutes each time they moved a machine between an upper body position and a lower body position. They saved many hours–and therefore millions of dollars–simply by scheduling all the upper body scans for one part of the day and all lower body scans for another. Lumeon made this planning possible.
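The savings in that story come from a simple observation: every switch between upper- and lower-body setups costs 5 to 10 minutes, so batching scans by position cuts the number of switches to at most one per day. A toy illustration, with an assumed 7.5-minute changeover cost:

```python
# Toy illustration of the changeover-saving idea from the imaging story:
# sorting the day's scans by body position reduces machine repositioning
# to a single changeover. The 7.5-minute figure is an assumed midpoint
# of the 5-10 minutes cited in the article.

CHANGEOVER_MIN = 7.5

def changeover_minutes(schedule: list[str]) -> float:
    """Total minutes lost switching between scan positions in a schedule."""
    switches = sum(1 for a, b in zip(schedule, schedule[1:]) if a != b)
    return switches * CHANGEOVER_MIN

mixed = ["upper", "lower", "upper", "lower", "upper", "lower"]
batched = sorted(mixed)  # all "lower" scans first, then all "upper"

print(changeover_minutes(mixed))    # 37.5
print(changeover_minutes(batched))  # 7.5
```

Multiplied across every machine and every working day, that per-day difference is how minutes become the "many hours" and millions of dollars the provider reported.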

In most of the US, value-based care is still in its infancy. The longitudinal view is not found widely in health care. But Wyatt says his service can help businesses stuck in the fee-for-service model too. For example, one surgical practice suffered lots of delays and cancellations because the necessary paperwork wasn’t complete the day before surgery. Lumeon helped them build a system that knew what tests were needed before each surgery and that prompted staff to get them done on time. The system required coordination of many physicians and labs.

Another example of a solution that is valuable in fee-for-service contexts is creating a reminder for calling colonoscopy patients when they need to repeat the procedure. Each patient has to be called at a different time interval, which can be years in the future.
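Because each patient's recall interval differs, the scheduling logic has to track a per-patient due date rather than a fixed calendar. A minimal sketch, with illustrative field names and intervals (real recall intervals are a clinical decision):

```python
# Minimal sketch of per-patient recall scheduling: each patient's next
# reminder falls at an interval set by their own risk profile. Field
# names and intervals are illustrative assumptions, not Lumeon's model.

from datetime import date, timedelta

def next_recall(last_procedure: date, interval_years: int) -> date:
    """Date the patient should be called to schedule a repeat procedure."""
    return last_procedure + timedelta(days=365 * interval_years)

def due_today(patients: list[dict], today: date) -> list[str]:
    """Names of patients whose recall date has arrived."""
    return [p["name"] for p in patients
            if next_recall(p["last"], p["interval_years"]) <= today]

patients = [
    {"name": "A", "last": date(2014, 5, 1), "interval_years": 3},
    {"name": "B", "last": date(2016, 5, 1), "interval_years": 10},
]
print(due_today(patients, date(2018, 4, 23)))  # ['A']
```

The point is that a workflow engine can hold these open-ended, years-long timers reliably, where a paper tickler file or a busy front desk cannot.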

Lumeon has been in business 12 years and serves about 60 providers in the UK and Europe, some very large. They provide the service on a SaaS basis, running on a HIPAA-compliant AWS cloud except in the UK, where they run their own data center in order to interact with legacy National Health Service systems.

The company has encountered along the way an enormous range of health care disciplines, with organizations ranging from small to huge in size, and some needing only a simple alerting service while others re-imagined the whole patient journey. Wyatt says that their design process helps the care provider articulate the care pathway they want to support and then automate it. Certainly, a powerful and flexible platform is needed to support so many services. As Wyatt said, “Health care is not linear.” He describes three key parts to the Lumeon system:

  1. Integration engine. This is what allows them to interact with the EHR, as well as with other IT systems such as Salesforce. Often, the unique workflow system developed by Lumeon for the site can pop up inside the EHR interface, which is important because doctors hate to exit a workflow and start up another.

    Any new system they encounter–for instance, some institutions have unique IT systems they created in-house–can be plugged in by developing a driver for it. Wyatt made this seem like a small job, which underscores that the lack of data exchange among hospitals is due to business and organizational factors, not technical EHR problems. Web services and growing support for FHIR make integration easier.

  2. Communications. Like the integration engine, this has a common substrate and a multiplicity of interfaces so doctors, patients, and all those involved in the health care journey can use text, email, web forms, and mobile apps as they choose.

  3. Workflow or content engine. Once they learn the system, clinicians can develop pathways without going back to Lumeon for support. The body scan solution mentioned earlier is an example of a solution designed and implemented entirely by the clinical service on its own.

Transparency is another benefit of a good workflow design. In most environments, staff must remember complex sequences of events that vary from patient to patient (ordering labs, making referrals, etc.). The sequence is usually opaque to the patient herself. A typical Lumeon design will show the milestones in a visual form so everybody knows what steps took place and what remain to be done.
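The "driver" idea behind the integration engine is a classic adapter pattern: each external system (EHR, CRM, home-grown IT) gets a small driver exposing one common interface, so workflow code never depends on a concrete system. A hedged sketch, with interface and class names that are assumptions for illustration:

```python
# Sketch of the "driver" idea from the integration engine: every external
# system plugs in behind one common interface. The in-memory stand-ins
# below replace what would be real FHIR HTTP calls or file imports.

from abc import ABC, abstractmethod

class SystemDriver(ABC):
    """Common contract every site-specific driver must satisfy."""

    @abstractmethod
    def fetch_patient(self, patient_id: str) -> dict: ...

class FHIRDriver(SystemDriver):
    """Driver for a system exposing a FHIR-style interface."""
    def __init__(self, records: dict):
        self._records = records  # stands in for calls to a FHIR server
    def fetch_patient(self, patient_id: str) -> dict:
        return self._records[patient_id]

class LegacyCSVDriver(SystemDriver):
    """Driver for a hypothetical home-grown system that exports rows."""
    def __init__(self, rows: list[list[str]]):
        self._index = {r[0]: {"id": r[0], "name": r[1]} for r in rows}
    def fetch_patient(self, patient_id: str) -> dict:
        return self._index[patient_id]

# Workflow code depends only on SystemDriver, never on a concrete system.
def get_name(driver: SystemDriver, patient_id: str) -> str:
    return driver.fetch_patient(patient_id)["name"]

fhir = FHIRDriver({"p1": {"id": "p1", "name": "Ada"}})
legacy = LegacyCSVDriver([["p1", "Ada"]])
print(get_name(fhir, "p1"), get_name(legacy, "p1"))  # Ada Ada
```

Under this design, supporting a new in-house system really is "a small job": write one driver, and every existing pathway can use it.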

Wyatt describes Lumeon as a big step beyond most current workflow and messaging solutions. It will be interesting to watch the company’s growth, and to see which of its traits are adopted by other health IT firms.

NY-Based HIE Captures One Million Patient Consents

Posted on September 28, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

One of the big obstacles to the free exchange of health data is obtaining patient consent to share that data. It’s all well and good if we can bring exchange partners onto a single data sharing format, but if patients don’t consent to that exchange things get ugly. It’s critical that healthcare organizations solve this problem, because without patient consent HIEs are dead in the water.

Given these issues, I was intrigued to read a press release from HEALTHeLINK, an HIE serving Western New York, which announced that it had obtained one million patient consents to share their PHI. HEALTHeLINK connects nearly 4,600 physicians, along with hospitals, health plans and other healthcare providers. It’s part of a larger HIE, the Statewide Health Information Network of New York.

How did HEALTHeLINK obtain the consents? Apparently, there was no magic involved. The HIE made consent forms available at hospitals and doctors’ offices throughout its network, as well as making the forms available for download at whyhealthelink.com. (It may also have helped that they can be downloaded in any of 12 languages.)

I downloaded the consent form myself, and I must say it’s not complicated.

Patients only need to fill out a single page, which gives them the option to a) permit participating providers to access all of their electronic health information via the HIE, b) allow full access to the data except for specific participants, c) permit health data sharing only with specific participants, d) only offer access to their records in an emergency situation, and e) forbid HIE participants to access their health data even in the case of an emergency situation.
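These five options read like a small access-control policy, and an HIE's software has to evaluate them on every lookup. A sketch of how that evaluation might work, where the option letters follow the list above and everything else is an illustrative assumption:

```python
# Sketch of the five consent options as an access-control check. Option
# letters mirror the consent form; names and signature are assumptions.

from enum import Enum

class Consent(Enum):
    ALL = "a"             # all participating providers may access
    ALL_EXCEPT = "b"      # everyone except named participants
    ONLY_LISTED = "c"     # only named participants
    EMERGENCY_ONLY = "d"  # access only in an emergency
    NONE = "e"            # no access, even in an emergency

def may_access(consent, provider, listed=frozenset(), emergency=False):
    """Return True if this provider may view the patient's records."""
    if consent is Consent.ALL:
        return True
    if consent is Consent.ALL_EXCEPT:
        return provider not in listed
    if consent is Consent.ONLY_LISTED:
        return provider in listed
    if consent is Consent.EMERGENCY_ONLY:
        return emergency
    return False  # Consent.NONE

print(may_access(Consent.ALL, "dr_smith"))                             # True
print(may_access(Consent.ONLY_LISTED, "dr_smith", {"dr_jones"}))       # False
print(may_access(Consent.EMERGENCY_ONLY, "dr_smith", emergency=True))  # True
```

Keeping the policy this legible is arguably part of why a one-page form works: each checkbox maps to one unambiguous rule.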

About 95% of those who consented chose option a, which seems a bit remarkable to me. Given the current level of data breaches in the news, I would’ve predicted that more patients would opt out to some degree.

Nonetheless, the vast majority of patients gave treating providers the ability to view their lab reports, medication history, diagnostic images and several additional categories of health information.

I wish I could tell you what HEALTHeLINK has done to inspire trust, but I don’t know completely. I suspect, however, that provider buy-in played a significant role here. While none of this is mentioned in the HIE’s press release or even on its website, I’m betting that the HIE team did a good job of firing up physicians. After all, if you’re going to pick someone patients would trust, physicians would be your best choice.

On the other hand, it’s also possible patients are beginning to get the importance of having all of the data available during care. While much of health IT is too abstruse for the layman (or woman), the idea that doctors need to know your medical history is clearly beginning to resonate with your average patient.

Direct, Sequoia Interoperability Projects Continue To Grow

Posted on May 15, 2017 | Written By


While its fate may still be uncertain – as with any interoperability approach in this day and age – the Direct exchange network seems to be growing at least. At the same time, it looks like the Sequoia Project’s interoperability efforts, including the Carequality Interoperability Framework and its eHealthExchange Network, are also expanding rapidly.

According to a new announcement from DirectTrust, the number of health information service providers who engaged in Direct exchanges increased 63 percent during the first quarter of 2017 compared with the same period in 2016, to almost 95,000. And, to put this growth in perspective, there were just 5,627 providers involved in Q1 of 2014.

Meanwhile, the number of trusted Direct addresses which could share PHI grew 21 percent, to 1.4 million, as compared with the same quarter of 2016. Again, for perspective, consider that there were only 182,279 such addresses available three years ago.

In addition, the Trust noted, there were 35.6 million Direct exchange transactions during the quarter, up 76 percent over the same period last year. It expects to see transaction levels hit 140 million by the end of this year.

Also, six organizations joined DirectTrust during the first quarter of 2017, including Sutter Health, the Health Record Banking Alliance, Timmaron Group, Moxe Health, Uticorp and Anne Arundel Medical Center. This brings the total number of members to 124.

Of course, DirectTrust isn’t the only interoperability group throwing numbers around. In fact, Sequoia recently issued a statement touting its growth numbers as well (on the same day as the Direct announcement, natch).

On that day, the Project announced that the Carequality Interoperability Framework had been implemented by more than 19,000 clinics, 800 hospitals and 250,000 providers.

It also noted that its eHealth Exchange Network, a healthcare data sharing network, had grown 35 percent over the past year, connecting participants in 65 percent of all US hospitals, 46 regional and state HIEs, 50,000 medical groups, more than 3,400 dialysis centers and 8,300 pharmacies. This links together more than 109 million patients, Sequoia reported.

So what does all of this mean? At the moment, it’s still hard to tell:

  • While Direct and Sequoia are expanding pretty quickly, there are few phenomena to which we can compare their growth.
  • Carequality and CommonWell agreed late last year to share data across each other’s networks, so comparing their transaction levels to other entities would probably be deceiving.
  • Though the groups’ lists of participating providers may be accurate, many of those providers could be participating in other efforts and therefore be counted multiple times.
  • We still aren’t sure what metrics really matter when it comes to measuring interoperability success. Is it the number of transactions initiated by a provider? The number of data flows received? The number of docs and facilities who do both and/or incorporate the data into their EMR?

As I see it, the real work going forward will be for industry leaders to decide what kind of performance stats actually equate to interoperability success. Otherwise, we may not just be missing health sharing bullseyes, we may be firing at different targets.

Exchange Value: A Review of Our Bodies, Our Data by Adam Tanner (Part 3 of 3)

Posted on January 27, 2017 | Written By


The previous part of this article raised the question of whether data brokering in health care is responsible for raising or lowering costs. My argument that it increases costs looks at three common targets for marketing:

  • Patients, who are targeted by clinicians for treatments they may not need or may not have thought of

  • Doctors, who are directed by pharma companies toward expensive drugs that might not pay off in effectiveness

  • Payers, who pay more for diagnoses and procedures because analytics help doctors maximize charges

Tanner flags the pharma industry for selling drugs that perform no better than cheaper alternatives (Chapter 13, page 146), and even drugs that are barely effective at all despite having undergone clinical trials. Anyway, Tanner cites Hong Kong and Europe as places far more protective of personal data than the United States (Chapter 14, page 152), and they don’t suffer higher health care costs–quite the contrary.

Strangely, there is no real evidence so far that data sales have produced either harm to patients or treatment breakthroughs (Conclusion, page 163). But the supermarket analogy does open up the possibility that patients could be induced to share anonymized data voluntarily by being reimbursed for it (Chapter 14, page 157). I have heard this idea aired many times, and it fits with the larger movement called Vendor Relationship Management. The problem with such ideas is the close horizon limiting our vision in a fast-moving technological world. People can probably understand and agree to share data for particular research projects, with or without financial reimbursement. But many researchers keep data for decades and recombine it with other data sets for unanticipated projects. If patients are to sign open-ended, long-term agreements, how can they judge the potential benefits and potential risks of releasing their data?

Data for sale, but not for treatment

In Chapter 11, Tanner takes up the perennial question of patient activists: why can drug companies get detailed reports on patient conditions and medications, but my specialist has to repeat a test on me because she can’t get my records from the doctor who referred me to her? Tanner mercifully shields us from the technical arguments behind this question–sparing us, for instance, a detailed discussion of vagaries in HL7 specifications or workflow issues in the use of Health Information Exchanges–but strongly suggests that the problem lies with the motivations of health care providers, not with technical interoperability.

And this makes sense. Doctors do not have to engage in explicit “blocking” (a slippery term) to keep data away from fellow practitioners. For a long time they were used to just saying “no” to requests for data, even after that was made illegal by HIPAA. But their obstruction is facilitated by vendors equally uninterested in data exchange. Here Tanner discards his usual pugilistic journalism and gives Judy Faulkner an easy time of it (perhaps because she was a rare CEO polite enough to talk to him, and also because she expressed an ethical aversion to sharing patient data) and doesn’t air such facts as the incompatibilities between different Epic installations, Epic’s tendency to exchange records only with other Epic installations, and the difficulties it creates for companies that want to interconnect.

Tanner does not address a revolution in data storage that many patient advocates have called for, which would at one stroke address both the Chapter 11 problem of patient access to data and the book’s larger critique of data selling: storing the data at a site controlled by the patient. If the patient determined who got access to data, she would simply open it to each new specialist or team she encounters. She could also grant access to researchers and even, if she chooses, to marketers.

What we can learn from Chapter 9 (although Tanner does not tell us this) is that health care organizations are poorly prepared to protect data. In this woeful weakness they are just like TJX (owner of the T.J. Maxx stores), major financial institutions, and the Democratic National Committee. All of these leading institutions have suffered breaches enabled by weak computer security. Patients and doctors may feel reluctant to put data online in the current environment of vulnerability, but there is nothing special about the health care field that makes it more vulnerable than other institutions. Here again, storing the data with the individual patient may break it into smaller components and therefore make it harder for attackers to find.

Patient health records present new challenges, but the technology is in place and the industry can develop consent mechanisms to smooth out the processes for data exchange. Furthermore, some data will still remain with the labs and pharmacies that have to collect it for financial reasons, and the Supreme Court has given them the right to market that data.

So we are left with ambiguities throughout the area of health data collection. There are few clear paths forward and many trade-offs to make. In this I agree ultimately with Tanner. He said that his book was meant to open a discussion. Among many of us, the discussion has already started, and Tanner provides valuable input.

Zero Marginal Cost and Healthcare

Posted on December 21, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I stumbled upon an old post from the always insightful and interesting 3G Doctor blog about the concept of ‘Zero Marginal Cost.’ Here’s a great quote they use in the post from Albert Wenger, Managing Partner at Union Square Ventures:

“Why is everyone going online? It turns out there’s a simple answer to that: Kittens. Everyone wants to see Kittens. Well there’s actually something more to this, there’s something serious to this because when I downloaded this image from Flickr there was no noticeable cost to anybody. The marginal cost of creating a copy in the digital world is zero and that is driving all the changes that we’re seeing… …and we’re just at the beginning of this change”

I love the concept of zero (or at least near zero) marginal costs. It’s the premise of so many of the amazing things we experience on the internet. What’s troubling is that healthcare hasn’t embraced the idea of zero marginal costs. At least not in the way that it could.

In healthcare, we still like to talk about how much it’s going to cost a patient to get access to their medical records. There are literally state laws which say how much you can charge. Just writing this after writing about the marginal costs of delivering something electronically makes the concept sound silly. Imagine if your bank charged you per sheet to print out your statements each month. That’s basically what we’re asked to do in healthcare.

We’ve started to see some change in this, but there’s still resistance. There’s a real, palpable fear among many in healthcare that giving free access to all of your patient info could lead to really ugly problems. While there might be a few outlier cases people could identify, I’d argue the opposite. Think about the really ugly problems that occur in healthcare because patients don’t have their health information.

It’s time for healthcare to put down their excuses and embrace the benefits that zero marginal costs of sharing health information can provide. I’m not saying we should do it recklessly. We should be thoughtful in how we do it, but we should do it. It’s no longer a technical or security challenge, it’s just a cultural challenge.

Can Interoperability Drive Value-Based Care?

Posted on December 14, 2016 | Written By


As the drive to interoperability has evolved over the last few decades — and those of you who are HIT veterans know that these concerns go at least that far back — open data sharing has gone from being a “nice to have” to a presumed necessity for providing appropriate care.

And along the way, backers of interoperability efforts have expanded their goals. While the need to support coordinated care has always been a basis for the discussion, today the assumption is that value-based care simply isn’t possible without data interoperability between providers.

I don’t disagree with the premise. However, I believe that many providers, health systems and ACOs have significant work to do before they can truly benefit from interoperability. In fact, we may be putting the cart before the horse in this case.

A fragmented system

At present, our health system is straining to meet the demand for care coordination among the populations it serves. That may be in part because the level of chronic illness in the US is particularly high. According to one Health Affairs study, two out of three Americans will have a chronic condition by the year 2030. Add that to the need to care for patients with episodic care needs and the problem becomes staggering.

While some health organizations, particularly integrated systems like the Cleveland Clinic and staff-model managed care plans like Kaiser Permanente, plan for and execute well on care coordination, most others have too many siloes in place to do the job correctly. Though many health systems have installed enterprise EMRs like Epic and Cerner, and share data effectively while the patient remains within their system, they may do very little to integrate information from community providers, pharmacies, laboratories or diagnostic imaging centers.

I have no doubt that when needed, individual providers collect records from these community organizations. But collecting records on the fly is no substitute for following patients in a comprehensive way.

New models required

Given this history, I’d argue that many health systems simply aren’t ready to take full advantage of freely shared health data today, much less under value-based care payment models of the future.

Before they can use interoperable data effectively, provider organizations will need to integrate outside data into their workflow. They’ll need to put procedures in place for how care coordination works in their environment. This will include not only deciding who integrates outside data and how, but also how organizations will respond as a whole.

For example, hospitals and clinics will need to figure out who handles care coordination tasks, how many resources to pour into this effort, how this care coordination effort fits into the larger population health strategy and how to measure whether they are succeeding or failing in their care coordination efforts. None of these are trivial tasks, and the questions they raise won’t be answered overnight.

In other words, even if we achieved full interoperability across our health system tomorrow, providers wouldn’t necessarily be able to leverage it right away, and unfettered health data sharing won’t necessarily help providers win at value-based care, at least not at first. In fact, I’d argue that it’s dangerous to act as though interoperability can magically make this happen. Even if full interoperability is necessary, it’s not sufficient. (And of course, even getting there seems like a quixotic goal to some, including myself.)

Planning ahead

That being said, health organizations probably do have time to get their act together on this front. The move to value-based care is happening quickly, but not at light speed, so they do have time to make plans to leverage interoperable health data.

But unless they acknowledge the weaknesses of their current system, which in many cases is myopic, siloed and rigid, interoperability may do little to advance their long-term goals. They’ll have to admit that their current systems are far too inward-looking, and that the problem will only go away if they take responsibility for fixing it.

Otherwise, even full interoperability may do little to advance value-based care. After all, all the data in the world won’t change anything on its own.

A Look At Nursing Home Readiness For HIE Participation

Posted on October 12, 2016 | Written By


A newly released study suggests that nursing homes have several steps to go through before they are ready to participate in health information exchanges. The study, which appeared in the AHIMA-backed Perspectives in Health Information Management, was designed to help researchers understand the challenges nursing homes faced in health information sharing, as well as what successes they had achieved to date.

As the study write up notes, the U.S. nursing home population is large — nearly 1.7 million residents spread across 15,600 nursing homes as of 2014. But unlike other settings that care for a high volume of patients, nursing homes haven’t been eligible for EMR incentive programs that might have helped them participate in HIEs.

Not surprisingly, this has left the homes at something of a disadvantage, with very few participating in networked health data sharing. And this is a problem in caring for residents adequately, as their care is complex, involving nurses, physicians, physicians’ offices, pharmacists and diagnostic testing services. So understanding what potential these homes have to connect is a worthwhile topic of study. That’s particularly the case given that little is known about HIE implementation and the value of shared patient records across multiple community-based settings, the study notes.

To conduct the study, researchers conducted interviews with 15 nursing home leaders representing 14 homes in the midwestern United States that participated in the Missouri Quality Improvement Initiative (MOQI) national demonstration project.  Studying MOQI participants helped researchers to achieve their goals, as one of the key technology goals of the CMS-backed project is to develop knowledge of HIE implementations across nursing homes and hospital boundaries and determine the value of such systems to users.

The researchers concluded that incorporating HIE technology into existing work processes would boost use and overall adoption. They also found that participation inside and outside of the facility, providing employees with appropriate training and retraining, and getting others to use the HIE would have a positive effect on health data sharing projects. Meanwhile, getting the HIE operational and putting policies for technology use in place were challenges on the table for these institutions.

Ultimately, the study concluded that nursing homes considering HIE adoption should look at three areas of concern before getting started.

  • One area was the home’s readiness to adopt technology. Without the right level of readiness, any HIE project is likely to fail, and nursing home-based data exchanges are no exception. This is particularly important in a niche like this one, which never enjoyed the outside boost to technology culture that hospitals and doctors received under Meaningful Use.
  • Another area identified by researchers was the availability of technology resources. The researchers didn’t specify whether they meant access to the technology itself or the internal staff or consultants needed to execute the project, but both seem like important considerations in light of this study.
  • The final area researchers identified as critical for making a success of HIE adoption in nursing homes was the ability to match new clinical workflows to the work already getting done in the homes. This, of course, is important in any setting where leaders are considering major new technology initiatives.

Too often, discussions of health data sharing leave out major sectors of the healthcare economy like this one. It’s good to take a look at what full participation in health data sharing with nursing homes could mean for healthcare.

The Variables that Need Adjusting to Make Health Data Sharing a Reality

Posted on October 7, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

During today’s #HITsm chat, John Trader offered this fascinating quote from Susannah Fox, CTO at HHS:

I quickly replied with the following:

This concept is definitely worth exploring. There are a lot of things in life that we want. However, that doesn’t mean we want them enough to actually do them. I want to be skinny and muscular. I don’t want it enough to stop eating the way I do and start working out in a way that would help me lose weight and become a chiseled specimen of a man. The problem is that there are different levels of “want.”

This applies aptly to data sharing in healthcare. Most of us want data sharing to happen. I’ve gone so far as to say that I think most patients assume it’s already happening; most don’t realize that it’s not. Most caregivers want the data shared as well. What doctor wants to see a patient with limited information? The more high-quality information a doctor has, the better they can do their job. So, yes, they want to share patient data so they can help the people they serve (i.e., their patients).

The problem is that most patients and caregivers don’t want it enough. They’re ok with data sharing. They think that data sharing is beneficial. They might even think that data sharing is the right thing to do. However, they don’t want it enough to make it a reality.

It’s worth acknowledging that there’s a second part of this equation: Difficulty. If something is really difficult to do, then your level of “want” needs to be extremely high to overcome those difficulties. If something is really easy to do, then your level of want can be much lower.

For the programmer geeks out there:

If (Difficulty > Want) Then End

If (Want >= Difficulty) Then ResultAchieved
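Expanding that pseudocode into a runnable sketch (the function, names, and numeric levels are purely illustrative, not anything formal from the post):

```python
def attempt(want: float, difficulty: float) -> str:
    """Toy model: an outcome happens only when desire outweighs difficulty."""
    if difficulty >= want:
        return "nothing happens"
    return "result achieved"

# The two levers discussed below: lower the difficulty, or raise the want.
print(attempt(want=2.0, difficulty=8.0))  # e.g., a relatively healthy patient
print(attempt(want=9.0, difficulty=8.0))  # e.g., a chronically ill patient
```

The point of the sketch is simply that there are two independent variables to adjust, and moving either one past the other flips the outcome.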

When we talk about healthcare data sharing, it’s really difficult to do and people’s “want” is generally low. There are a few exceptions. Chronically ill patients have a much bigger “want” to solve the problem of health data sharing. So, some of them overcome the difficulty and are able to share the data. Relatively healthy patients don’t have a big desire to get and share their health data, so they don’t do anything to overcome the challenge of getting and sharing that data.

If we want health data sharing, we have to change the variables. We can either make health data sharing easier (something many are working to accomplish) or we can provide (or create the perception of) more value to patients and caregivers so that they “want” it more. Until that happens, we’re unlikely to see things change.

CommonWell and Healthcare Interoperability

Posted on October 6, 2016 | Written By John Lynn

Healthcare Scene sat down with Scott Stuewe, Director at Cerner Network and Daniel Cane, CEO & Co-Founder at Modernizing Medicine, where we talked about Cerner’s participation in CommonWell and Modernizing Medicine’s announcement to join CommonWell. This was a great opportunity to learn more about the progress CommonWell has made.

During our discussion, we talk about where CommonWell is today and where it’s heading in the future. Plus, we look at some of the use cases where CommonWell works today and where it hasn’t yet built out that capability. We also talk about how the CommonWell member companies are working together to make healthcare interoperability a reality, along with the array of interoperability solutions that will be needed beyond CommonWell. Finally, we look at where healthcare interoperability is headed.

In the “After Party” video we continued our conversation with Scott and Daniel where we talked about the governance structure for CommonWell and how it made decisions. We also talked about the various healthcare standards that are available and where we’re at in the development of those standards. Plus, we talk about the potential business model for EHR vendors involved in CommonWell. Scott and Daniel finish off by talking about what we really need to know about CommonWell and where it’s heading.

CommonWell is a big part of many large EHR vendors’ interoperability plans. Being familiar with what they’re doing is important to understanding how healthcare data sharing will or won’t happen in the future.

Healthcare Data Standards Tweetstorm from Arien Malec

Posted on May 20, 2016 | Written By John Lynn

If you don’t follow Arien Malec on Twitter, you should. He’s got strong opinions and an inside perspective on the real challenges associated with healthcare data interoperability.

As proof, check out the following Healthcare Standards tweetstorm he posted (pulled out of the tweet thread for easy reading):

1/ Reminder: #MU & CEHRT include standards for terminology, content, security & transport. Covers eRx, lab, Transitions of Care.

2/ If you think we “don’t have interop” b/c no standards name, wrong.

3/ Standards could be ineffective, may be wrong, may not be implemented in practice, or other elts. missing

4/ But these are *different* problems from “gov’t didn’t name standards” & fixes are different too.

5/ e.g., “providers don’t want 60p CCDA documents” – data should be structured & incorporated.

6/ #actually both (structured data w/terminology & incorporate) are required by MU/certification.

7/ “but they don’t work” — OK, why? & what’s the fix?

8/ “Government should have invested in making the standards better”

9/ #actually did. NLM invested in terminology. @ONC_HealthIT invested in CCDA & LRU projects w/ @HL7, etc.

10/ “government shouldn’t have named standards unless they were known to work” — would have led to 0 named

11/ None of this is to say we don’t have silos, impediments to #interoperability, etc.

12/ but you can’t fix the problem unless you understand it first.

13/ & “gov’t didn’t name standards” isn’t the problem.

14/ So describe the problems, let’s work on fixing them, & abandon magical thinking & 🦄. The End.

Here was my immediate response to the tweetstorm:

I agree with much of what Arien says about there being standards and the government having named those standards. That isn’t the reason the exchange of health information isn’t happening. As he says in his 3rd tweet above, the standards might not be effective, they may be implemented poorly, they might be missing elements, and so on. However, you can’t say there wasn’t a standard or that the government didn’t choose one.

Can we just all be honest with ourselves and admit that many people in healthcare don’t want health data to be shared? If they did, we’d have solved this problem.

The good news is that there are some signs that this is changing. However, moving someone from not wanting to share data is hard and usually happens in steps. A company or individual doesn’t change its culture to one of open data sharing overnight.