Health Data Standardization Project Proposes “One Record Per Person” Model

Posted on October 13, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

When we sit around the ol’ HIT campfire and swap interoperability stories, many of us have little to do but gripe.

Is FHIR going to solve all of our interoperability problems? Definitely not right away, and who knows if it ever will? Can we get the big EMR vendors to share and share alike? They’ll try, but there’s always a major catch. And so on.

I don’t know if the following offers a better story than any of the others, but at least it’s a new one, or at least new to me. Folks, I’m talking about the Standard Health Record (SHR), an approach to health data sharing that doesn’t fall precisely into any of the other buckets I’m aware of.

SHR is based at The MITRE Corporation, which also hosts virtual patient generator Synthea. Rather than paraphrase, let’s let the MITRE people behind SHR tell you what they’re trying to accomplish:

The Standard Health Record (SHR) provides a high quality, computable source of patient information by establishing a single target for health data standardization… Enabled through open source technology, the SHR is designed by, and for, its users to support communication across homes and healthcare systems.

Generalities aside, what is an SHR? According to the project website, the SHR specification will contain all information critical to patient identification, emergency care and primary care along with background on social determinants of health. In the future, the group expects the SHR to support genomics, microbiomics and precision medicine.
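The project site doesn’t publish a simple wire example, but conceptually a “single target” means every source system maps its local fields onto one shared, computable structure. Here is a hypothetical sketch of what one standardized record element might look like; the field names are illustrative only and are not taken from the SHR spec:

```python
import json

# Hypothetical sketch of a "single target" record element. Every sending
# system maps its local fields onto this one shared structure, so a
# receiver needs exactly one parser instead of one per EMR vendor.
# Field names are illustrative, not from the actual SHR specification.
shr_like_entry = {
    "person_id": "urn:example:patient:123",
    "element": "BloodPressure",
    "systolic_mmHg": 128,
    "diastolic_mmHg": 84,
    "taken_at": "2017-10-01T09:30:00Z",
    "source_system": "clinic-ehr-a",
}

# Round-trip through JSON, as a receiving system would.
parsed = json.loads(json.dumps(shr_like_entry))
print(parsed["element"], parsed["systolic_mmHg"])
```

The point of the “one record per person” model is exactly this: the mapping work happens once, at the sender, against a single agreed-upon target.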

Before we dismiss this as another me-too project, it’s worth giving the collaborative’s rationale a look:

The fundamental problem is that today’s health IT systems contain semantically incompatible information. Because of the great variety of the data models of EMR/EHR systems, transferring information from one health IT system to another frequently results in the distortion or loss of information, blocking of critical details, or introduction of erroneous data. This is unacceptable in healthcare.

The approach of the Standard Health Record (SHR) is to standardize the health record and health data itself, rather than focusing on exchange standards.

As a less-technical person, I’m not qualified to say whether this can be done in a way that will be widely accepted, but the idea certainly seems intuitive.

In any event, no one is suggesting that the SHR will change the world overnight. The project seems to be at the beginning stages, with collaborators currently prototyping health record specifications leveraging existing medical record models. (The current SHR spec can be found here.)

Still, I’d love to see this succeed, because it is at least a fairly straightforward idea: create a single source of truth for health data.

Consumer Data Liquidity – The Road So Far, The Road Ahead – #HITsm Chat Topic

Posted on August 23, 2017 | Written By

John Lynn is the Founder of the blog network, which currently consists of 10 blogs containing over 8000 articles, with John having written over 4000 of the articles himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit, and on LinkedIn.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 8/25 at Noon ET (9 AM PT). This week’s chat will be hosted by Greg Meyer (@Greg_Meyer93) on the topic of “Consumer Data Liquidity – The Road So Far, The Road Ahead.”

As my summer tour of interoperability forums, lectures, and webinars winds down, patient engagement/data liquidity is arguably the hottest talk in town.  This leads me to a time of reflection looking back on my own personal experience over the last 10-15 years (yes, I’m still a fairly young guy), starting with early attempts to access my own family’s records, moving on to witnessing the consumer revolution of Dave deBronkart and Regina Holliday, and finally tracking the progression of HealthIT and public health legislation.  We’ve come a long way from the ubiquity of paper and binders and Xerox (oh my) to CDs and PDFs to, most recently, CDAs, Direct, and FHIR, with the latter paving the way for a new breed of apps and tools.

With the lightning speed of change in technology and disruption vis-à-vis consumer devices, one would expect a dramatic shift in the consumer experience over the past 10 years, with nirvana in the not too distant future.  Contrary to intuitive thinking, we haven’t come as far as we would like to think.  Even with legislation and a progression of technology such as C-CDA, OpenNotes, Direct, BlueButton, FHIR, and the promise of apps to bring it all together, pragmatically a lot of the same core broken processes and frustrations still exist today.  In July, ONC released a study on the health records request process based on a small sampling of consumers and 50 large health organizations.  Although most of the stories include modern technical capabilities, the processes reek of variance and inefficiencies that have persisted since the long-lost days of the house call.

Not to paint the whole state of affairs in gloom: there is still a potentially bright future not too far ahead.  With the convergence of forces from contemporary technical standards and recent legislation like the 21st Century Cures Act, consumer data liquidity is staying at the forefront of public health.  And let’s not forget the consumer.  It is partly the consumer revolution, with patients demanding portability of their records, that is forcing providers and vendors to open their systems as platforms of accessibility instead of fostering silos and walled gardens.

This week’s chat will explore the progression of health data access from the consumer’s perspective.

Here are the questions that will serve as the framework for this week’s #HITsm chat:
T1: Describe your perception/experiences of consumer data access 10-15 years ago. #HITsm

T2: Contrast your previous experience to today. Is your experience better, worse, or the same? #HITsm

T3: What gaps exist between what is available today (data, apps, networks, etc.) vs what you would like to have? #HITsm

T4: Would you prefer to manage/move your data yourself or expect HealthIT to do it for you? #HITsm

T5: Beyond FHIR, APIs, and apps, what is the future of consumer access and data liquidity? #HITsm

Bonus: Remember “Gimme My DaM Data?” What would be your slogan for consumer access? #HITsm

Upcoming #HITsm Chat Schedule
9/1 – Digital Strategies for Improving Consumer Experience
Hosted by Kyra Hagan (@HIT_Mktg_Maven from @InfluenceHlth)

9/8 – Digital Health Innovation in Pharma
Hosted by Naomi Fried (@naomifried)

We look forward to learning from the #HITsm community! As always, let us know if you’d like to host a future #HITsm chat or if you know someone you think we should invite to host.

If you’re searching for the latest #HITsm chat, you can always find it and the schedule of upcoming chats here.

HL7 Releases New FHIR Update

Posted on April 3, 2017 | Written By

HL7 has announced the release of a new version of FHIR designed to link it with real-world concepts and players in healthcare, marking the third of five planned updates. It’s also issuing the first release of the US Core Implementation Guide.

FHIR release 3 was produced with the cooperation of hundreds of contributors, and the final product incorporates the input of more than 2,400 suggested changes, according to project director Grahame Grieve. The release is known as STU3 (Standard for Trial Use, release 3).

Key changes to the standard include additional support for clinical quality measures and clinical decision support, as well as broader functionality to cover key clinical workflows.

In addition, the new FHIR version includes incremental improvements and increased maturity of the RESTful API, further development of terminology services and new support for financial management. It also defines an RDF format and specifies how FHIR relates to linked data.

HL7 is already gearing up for the release of FHIR’s next version. It plans to publish the first draft of version 4 for comment in December 2017 and review comments on the draft. It will then have a ballot on the version, in April 2018, and publish the new standard by October 2018.

Among those contributing to the development of FHIR is the Argonaut project, which brings together major US EHR vendors to drive industry adoption of FHIR forward. Grieve calls the project a “particularly important” part of the FHIR community, though it’s hard to tell how far along its vendor members have come with the standard so far.

To date, few EHR vendors have offered concrete support for FHIR, but that’s changing gradually. For example, in early 2016 Cerner released an online sandbox for developers designed to help them interact with its platform. And earlier this month, Epic announced the launch of a new program, helping physician practices to build customized apps using FHIR.

In addition to the vendors, which include athenahealth, Cerner, Epic, MEDITECH and McKesson, several large providers are participating. Beth Israel Deaconess Medical Center, Intermountain Healthcare, the Mayo Clinic and Partners HealthCare System are on board, as well as the SMART team at the Boston Children’s Hospital Informatics Program.

Meanwhile, the work of developing and improving FHIR will continue.  For release 4 of FHIR, the participants will focus on record-keeping and data exchange for the healthcare process. This will encompass clinical data such as allergies, problems and care plans; diagnostic data such as observations, reports and imaging studies; medication functions such as order, dispense and administration; workflow features such as task, appointment schedule and referral; and financial data such as claims, accounts and coverage.

Eventually, when release 5 of FHIR becomes available, developers should be able to help clinicians reason about the healthcare process, the organization says.

Epic and other EHR vendors caught in dilemmas by APIs (Part 2 of 2)

Posted on March 16, 2017 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The first section of this article reported some news about Epic’s Orchard, a new attempt to provide an “app store” for health care. In this section we look over the role of APIs as seen by EHR vendors such as Epic.

The Roles of EHRs

Dr. Travis Good, with whom I spoke for this article, pointed out that EHRs glom together two distinct functions: a canonical, trusted store for patient data and an interface that becomes a key part of the clinician workflow. They are being challenged in both these areas, for different reasons.

As a data store, EHRs satisfied user needs for many years. The records organized the data for billing, treatment, and compliance with regulations. If there were problems with the data, they stemmed not from the EHRs but from how they were used. We should not blame the EHR if the doctor upcoded clinical information in order to charge more, or if coding was too primitive to represent the complexity of patient illness. But clinicians and regulators are now demanding functions that EHRs are fumbling at fulfilling:

  • More and more regulatory requirements, which intelligent software would calculate on its own from data already in the record, but which most EHRs require the physician to fill out manually

  • Patient-generated data, which may be entered by the patient manually or taken from devices

  • Data in streamlined formats for large-scale data analysis, for which institutions are licensing new forms of databases

Therefore, while the EHR still stores critical data, it is not the sole source of truth and is having to leave its borders porous in order to work with other data sources.

The EHR’s second function, as an interface that becomes part of the clinicians’ hourly workflow, has never been fulfilled well. EHRs are the most hated software among their users. And that’s why users are calling on them to provide APIs that permit third-party developers to compete at the interface level.

So if I were to write a section titled “The Future of Current EHRs” it could conceivably be followed by a blank page. But EHRs do move forward, albeit slowly. They must learn to be open systems.

With this perspective, Orchard looks like doubling down on an obsolete strategy. The limitations and terms of service give the impression that Epic wants to remain a one-stop shopping service for customers. But if Epic adopted the SMART approach, with more tolerance for failure and options for customers, it would start to reap the benefits promised by FHIR and foster health care innovation.

Epic and Other EHR Vendors Caught in Dilemmas by APIs (Part 1 of 2)

Posted on March 15, 2017 | Written By

The HITECH Act of 2009 (part of the American Recovery and Reinvestment Act) gave an unprecedented boost to an obscure corner of the IT industry that produced electronic health records. For the next eight years they were given the opportunity to bring health care into the 21st century and implement common-sense reforms in data sharing and analytics. They largely squandered this opportunity, amassing hundreds of millions of dollars while watching health care costs ascend into the stratosphere, and preening themselves over modest improvements in their poorly functioning systems.

This was not solely a failure of EHR vendors, of course. Hospitals and clinicians also needed to adopt agile methods of collaborating and using data to reduce costs, and failed to do so. They’re sweating profusely now, as shown in protests by the American Medical Association and major health care providers over legislative changes that will drastically reduce their revenue through cuts to insurance coverage and Medicaid. EHR vendors will feel the pain of a thousand cuts as well.

I recently talked to Dr. Travis Good, CEO of Datica, which provides data integration and storage to health care providers. We discussed the state of EHR interoperability, the roles of third-party software vendors, and in particular the new “app store” offered by Epic under the name Orchard. Although Datica supports integration with a dozen EHRs, 70% of their business involves Epic. So we’ll start with the new Orchard initiative.

The Epic App Store

Epic, like most vendors, has offered an API over the past few years that gives programmers at hospitals access to patient data in the EHR. This API now complies with the promising new standard for health data, FHIR, and uses the resources developed by the Argonaut Project. So far, this is all salutary and positive. Dr. Good points out, however, that EHR vendors such as Epic offer the API mostly to extract data. They are reluctant to allow data to be inserted programmatically, claiming it could allow errors into the database. The only change one can make, usually, is to add an annotation.

This seriously hampers the ability of hospitals or third-party vendors to add new value to the clinical experience. Analytics benefit from a read-only data store, but to reach in and improve the doctor’s workflow, an application must be able to write new data into the database.
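To make the read-only posture concrete, here is a minimal sketch (this is an illustration of the pattern Good describes, not Epic’s actual API surface): a FHIR-style endpoint serves GET requests but refuses programmatic writes.

```python
# Sketch of a read-mostly vendor API: reads succeed, writes are refused.
# This illustrates the pattern described above; it is NOT Epic's actual
# API, and the resource content is invented for illustration.
PATIENTS = {
    "123": {"resourceType": "Patient", "id": "123",
            "name": [{"family": "Example", "given": ["Pat"]}]},
}

def handle(method: str, resource_type: str, rid: str, body=None):
    """Route a request, allowing only reads of stored resources."""
    if method == "GET":
        return 200, PATIENTS.get(rid)
    if method in ("POST", "PUT", "DELETE"):
        # Vendors commonly disable these, citing data-integrity concerns.
        return 405, {"issue": "write access not enabled"}
    return 400, None

status, patient = handle("GET", "Patient", "123")
status_w, err = handle("PUT", "Patient", "123", body={})
print(status, status_w)  # reads work; the write is rejected with 405
```

An analytics app lives happily on the GET side of this wall; an app that wants to improve the clinician’s workflow runs straight into the 405.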

More risk springs from controls that Epic is putting on the apps uploaded to Orchard. Like Apple’s App Store, which inspired Orchard, Epic’s app store vets every app and allows in only the apps that it finds useful. For a while, the terms of service allowed Epic access to the data structures of the app. What this would mean in practice is hard to guess, but it suggests a prurient interest on the part of Epic in what its competitors are doing. We can’t tell where Epic’s thinking is headed, though, because the public link to the terms of service was recently removed, leaving a 404 message.

Good explained that Epic potentially could track all the transactions between the apps and their users, and in particular will know which ones are popular. This raises fears among third-party developers that Epic will adopt their best ideas and crowd them out of the market by adding the features to its own core system, as Microsoft notoriously did during the 1980s when it dominated the consumer software market.

Epic’s obsession with control can be contrasted with the SMART project, an open platform for health data developed by researchers at Harvard Medical School. They too offer an app gallery (not a store), but their goal is to open the repository to as wide a collection of contributors as possible. This maximizes the chances for innovation. As described at one of their conferences, control over quality and fitness for use would be exerted by the administrator of each hospital or other institution using the gallery. This administrator would choose which apps to make available for clinical staff to download.

Of course, SMART apps also work seamlessly cross-platform, which distinguishes them from the apps provided by individual vendors. Eventually–ideally–FHIR support will allow the apps in Orchard and from other vendors to work on all EHRs that support FHIR. But the standard is not firm enough to allow this–there are too many possible variations. People who have followed the course of HITECH implementation know the history of interoperability, and how years of interoperability showcases at HIMSS have been mocked by the real incompatibilities between EHRs out in the field.

To understand how EHRs are making use of APIs, we should look more broadly at their role in health care. That will be the topic of the next section of this article.

Don’t Yell FHIR in a Hospital … Yet

Posted on November 30, 2016 | Written By

The following is a guest blog post by Richard Bagdonas, CTO and Chief Healthcare Architect at MI7.
The Fast Healthcare Interoperability Resources standard, commonly referred to as FHIR (pronounced “fire”), has a lot of people in the healthcare industry hopeful for interoperability between electronic health record (EHR) systems and external systems — enabling greater information sharing.

As we move into value-based healthcare and away from fee-for-service healthcare, one thing becomes clear: care is no longer siloed to one doctor and most certainly not to one facility. Think of the numerous locations a patient must visit when getting a knee replaced. They start at their general practitioner’s office, then go to the orthopedic surgeon, followed by the radiology center, then to the hospital, often back to the ortho’s office, and finally to one or more physical therapists.

Currently the doctor’s incentives are not aligned with the patient’s. If the surgery needs to be repeated, the insurance company and patient pay for it again. In the future the doctor will be judged and rewarded or penalized for their performance in what is called the patient’s “episode of care.” All of this coordination between providers requires that the parties involved become intimately aware of everything happening at each step in the process.

This all took off back in 2011 when Medicare began an EHR incentive program providing $27B in incentives to doctors at the 5,700 hospitals and 235,000 medical practices to adopt EHR systems. Hospitals would receive $2M and doctors would receive $63,750 when they put in the EHR system and performed some basic functions proving they were using it under what has been termed “Meaningful Use” or MU.

EHR manufacturers made a lot of money selling systems leveraging the MU incentives. The problem most hospitals ran into is that their EHR didn’t come with integrations to external systems. Integration is typically done using a 30-year-old standard called Health Level 7 (HL7). The EHR can talk to outside systems using HL7, but only if the interface is turned on and both systems use the same version. EHR vendors typically charge thousands of dollars, and sometimes tens of thousands, to turn on each interface. This is why interface engines have been all the rage: they turn one interface into multiple.

The great part of HL7 is that it is a standard. The bad parts of HL7 are that a) there are 11 standards, b) not all vendors use all standards, c) most EHRs are still using version 2.3, which was released in 1997, and d) each EHR vendor messes up the HL7 standard in its own unique way, causing untold headaches for integration project managers across the country. The joke in the industry is if you have seen one EHR integration, you’ve seen “just one.”
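Because each vendor bends HL7 v2 slightly, integration teams end up hand-tuning parsers for delimited text like the following. This is a minimal sketch; the message content is invented for illustration, and real parsers must handle far more vendor quirks:

```python
# Minimal sketch of HL7 v2.x parsing. HL7 v2 messages are delimited
# text: segments separated by carriage returns, fields by "|", and
# components by "^". The sample ADT message below is invented.
sample = (
    "MSH|^~\\&|SENDING_EHR|HOSPITAL|RECEIVER|CLINIC|201701011200||ADT^A01|MSG001|P|2.3\r"
    "PID|1||12345^^^HOSPITAL^MR||DOE^JANE||19800101|F"
)

def parse_hl7(message: str) -> dict:
    """Index each segment (MSH, PID, ...) by its segment name."""
    segments = {}
    for seg in message.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

msg = parse_hl7(sample)
# MSH-12 (the version) lands at list index 11 here, because the field
# separator itself counts as MSH-1 in HL7 numbering.
version = msg["MSH"][11]
family, given = msg["PID"][5].split("^")[:2]
print(version, family, given)
```

Every vendor that shifts a field, omits a component, or invents a Z-segment forces another special case into code like this, which is exactly why interface engines stay in business.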

HL7 versions over the years

HL7 version 3.0 which was released in 2005 was supposed to clear up a lot of this integration mess. It used the Extensible Markup Language (XML) to make it easier for software developers to parse the healthcare messages from the EHR, and it had places to stick just about all of the data a modern healthcare system needs for care coordination. Unfortunately HL7 3.0 didn’t take off and many EHRs didn’t build support for it.

FHIR is the new instantiation of HL7 3.0, using JavaScript Object Notation (JSON), and optionally XML, to do similar things with more modern technology concepts such as Representational State Transfer (REST), with HTTP requests to GET, PUT, POST, and DELETE these resources. Developers love JSON.
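In practice, the RESTful style means each resource lives at its own URL and is manipulated with standard HTTP verbs. A hedged sketch follows; the base URL and resource content are invented, and no request is actually sent:

```python
import json
from urllib.request import Request

# Sketch of FHIR's RESTful style: each resource has a URL and is
# manipulated with standard HTTP verbs. The base URL and patient
# content are invented for illustration; nothing is sent over the wire.
BASE = "https://fhir.example.org/baseR3"

patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter"]}],
}

# GET    {BASE}/Patient/example  -> read
# PUT    {BASE}/Patient/example  -> update (or create with a known id)
# POST   {BASE}/Patient          -> create (server assigns the id)
# DELETE {BASE}/Patient/example  -> delete
req = Request(
    f"{BASE}/Patient/{patient['id']}",
    data=json.dumps(patient).encode(),
    headers={"Content-Type": "application/fhir+json"},
    method="PUT",
)
print(req.method, req.full_url)
```

Compare that to assembling a pipe-delimited HL7 v2 message and it is easy to see why developers coming from the web world find FHIR approachable.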

FHIR is not ready for prime time, and based on how HL7 versions have been rolled out over the years, it will not be used in a very large percentage of medical facilities for several years. The problem FHIR creates for vendors is that it gives a medical facility a method to port EHR data from one manufacturer’s system to another. EHR manufacturers don’t want to let this happen, so it is doubtful they will completely implement FHIR — especially since it is not a requirement of MU.

And FHIR is still not hardened. There have been fifteen versions of FHIR released over the last two years with six incompatible with earlier versions. We are a year away at best from the standard going from draft to release, so plan on there being even more changes.

15 versions of FHIR since 2014 with 6 that are incompatible with earlier versions

Another reason for questioning FHIR’s impact is that the standard allows several ways to transmit and receive data besides HTTP requests. One EHR may use sockets, another file-folder delivery, and another HTTP requests. This means the need for integration engines still exists, and as such the value of moving to FHIR may be reduced.

Lastly, the implementation of FHIR’s query-able interface means hospitals will have to decide whether to host all of their data in a cloud-based system for outside entities to use, or to become massive data centers running the numerous servers needed so that patients’ mobile-device queries don’t take down the EHR when physicians need it for mission-critical use.

While the data geek inside me loves the idea of FHIR, my decades of experience performing healthcare integrations with EHRs tell me there is more smoke than there is FHIR right now.

My best advice when it comes to FHIR is to keep using the technologies you have today and if you are not retired by the time FHIR hits its adoption curve, look at it with fresh eyes at that time. I will be eagerly awaiting its arrival, someday.

About Richard Bagdonas
Richard Bagdonas has over 12 years of experience integrating software with more than 40 electronic health record system brands. He is an expert witness on HL7 and EDI-based medical billing. Richard served as a technical consultant to the US Air Force and Pentagon in the mid-1990s and authored 4 books on telecom/data network design and engineering. Richard is currently the CTO and Chief Healthcare Architect at MI7, a healthcare integration software company based in Austin, TX.

ONC’s Interoperability Standards Advisory Twitter Chat Summary

Posted on September 2, 2016 | Written By

The following is a guest blog post by Steve Sisko (@ShimCode).

Yesterday the Office of the National Coordinator for Health Information Technology (ONC) hosted an open chat to discuss its DRAFT 2017 Interoperability Standards Advisory (ISA) artifacts.  The chat was moderated by Steven Posnack, Director of the Office of Standards and Technology at ONC, and used the #ISAchat hashtag under the @HealthIT_Policy account. The @ONC_HealthIT Twitter account also weighed in.

It was encouraging to see that the ONC hosted a tweetchat to share information and solicit feedback and questions from interested parties. After a little bit of a rough start and clarification of the objectives of the chat, the pace of interactions increased and some good information and ideas were exchanged. In addition, some questions were raised; some of which were answered by Steven Posnak and some of which were not addressed.

What’s This All About?

This post summarizes all of the tweets from the #ISAchat. I’ve organized the tweets as best as I could and I’ve excluded re-tweets and most ‘salutatory’ and ‘thank you’ tweets.

Note: The @hitechanswers account shared a partial summary of the #ISAchat on 8/31/16, but it included less than half of the tweets shared in this post. So you’re getting the complete scoop here.

Topic 1: Tell us about the ISA (Interoperability Standards Advisory)
Account Tweet Time
@gratefull080504 Question: What is the objective of #ISAchat?   12:04:35
@onc_healthit To spread the word and help people better understand what the ISA is about 12:05:00
@gratefull080504 Question: What are today’s objectives, please? 12:08:43
@onc_healthit Our objective is to educate interested parties. Answer questions & hear from the creators 12:11:02
@johnklimek “What’s this I hear about interoperability?” 12:12:00
@cperezmha What is #PPDX? What is #HIE? What is interoperability? What is interface? #providers need to know the differences. Most do not. 12:14:41
@techguy Who is the target audience for these documents? 12:44:06
@healthit_policy HITdevs, CIOs, start-ups, fed/state gov’t prog admins. Those that have a need to align standards 4 use #ISAchat 12:46:18
@ahier No one should have to use proprietary standards to connect to public data #ISAchat 12:46:19
Reference Materials on ISA
@shimcode Ok then, here’s the “2016 Interoperability Standards Advisory”
@shimcode And here’s “Draft 2017 Interoperability Standards Advisory” 12:07:38
@stephenkonya #ICYMI Here’s the link to the @ONC_HealthIT 2017 DRAFT Interoperability Standards Advisory (ISA): 12:10:57
@techguy Question: Do you have a good summary blog post that summarizes what’s available in the ISA? 12:52:15
@onc_healthit We do! Authored by @HealthIT_Policy and Chris Muir – both of whom are in the room for #ISAchat 12:53:15
@healthit_policy Good? – The ISA can help folks better understand what standards are being implemented & at what level 12:06:29
@healthit_policy Getting more detailed compared to prior versions due largely to HITSC & public comments 12:29:48
@healthit_policy More work this fall on our side to make that come to fruition. In future, we’re aiming for a “standards wikipedia” approach 12:33:03
@survivorshipit It would be particularly helpful to include cited full documents to facilitate patient, consumer participation 12:40:22
@davisjamie77 Seeing lots of references to plans to “explore inclusion” of certain data. Will progress updates be provided? 12:50:00
@healthit_policy 1/ Our next milestone will be release of final 2017 ISA in Dec. That will rep’snt full transition to web 12:51:15
@healthit_policy 2/ after that future ISA will be updated more regularly & hopefully with stakeholder involved curation 12:52:21
@bjrstn Topic:  How does the ISA link to the Interoperability Roadmap? 12:51:38
@cnsicorp How will #ISA impact Nationwide Interoperability Roadmap & already established priorities? 12:10:49
@healthit_policy ISA was 1st major deliverable concurrent w/ Roadmap. Will continue to b strong/underlying support to work 12:13:49
@healthit_policy ISA is 1 part of tech & policy section of Roadmap. Helps add transparency & provides common landscape 12:53:55
@healthit_policy Exciting thing for me is the initiated transition from PDF to a web-based/interactive experience w/ ISA 12:30:51
@onc_healthit Web-based version of the ISA can be found here: We welcome comments! 12:32:04
Little <HSML> From a Participant on the Ease of Consuming ISA Artifacts
@techguy So easy to consume!
@healthit_policy If I knew you better I’d sense some sarcasm :) that said, working on better nav approaches too 12:43:36
@techguy You know me well. It’s kind of like the challenge of EHRs. You can only make it so usable given the reqs. 12:45:36
@shimcode I think John forgot to enclose his tweet with <HSML> tags (Hyper Sarcasm Markup Language) 12:46:48
Don’t Use My Toothbrush!
@ahier OH (Overheard) at conference: “Standards are like toothbrushes, everyone has one and no one wants to use yours”
Topic 2: What makes this ISA different than the previous drafts you have issued?
Account Tweet Time
@cnsicorp #Interoperability for rural communities priority 12:32:40
@healthit_policy Rural, underserved, LTPAC and other pieces of the interoperability puzzle all important #ISAchat 12:35:33
@cnsicorp “more efficient, closer to real-time updates and comments…, hyperlinks to projects…” 12:47:15
@shimcode Question: So you’re not providing any guidance on the implementation of interoperability standards? Hmm… 12:21:10
@gratefull080504 Question: Are implementation pilots planned? 12:22:51
@healthit_policy ISA reflects what’s out there, being used & worked on. Pointer to other resources, especially into future #ISAchat 12:24:10
@ahier The future is here it’s just not evenly distributed (yet) #ISAchat 12:25:15
@healthit_policy Yes, we put out 2 FOAs for High Impact Pilots & Standards Exploration Awards 12:25:56
@healthit_policy HHS Announces $1.5 Million in Funding Opportunities to Advance Common Health Data Standards. Info here:
Topic 3: If you had to pick one of your favorite parts of the ISA, what would it be?
Account Tweet Time
@shimcode The “Responses to Comments Requiring Additional Consideration” section. Helps me understand ONC’s thinking. 12:45:32
@healthit_policy Our aim is to help convey forward trajectory for ISA, as we shift to web, will be easier/efficient engagement 12:47:47
@healthit_policy Depends on sections. Some, like #FHIR, @LOINC, SNOMED-CT are pointed to a bunch. 12:49:15
@gratefull080504 Question: What can patients do to support the objectives of #ISAchat ? 12:07:02
@gratefull080504 Question: Isn’t #ISAChat for patients? Don’t set low expectations for patients 12:10:44
@gratefull080504 I am a patient + I suffer the consequences of lack of #interoperability 12:12:26
@healthit_policy Certainly want that perspective, would love thoughts on how to get more feedback from patients on ISA 12:12:35
@gratefull080504 What about patients? 12:13:03
@gratefull080504 First step is to ensure they have been invited. I am happy to help you after this chat 12:13:57
@survivorshipit Think partly to do w/cascade of knowledge–>as pts know more about tech, better able to advocate 12:15:21
@healthit_policy Open door, numerous oppty for comment, and representation on advisory committees. #MoreTheMerrier 12:15:52
@gratefull080504 I am currently on @ONC_HealthIT Consumer Advisory Task Force Happy to contribute further 12:17:08
@healthit_policy 1 / The ISA is technical in nature, & we haven’t gotten any comments on ISA before from patient groups 12:08:54
@healthit_policy 2/ but as we look to pt generated health data & other examples of bi-directional interop, we’d like to represent those uses in ISA 12:09:51
@resultant TYVM all! Trying to learn all i can about #interoperability & why we’re not making progress patients expect 13:09:22
@shimcode Question: Are use cases being developed in parallel with the Interoperability Standards? 12:13:28
@shimcode Value of standards don’t lie in level of adoption of std as a whole, but rather in implementation for a particular use case. 12:16:33
@healthit_policy We are trying to represent broader uses at this point in the “interoperability need” framing in ISA 12:18:58
@healthit_policy 2/ would be great into the future to have more detailed use case -> interop standards in the ISA with details 12:19:49
@healthit_policy Indeed, royal we will learn a lot from “doing” 12:20:40
@shimcode IHE Profiles provide a common language to discuss integration needs of healthcare sites and… Info here: 12:29:12
@techguy I’d love to see them take 1 section (say allergies) and translate where we’d see the standards in the wild. 12:59:04
@techguy Or some example use cases where people want to implement a standard and how to use ISA to guide it. 13:00:38
@healthit_policy Check out links now in ISA to the Interop Proving Ground – projects using #ISAchat standards. Info here: 13:02:54
@healthit_policy Thx for feedback, agree on need to translate from ISA to people seeing standards implemented in real life 13:01:08
Commenting on ISA Artifacts
@healthit_policy We want to make the #ISA more accessible, available, and update-able to be more current compared to 1x/yr publication
@cperezmha #interoperability lowers cost and shows better outcomes changing the culture of healthcare to be tech savvy is key 12:35:10
@healthit_policy One new feature we want to add to web ISA is citation ability to help document what’s happ’n with standards 12:37:12
@shimcode A “discussion forum” mechanism where individual aspects can be discussed & rated would be good. 12:39:53
@healthit_policy Good feedback. We’re looking at that kind of approach as an option. ISA will hopefully prompt debate 12:40:50
@shimcode Having to scroll through all those PDF’s and then open them 1 by 1 only to have to scroll some more is VERY inefficient. 12:41:25
@shimcode Well, I wouldn’t look/think too long about it. Adding that capability is ‘cheap’ & can make it way easier on all. 12:43:48
Question: What Can Be Learned About Interoperability from the Private Sector?
@shimcode Maybe @ONC_HealthIT can get input from Apple’s latest #healthIT purchase/Gliimpse? What do they know of interoperability?
@healthit_policy > interest from big tech cos and more mainstream awareness is good + more innovation Apple iOS has CCDA sprt 12:22:59
Testing & Tools
@drewivan I haven’t had time to count, but does anyone know approximately how many different standards are included in the document?
@healthit_policy Don’t know stat off had, but we do identify and provide links for test tools as available. 12:56:31
@drewivan And what percentage of them have test tools available? 12:54:38
@shimcode According to the 2017 ISA stds just released, a tiny fraction of them have test tools. See here: 12:58:02
@shimcode I take back “tiny faction” comment on test tools. I count 92 don’t have test tools, 46 do. No assessment of tool quality though. 13:08:31
@healthit_policy Testing def an area for pub-private improvement, would love to see # increase, with freely available too 12:59:10
@techguy A topic near and dear to @interopguy’s heart! 12:59:54
@resultant Perhaps we could replace a couple days of HIMSS one year with #interoperability testing? #OutsideBox 13:02:30
Walk on Topic: Promotion of ISA (Thank you @cperezmha)
What can HIE clinics do to help other non-users get on board? Is there a certain resource we should point them too to implement?
Account Tweet Time
@davisjamie77 Liking the idea of an interactive resource library. How will you promote it to grow use? 12:35:57
@healthit_policy A tweetchat of course! ;) Also web ISA now linking to projects in the Interoperability Proving Ground 12:39:04
@davisjamie77 Lol! Of course! Just seeing if RECs, HIEs, other #HIT programs might help promote. 12:40:44
@healthit_policy Exactly… opportunities to use existing relationships and comm channels ONC has to spread the word 12:41:28
@stephenkonya Question: How can we better align public vs private #healthcare delivery systems through #interoperability standards? 12:42:23
Miscellaneous Feedback from Participants
Account Tweet Time
@ahier Restful APIs & using JSON and other modern technologies 12:54:03
@waynekubick Wayne Kubick joining from #HL7 anxious to hear how #FHIR and #CCDA can help further advance #interoperability. 12:11:30
@resultant We all do! The great fail of #MU was that we spent $38B and did not get #interoperability 12:14:21
@waynekubick SMART on #FHIR can help patients access and gain insights from their own health data — and share it with care providers. 12:17:44
@resultant I think throwing money at it is the only solution… IMHO providers are not going to move to do it on their own… 12:20:44
@shimcode @Search_E_O your automatic RT’s of the #ISAChat tweets are just clouding up the stream. Why? smh 12:08:30
Do you see #blockchain making it into future ISA
@healthit_policy Phew… toughy. lots of potential directions for it. Going to segue my response into T2 12:28:58
@hitpol #blockchain for healthcare! ➡ @ONC_HealthIT blockchain challenge. Info here: 12:31:33
That’s All Folks!
@healthit_policy Thank you everyone for joining our #ISAchat! Don’t forget to leave comments.
PDF version

About Steve Sisko
Steve Sisko has over 20 years of experience in the healthcare industry and is a consultant focused on healthcare data, technology and services – mainly for health plans, payers and risk-bearing providers. Steve is known as @ShimCode on Twitter and runs a blog at You can learn more about Steve at his LinkedIn page and he can be contacted at

Schlag and Froth: Argonauts Navigate Between Heavy-weight and Light-weight Standardization (Part 2 of 2)

Posted on August 26, 2016 I Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site ( and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous section of this article laid out the context for the HL7 FHIR standard and the Argonaut project; now we can look at the current status.

The fruits of Argonaut are to be implementation guides that the project will encourage all EHR vendors to work from. These guides, covering a common clinical data set that has been defined by the ONC (and hopefully will not change soon), are designed to help vendors achieve certification so they can sell their products with the assurance that doctors using them will meet ONC regulations, which require a consumer-facing API. The ONC will also find certification easier if most vendors claim adherence to a single unambiguous standard.

The Argonaut implementation guides, according to Tripathi, will be complete in late September. Because the FHIR standard itself is not expected to be approved until September 2017, the Argonaut project will continue to refine and test the guides in the meantime. One guide already completed by the project covers security authorization using OpenID and OAuth. FHIR left the question of security up to those standards because they are well established and already exist in thousands of implementations around the Web.
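The authorization approach the guide covers builds on the standard OAuth2 authorization-code flow. As a rough sketch of the first step, here is how an app might construct the redirect URL that asks the EHR for access; the endpoints, client ID, and scopes below are hypothetical, not values from any actual guide:

```python
from urllib.parse import urlencode

# Hypothetical endpoints and client ID for illustration; a real app discovers
# the authorize/token URLs from the server's published configuration metadata.
AUTHORIZE_URL = "https://ehr.example.com/oauth2/authorize"
CLIENT_ID = "my-demo-app"
REDIRECT_URI = "https://app.example.com/callback"

def build_authorization_url(scope: str, state: str) -> str:
    """Build the OAuth2 authorization-code request URL."""
    params = {
        "response_type": "code",                 # authorization-code flow
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": scope,                          # e.g. "patient/Patient.read openid"
        "state": state,                          # CSRF token the app verifies on return
        "aud": "https://ehr.example.com/fhir",   # the FHIR server being accessed
    }
    return AUTHORIZE_URL + "?" + urlencode(params)

url = build_authorization_url("patient/Patient.read openid", "xyz123")
```

The EHR then redirects back with a one-time code, which the app exchanges for an access token; the point of reusing OAuth here is that this dance is already implemented and battle-tested across the Web.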

Achieving rough consensus

Tripathi portrays the Argonaut process as radically different from HL7 norms. HL7 has established its leading role in health standards by following the rules of the American National Standards Institute (ANSI) in the US, and similar bodies set up in other countries where HL7 operates. These come from the pre-Internet era and emphasize ponderous, procedure-laden formalities. Meetings must be held, drafts circulated, comments explicitly reconciled, ballots taken. Historically this has ensured that large industries play fair and hear through all objections, but the process is slow and frustrates smaller actors who may have good ideas but lack the resources to participate.

In contrast, FHIR brings together engineers and other interested persons in loose forums that self-organize around issues of interest. The process still tries to consider every observation and objection, and therefore, as we have seen, has taken a long time. But decision-making takes place at Internet speed and there is no jockeying for advantage in the marketplace. Only when a milestone is reached does the formal HL7 process kick in.

The Argonaut project works similarly. Tripathi reports that the vendors have gotten along very well. Epic and Cerner, the behemoths of the EHR field, are among the most engaged. Company managers don’t interfere with engineers’ opinions. And new vendors with limited resources are very active.

Those with a background in computers can recognize, in these modes of collaboration, the model set up by the Internet Engineering Task Force (IETF) decades ago. Like HL7, the IETF essentially pre-dated the Internet as we know it, which they helped to design. (The birth of the Internet is usually ascribed to 1969, and the IETF started in 1986, at an early stage of the Internet. FTP was the canonical method of exchanging their plain-text documents with ASCII art, and standards were distributed as Requests for Comments or RFCs.) The famous criteria cited by the IETF for approving standards are “rough consensus and running code.” FHIR and the Argonauts produce no running code, but they seem to operate through rough consensus, and the Argonauts could add a third criterion, “Get the most important 90% done and don’t let the rest hold you up.”

Tripathi reports that EHR vendors are now collaborating in this same non-rivalrous manner in other areas, including the Precision Medicine initiative, the Health Services Platform Consortium (HSPC), and the SMART on FHIR initiative.

What Next?

The dream of interoperability has long included the dream of a marketplace for apps, so that we’re not stuck with the universally hated EHR interfaces that clinicians struggle with daily, or awkwardly designed web sites for consumers. Tripathi notes that SMART offers an app gallery with applications that ought to work on any EHR that conforms to the open SMART platform. Cerner and athenahealth also have app stores protected by a formal approval process. (Health apps present more risk than the typical apps in the Apple App Store or Google Play, so they call for more careful, professional vetting.) Tripathi is certain that other vendors will follow in the lead of these projects, and that cross-vendor stores like SMART’s App Gallery will emerge in a few years along with something like a Good Housekeeping seal for apps.

The Argonaut guides will have to evolve. It’s already clear that EHR vendors are doing things that aren’t covered by the Argonaut FHIR guide, so there will be a few incompatible endpoints in their APIs. Consequently, the Argonaut project has a big decision to make: how to provide continuity? The project was deliberately pitched to vendors as a one-time, lightweight initiative. It is not a legal entity, and it does not have a long-term plan for stewardship of the outcomes.

The conversation over continuity is ongoing. One obvious option is to turn over everything to HL7 and let the guides fall under its traditional process. A new organization could also be set up. HL7 itself has set up the FHIR Foundation under a looser charter than HL7, probably (in my opinion) because HL7 realizes it is not nimble and responsive enough for the FHIR community.

Industries reach a standard in many different ways. In health care, even though the field is narrow, standards present tough challenges because of legacy issues, concerns over safety, and the complexity of human disease. It seems in this case that a blend of standardization processes has nudged forward a difficult process. Over the upcoming year, we should know how well it worked.

Schlag and Froth: Argonauts Navigate Between Heavy-weight and Light-weight Standardization (Part 1 of 2)

Posted on August 25, 2016 I Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site ( and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

You generally have to dwell in deep Nerdville to get up much excitement about technical standards. But one standard has been eagerly followed by thousands since it first reached the public eye in 2012: Fast Healthcare Interoperability Resources (FHIR). To health care reformers, FHIR embodies all the values and technical approaches they have found missing in health care for years. And the development process for FHIR is as unusual in health care as the role the standard is hoped to play.

Reform From an Unusual Corner

FHIR started not as an industry initiative but as a pet project of Australian Grahame Grieve and a few developers gathered around him. From this unusual genesis it was taken up by HL7, and an initial draft was released in March 2012. Everybody in health care reform rallied around FHIR, recognizing it as a viable solution to the long-stated need for application programming interfaces (APIs). The magic of APIs, in turn, is their potential to make data exchange easy and create a platform for innovative health care applications that need access to patient data.
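What makes that "magic" concrete in FHIR is its RESTful design: each resource lives at a predictable URL and comes back as plain JSON that any application can parse. A minimal sketch of the pattern; the base URL and patient data here are invented, though the JSON shape follows FHIR's Patient resource:

```python
import json

# The RESTful pattern FHIR defines: GET {base}/Patient/{id} returns a JSON
# "Patient" resource. The base URL below is a made-up example server.
FHIR_BASE = "https://fhir.example.com/r4"

def patient_read_url(patient_id: str) -> str:
    """URL for the FHIR 'read' interaction on a Patient resource."""
    return f"{FHIR_BASE}/Patient/{patient_id}"

# Instead of a live request, parse a hand-written response in the shape a
# server would return.
sample_response = """{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}"""

patient = json.loads(sample_response)
display_name = f'{patient["name"][0]["given"][0]} {patient["name"][0]["family"]}'
```

An app that can issue that GET and walk that JSON can pull a patient's record without any vendor-specific integration work, which is the platform opportunity the reformers saw.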

So, as a solution to the interoperability problems for which EHR vendors had been dunned by users and the US government, FHIR won immediate accolades. But these vendors knew they couldn’t trust normal software adoption processes to use FHIR interoperably; those processes had already failed on earlier standards.

HL7 version 2 had duly undergone a long approval process and had been implemented as an output document format by numerous EHR vendors, who would show off their work annually at an Interoperability Showcase in a central hall of the HIMSS conference. Yet all that time, out in the field, innumerable problems were reported. These failures are not just technical glitches, but contribute to serious setbacks in health care reform. For instance, complaints from Accountable Care Organizations are perennial.

Congress’s recent MACRA bill, follow-up HHS regulations, and pronouncements from government leaders make it clear that hospitals and their suppliers won’t be off the hook till they solve this problem of data exchange, which was licked decades ago by most other industries. It was by dire necessity, therefore, that an impressive array of well-known EHR vendors announced the maverick Argonaut project in December 2014. (I don’t suppose its name bears any relation to the release a few months before of a highly-publicized report from a short-lived committee called JASON.)

Argonaut includes major EHR vendors; health care providers such as Partners Healthcare, Mayo, Intermountain, and Beth Israel Deaconess; and other interested parties such as Surescripts, The Advisory Board, and Accenture. Government agencies, especially the ONC, and app developers have come on board as testers.

One of the leading Argonauts is Micky Tripathi, CEO of the Massachusetts eHealth Collaborative. Tripathi has been involved in health care reform and technical problems such as data exchange since long before these achieved notable public attention with the 2009 HITECH act. I had a chance to talk to him this week about the Argonauts’ progress.

Reaching a Milestone

FHIR is large and far-reaching but deliberately open-ended. Many details are expected to vary from country to country and industry to industry, and thus are left up to extensions that various players will design later. It is precisely in the extensions that the risk lurks of reproducing the Tower of Babel that exists in other health care standards.

The reason the industry has good hopes for success this time is the unusual way in which the Argonaut project was limited in both time and scope. It was not supposed to cover the entire health field, as standards such as the International Classification of Diseases (ICD) try to do. It would instead harmonize the 90% of cases seen most often in the US. For instance, instead of specifying a standard of 10,000 codes, it might pick out the 500 that the doctor is most likely to see. Instead of covering all the ways to take a patient’s blood pressure (sitting, standing, etc.), it recommends a single way. And it sticks closely to clinical needs, although it may well be extended for other uses such as pharma or Precision Medicine.
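The narrowing idea can be sketched in a few lines: conformance means drawing codes from a small, agreed-upon subset rather than from anywhere in the full terminology. The code names and counts below are invented stand-ins, not actual Argonaut bindings:

```python
# Stand-in for a huge terminology (think "10,000 codes")...
FULL_TERMINOLOGY = {f"code-{i}" for i in range(10_000)}

# ...versus the small subset an implementation guide binds a field to
# (think "the 500 codes a doctor is most likely to see").
GUIDE_SUBSET = {"code-1", "code-7", "code-42"}

def conforms(code: str, allowed: set) -> bool:
    """A profile-conformant record must use a code from the narrowed set."""
    return code in allowed

conforms("code-7", GUIDE_SUBSET)     # in the subset: accepted
conforms("code-999", GUIDE_SUBSET)   # valid in the full terminology, but rejected here
```

The payoff is that two systems that both conform to the subset can exchange data without first negotiating which of thousands of equally "valid" codes the other one happened to pick.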

Finally, instead of staying around forever to keep chopping off more tasks to solve, the Argonaut project would go away when it was done. In fact, it was supposed to be completed one year ago. But FHIR has taken longer than expected to coalesce, and in the meantime, the Argonaut project has been recognized as a fertile organization by the vendors. So they have extended it to deal with some extra tasks, such as an implementation guide for provider directories, and testing sprints.

That’s some history; the next section of this article will talk about the fruits of the Argonaut project and their plans for the future.

If MACRA Fails, It Will Be a Failure of IT, Not Doctors or Regulators

Posted on August 8, 2016 I Written By

The following is a guest blog by Steve Daniels, president of Able Health.

There has been a whole lot of mudslinging over the last month between regulators and healthcare providers over MACRA, which shifts Medicare payments further toward pay-for-performance starting January 1. On the one hand, CMS Acting Administrator Andy Slavitt is clear that CMS is ready for change. “We need to get out of the mode of paying physicians just to run tests and prescribe medicines,” he told a Senate Finance Committee hearing. Meanwhile, Dr. Thomas Eppes of the American Medical Association has called MACRA a “quantum shift” and pushed for a delay.

Yes, the Medicare Quality Payment Program instituted by MACRA should—and will—evolve based on comments made on the proposed rule. But the reality is the program provides enormous opportunity for providers to increase bonus payments, while streamlining reporting requirements across a patchwork of outdated and duplicative programs. And it’s worth noting that the potential penalties under the Merit-Based Incentive Payment System (MIPS) over the next four years are actually lower than the sum of the penalties of the programs it is replacing.

To meet MACRA goals, it will take a well-prepared team of providers and administrators—empowered by data and well-designed tools. Doctors can’t be solely responsible for achieving patient outcomes, reducing costs and documenting it all for CMS as they go. Unfortunately, the history of health IT has not been kind—or affordable—to doctors. And today, the health IT stack has a new challenge—keeping pace with the proliferation of value-based programs, from accessing data all the way through enabling new clinical practice.

We must move from a mindset of meeting Meaningful Use checkboxes toward supporting a more effective way of operating. And in the modern world of software-as-a-service, there’s no good reason left that IT needs to cost providers millions of dollars. We can do better. As things stand, if MACRA fails, it will be a failure of IT, not doctors or regulators.

Gathering all the data

For value-based care to work, patient data needs to be made available for providers to coordinate with each other, as well as to payers, to properly evaluate performance based on all known information. Those still blocking or jacking up prices for data access are complicit in obstructing the vision of a learning value-based system.

It is time to remove technical barriers through modern and open data standards like FHIR, as well as rules and unreasonable fees that prevent parties from accessing data when they need it. Thankfully, the Advancing Care Information performance category will reflect the emphasis on information exchange set forth in Meaningful Use Stage 3.

Calculating performance flexibly

The new era of performance-based pay requires continuous monitoring of quality and cost, with the ability to track progress across multiple programs on an ongoing basis. To measure quality today, we often use static algorithms hard-coded by EHR vendors and health system IT departments, conforming to standards set by NCQA or CMS.

But providers need tools that are tailored not just to one or two programs like Meaningful Use and PQRS, but across the organization’s full range of value-based programs as these programs continue to expand, evolve, and proliferate. With efforts to standardize IT for quality measures stalling, vendors need to focus less on one-size-fits-all quality measure calculations and more on flexible systems that enable measures to be rapidly constructed and customized to move with the trends. Expect change to be the norm.
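One way to read "flexible systems that enable measures to be rapidly constructed": represent each measure as data, a denominator predicate (who is eligible) plus a numerator predicate (who met the measure), so a new program requirement becomes a new entry in a registry rather than new hard-coded logic. This is only a sketch; the patient fields and thresholds below are invented for illustration, not any official measure specification:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Measure:
    name: str
    denominator: Callable[[dict], bool]   # who is eligible
    numerator: Callable[[dict], bool]     # who met the measure

def score(measure: Measure, patients: list) -> float:
    """Numerator / denominator rate for a measure over plain-dict patient records."""
    eligible = [p for p in patients if measure.denominator(p)]
    if not eligible:
        return 0.0
    met = [p for p in eligible if measure.numerator(p)]
    return len(met) / len(eligible)

# Example entry: diabetic patients whose HbA1c is under control
# (field names and the 8.0 threshold are illustrative).
a1c_control = Measure(
    name="Diabetes: HbA1c under control",
    denominator=lambda p: p.get("diabetic", False),
    numerator=lambda p: p.get("hba1c", 99.0) < 8.0,
)

patients = [
    {"diabetic": True, "hba1c": 7.1},
    {"diabetic": True, "hba1c": 9.4},
    {"diabetic": False, "hba1c": 5.5},
]
rate = score(a1c_control, patients)   # 1 of the 2 eligible patients met the measure
```

Swapping in next year's threshold, or a whole new program's measure set, then touches configuration rather than vendor release cycles, which is the kind of agility the paragraph above is arguing for.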

Informing new behaviors

With so many health IT professionals focused on gathering and reporting data, it is not surprising that design has taken a back seat so far. But this year, not a single population health vendor earned an “A” rating from Chilmark, due to poor user engagement and clinical workflow. This is no longer acceptable. The challenge of enabling the new clinical and administrative behaviors associated with value-based care is too vast. User experience must be top of mind for any IT implementation, with representative users involved from the start. We have seen the impact of poor user experience in the fee-for-service system, from frustrated clinicians to alarming patient safety issues.

Design is even more important when the challenge is not just documenting billing codes but also achieving health outcomes for patients across a care team. Don’t bombard clinicians with notifications and force clumsy form-filling. Instead, employ best practices from cognitive psychology to inform professionals with lightweight and intelligent touchpoints. Automate documentation and interpretation of data wherever possible.

A new era of health IT

Whether or not it’s delayed, the Quality Payment Program is coming. And the healthcare industry is moving inexorably toward value-based care. Will health IT step up to the challenge of building toward a value-based future that is accessible to all providers? Or will we sit back and wait for the next list of requirements?

About Steve Daniels
Steve Daniels is the President of Able Health, which helps providers succeed under MACRA and value-based programs. Formerly the design lead for IBM Watson for healthcare and a lifelong patient advocate, he is passionate about the role of open data exchange and intuitive experience design in fostering a continuously improving healthcare system. Find him on Twitter and LinkedIn.