
6 Healthcare Interoperability Myths

Posted on February 9, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

With my new fascination with healthcare interoperability, I’m drawn to anything and everything that looks at the successes and challenges associated with it. So, it was no surprise that I was intrigued by this whitepaper that looks at the 6 Healthcare Interoperability Myths.

For those who don’t want to download the whitepaper for all the nitty gritty details, here are the 6 myths:

  1. One Size Fits All
  2. There Is One Standard to Live By
  3. I Can Only “Talk” to Providers on the Same EHR as Mine
  4. If I Give Up Control of My Data, I’ll Lose Patients
  5. Hospitals Lead in Interoperability
  6. Interoperability Doesn’t Really “Do” Anything. It’s Just a Fad like HMOs in the 90s

You can read the whole whitepaper if you want all the details about each myth.

The first two hit home for me and remind me of my post about achieving continuous healthcare interoperability. I really think the idea of every health IT vendor “interpreting” the standard differently is an important problem that needs to be dealt with if we want to see healthcare interoperability happen.

Another question I’ve been chewing on is whether everyone believes that healthcare interoperability is the right path forward. The above-mentioned whitepaper starts off with a strong statement: “It’s no tall tale. Yes. We need interoperability.” While this is something I believe strongly, I’m not sure that everyone in healthcare agrees.

I’d love to hear your thoughts. Do we all want healthcare interoperability, or are there a lot of people out there who aren’t sure it’s the right way forward?

How Do We Achieve Continuous Healthcare Interoperability?

Posted on February 2, 2015 | Written By

John Lynn

Today I had a really interesting chat about healthcare interoperability with Mario Hyland, Founder of AEGIS. I’m looking at a number of ways that Mario and I can work together to move healthcare interoperability forward. We’ll probably start with a video hangout with Mario and then expand from there.

In the meantime, I was struck by something Mario said in our conversation: “Healthcare interoperability is not a point in time. You can be interoperable today and then not be tomorrow.”

This really resonated with me and no doubt resonates with doctors and hospitals who have an interface with some other medical organization. You know how easy it is for your interface to break. It’s never intentional, but these software systems are so large and complex that someone will make a change and not realize the impact that change will have across all your connections. As I wrote on Hospital EMR and EHR, API’s are Hard!

Currently we don’t even have a bunch of complex APIs with hundreds of companies connecting to the EHR. We’re lucky if an EHR has a lab interface, ePrescribing, maybe a radiology interface, and maybe a connection to a local hospital. Now imagine the issues that crop up when you’re connecting to hundreds of companies and systems. Mario was right when he told me, “Healthcare thinks we’re dealing with the complex challenges of healthcare interoperability. Healthcare doesn’t know the interoperability challenges that still wait for them and they’re so much more complex than what we’re dealing with today.”

I don’t say this to discourage anyone, but it should encourage us to be really thoughtful about how we handle healthcare interoperability so it can scale up. The title of this post asks a tough question that isn’t being solved by our current one-time approach to certification: how do we achieve continuous healthcare interoperability that won’t break on the next upgrade cycle?

I asked Mario why the current EHR certification process hasn’t been able to solve this problem, and he said that current EHR certification is more of a one-time visual inspection of interoperability. Unfortunately, it doesn’t include a single testing platform that really tests an EHR against a specific interoperability standard, let alone ongoing tests to make sure that any changes to the EHR don’t affect future interoperability.

I’ve long chewed on why we can have a “standard” for interoperability and yet that doesn’t mean EHR systems can actually interoperate. People have told me that there are flavors of the standard and each organization has a different flavor. I’ve seen this, but what I’ve found is that there are different interpretations of the same standard. When you dig into the details of any standard, you can see how easy it is for an organization to interpret it multiple ways.
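
To make that concrete, here is a contrived illustration (the segments, identifiers, and parsing code below are made up for this post, not taken from any real system). Both HL7 v2 PID segments arguably follow the standard, but they put the home phone number in different components of PID-13, so a receiver that parses only one style silently loses the data from the other:

  # Contrived example: two made-up HL7 v2 PID segments for the same patient.
  msg_a = "PID|1||12345^^^MRN||DOE^JANE||19800101|F|||123 MAIN ST^^SPRINGFIELD^IL^62701||(555)867-5309"
  msg_b = "PID|1||12345^^^MRN||DOE^JANE||19800101|F|||123 MAIN ST^^SPRINGFIELD^IL^62701||^PRN^PH^^^555^8675309"

  def home_phone_naive(segment: str) -> str:
      """A naive receiver that only reads the first component of PID-13."""
      pid13 = segment.split("|")[13]
      return pid13.split("^")[0]

  print(home_phone_naive(msg_a))  # "(555)867-5309"
  print(home_phone_naive(msg_b))  # "" -- the phone number is effectively lost

Multiply that kind of small divergence across hundreds of fields and hundreds of vendors and you get systems that all “meet the standard” but still can’t reliably exchange data.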

In my post API’s are Hard, the linked article talks about the written promise and the behavioral promise of an API. The same thing applies to a healthcare interoperability standard. There’s the documented standard (the written promise), and then there’s the way the EHR implements the standard (the behavioral promise).

In the API world, one company creates the API, so you have one behavioral promise to those who use it. Even with one company, tracking the behavioral promise can be a challenge. In the EHR world, each EHR vendor has implemented interoperability according to its own interpretation of the standard, so there are 300+ behavioral promises that have to be tracked and considered, one from each company, and heaven help us if and when a company changes its behavioral promise. That’s impossible to keep up with, and it explains one reason why healthcare interoperability isn’t a reality today.

What’s the solution? One approach is to create a set of standard test scripts that any EHR vendor can test against on an ongoing basis. That way, vendors can test the interoperability functionality of their applications throughout the development cycle. Ideally these test scripts would be offered as open source, which would allow multiple contributors to provide feedback and improve the scripts as errors in them are found. It’s just the nature of any standard, and of testing against that standard, that exceptions and errors will be found that need to be addressed.
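
To make the idea concrete, here is a minimal sketch of what one such test script might look like if it ran automatically on every build instead of once at certification time. It is only an illustration: the endpoint URL, the required-field list, and the choice of a FHIR Patient resource are my assumptions, not part of any existing certification program.

  # Hypothetical continuous interoperability check, suitable for running in a
  # vendor's build pipeline. Endpoint and field list are illustrative assumptions.
  import requests

  TEST_SERVER = "https://ehr-vendor-test.example.com/fhir"  # assumed test endpoint
  REQUIRED_PATIENT_FIELDS = ["identifier", "name", "gender", "birthDate"]

  def missing_patient_fields(patient_id: str) -> list:
      """Fetch a Patient resource and return any required fields it lacks."""
      resp = requests.get(f"{TEST_SERVER}/Patient/{patient_id}",
                          headers={"Accept": "application/json"}, timeout=10)
      resp.raise_for_status()
      patient = resp.json()
      return [field for field in REQUIRED_PATIENT_FIELDS if field not in patient]

  if __name__ == "__main__":
      missing = missing_patient_fields("example-patient-001")
      if missing:
          raise SystemExit(f"Interoperability check failed; missing fields: {missing}")
      print("Basic Patient field check passed.")

Run on every commit, a suite of scripts like this would catch the upgrade that quietly breaks an interface long before a partner organization notices.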

I mentioned that I was really interested in diving deeper into healthcare interoperability. I still have a lot deeper to go, but consider this the first toe dip into the healthcare interoperability waters. I really want to see this problem solved.

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3°F. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, there are an estimated 8 million deaths from sepsis, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It works by leveraging EHR data and automated surveillance, clinical content and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate specific needs at each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, to improve sensitivity and specificity and reduce alert fatigue by assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities that may mimic sepsis. This helps to ensure alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly specific and highly sensitive in order to minimize alert fatigue. In the case of this specific system, a 95% specificity and sensitivity rating has been achieved by constructing hundreds of variations of sepsis rules. For example, completely different rules are run for patients with liver disease versus those with end-stage renal disease. Doing so ensures clinicians only get alerts that are helpful.
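
As a drastically simplified sketch of that idea (not the vendor’s actual rules; the thresholds, lab names, and comorbidity flags below are hypothetical), comorbidity-aware logic might decide which lab abnormalities count toward an alert roughly like this:

  # Hypothetical illustration of comorbidity-aware alerting. Thresholds and
  # condition names are invented for this sketch, not clinical guidance.
  def organ_dysfunction_signals(labs: dict, comorbidities: set) -> list:
      """Return lab findings that should count toward a possible sepsis alert."""
      signals = []
      # Rising creatinine is suspicious in most patients, but it is an expected
      # baseline finding in end-stage renal disease, so it is ignored there.
      if labs.get("creatinine", 0) > 2.0 and "esrd" not in comorbidities:
          signals.append("elevated creatinine")
      # Elevated bilirubin is expected at baseline in chronic liver disease.
      if labs.get("bilirubin", 0) > 2.0 and "liver_disease" not in comorbidities:
          signals.append("elevated bilirubin")
      # Lactate elevation counts regardless of comorbidity in this sketch.
      if labs.get("lactate", 0) >= 2.0:
          signals.append("elevated lactate")
      return signals

  labs = {"creatinine": 3.1, "lactate": 1.2}
  print(organ_dysfunction_signals(labs, set()))       # ['elevated creatinine']
  print(organ_dysfunction_signals(labs, {"esrd"}))    # [] -- no false alert

The point of layering hundreds of such condition-specific variations is exactly what the paragraph above describes: suppressing alerts that merely mimic sepsis so the ones clinicians do see are worth acting on.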

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor within two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform, including lab results, pharmacy orders, Admit/Discharge/Transfer (ADT) messages and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to create the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alerts based on their clinical training. The system prompted the nursing staff to respond to each one, either with an acknowledgement or an override. If an alert was acknowledged, suggested guidance regarding the appropriate next steps was provided, such as alerting the physician or ordering diagnostic lactate tests, based on the facility’s specific protocols. If an alert was overridden, a reason had to be entered, and all overrides were logged, monitored and reported. If no action was taken, repeat alerts were fired, typically within 10 minutes. If repeat alerts were not acted upon, they were escalated to supervising personnel.
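
A rough sketch of that acknowledge/override/escalate loop is below. The function names, timing, and return values are assumptions made for illustration; the actual product’s workflow engine is certainly more involved.

  # Hypothetical sketch of the alert response flow described above.
  from typing import Optional
  import time

  REPEAT_AFTER_SECONDS = 10 * 60  # unanswered alerts repeat after ~10 minutes

  def handle_alert_response(alert: dict, response: Optional[dict]) -> dict:
      """Record a nurse's response to an alert, or schedule a repeat/escalation."""
      if response is None:
          # No action taken: re-fire the alert, then escalate to a supervisor.
          return {"alert": alert, "action": "repeat_then_escalate",
                  "repeat_after_seconds": REPEAT_AFTER_SECONDS}
      if response["decision"] == "acknowledge":
          # Acknowledged: surface the facility's protocol-driven next steps.
          return {"alert": alert, "action": "show_guidance",
                  "guidance": ["notify physician", "order lactate test"]}
      # Overridden: a documented reason is required, and everything is logged.
      if not response.get("reason"):
          raise ValueError("An override must include a documented reason.")
      return {"alert": alert, "action": "log_override",
              "reason": response["reason"], "logged_at": time.time()}

  print(handle_alert_response({"id": "A1"}, {"decision": "acknowledge"})["action"])  # show_guidance
  print(handle_alert_response({"id": "A1"}, None)["action"])                         # repeat_then_escalate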

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed-upon protocols.

Within three months, John Muir saw significant improvements in key sepsis compliance metrics: 80% compliance with patient screening protocols, lactate tests ordered for 90% of patients who met screening criteria, and initiation of early, goal-directed therapy for 75% of patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

Defining the Legal Health Record, Ensuring Quality Health Data, and Managing a Part-Paper Part-Electronic Record – Healthcare Information Governance

Posted on January 20, 2015 | Written By

John Lynn

This post is part of Iron Mountain’s Healthcare Information Governance: Big Picture Predictions and Perspectives Series which looks at the key trends impacting Healthcare Information Governance. Be sure to check out all the entries in this series.

Healthcare information governance (IG) has been important ever since doctors started tracking their patients in paper charts. However, over the past few years, adoption of EHR and other healthcare IT systems has exploded and provided a myriad of new opportunities and challenges associated with governance of a healthcare organization’s information.

Three of the most important health information governance challenges are:
1. Defining the legal health record
2. Ensuring quality health data
3. Managing a part-paper, part-electronic record

Defining the Legal Health Record
In the paper chart world, defining the legal health record was much easier. As we’ve shifted to an electronic world, the volume of data that’s stored in these electronic systems is so much greater. This has created a major need to define what your organization considers the legal health record.

The reality is that each organization now has to define its own legal health record based on CMS and accreditation guidelines, but also based on the specifics of their operation (state laws, EHR options, number of health IT systems, etc). The legal health record will only be a subset of the data that’s being stored by an EHR or other IT system and you’ll need to involve a wide group of people from your organization to define the legal health record.

Doing so is going to become increasingly important. Without a clearly defined legal health record, you’re going to produce inconsistent releases of information. That can lead to major liability issues in court cases, and it also matters when releasing health information to other doctors or even auditors.

One challenge we face in this regard is ensuring that EHR vendors provide a consistent and usable data output. A lot of thought has been put into how data is entered into the EHR, but not nearly as much effort has gone into the way an EHR outputs that data. This is a major health information governance challenge that needs to be addressed. Similarly, most EHR vendors haven’t put much thought and effort into data retention either. Retention policies are an important part of defining your legal health record, but your policy is subject to the capabilities of the EHR.

Working with your EHR and other healthcare IT vendors to ensure they can produce a consistent legal health record is one strategic imperative that every healthcare organization should have on their list.

Ensuring Quality Health Data
The future of healthcare is very much going to be data driven. Payments to ACOs are going to depend on data. The quality of care you provide using Clinical Decision Support (CDS) systems is going to rely on the quality of the data being used. Organizations are going to have new liability concerns that revolve around their data quality. Real-time data interoperability is going to become a reality, and everyone’s going to see everyone else’s data without a middleman first checking and verifying its quality before it’s sent.

A great health information governance program, led by a clinical documentation improvement (CDI) program, is going to be a key first step for every organization. Quality data doesn’t happen overnight; it requires a concerted effort over time. Organizations need to start now if they want to be successful in the coming data-driven healthcare world.

Managing a Part-Paper Part-Electronic Record
The health information world is becoming infinitely more complex. Not only do we have new electronic systems that store massive amounts of data, but we’re still required to maintain legacy systems and those old paper charts. Each of these requires time and attention to manage properly.

While we’d all love to just turn off legacy systems and dispose of old paper charts, data retention laws often mean that both of these will be part of every healthcare organization for many years to come. Unfortunately, most health IT project plans don’t account for ongoing management of these old but important data sources. This inattention often results in increased costs and risks associated with these legacy systems and paper charts.

It should be strategically important for every organization to have a sound governance plan for both legacy IT systems and paper charts. Ignorance is not bliss when one of these information sources is breached because your organization had “forgotten” about them.

The future of reimbursement, costs, quality of care, and liability in healthcare is going to be linked to an organization’s data. Making sure your data governance house is in order is going to be a major component in the success or failure of your organization. A good place to start is defining the legal health record, ensuring quality health data, and managing a part-paper, part-electronic record.

Join our Twitter Chat: “Healthcare IG Predictions & Perspectives”

On January 28th at 12:00 pm Eastern, @IronMtnHealth is hosting a Twitter chat using #InfoTalk to further the dialog. If you have been involved in governance-related projects, we’d love to have you join. What IG initiatives have shown success for you? How have you overcome any obstacles? What do you see as the future of IG? Keep the conversation going during our “Healthcare IG Predictions & Perspectives” #InfoTalk at 12pm Eastern on January 28th.

The Value of an Integrated Specialty EHR Approach

Posted on January 19, 2015 | Written By

John Lynn

As many of you know, I’ve long been an advocate for the specialty-specific EHR. There are tremendous advantages in having an EHR that’s focused only on your specialty. Then you don’t get things like child growth charts cluttering your EHR when you don’t see any children. Or, taken the other way, you get child growth charts that are designed specifically for a pediatrician. The same logic applies across pretty much every specialty.

The reason many organizations don’t go with a specialty-specific EHR is usually that they’re a large multi-specialty organization. These organizations don’t want to support 30 different EHR vendors. Therefore, their RFPs basically exclude specialty-specific EHR vendors from the selection process.

I understand from an IT support and EHR implementation perspective how having 30 different EHR implementations would be a major challenge. However, it’s also a challenge to get one EHR vendor to work for 30+ specialties. Plus, the long-term consequence is physician and other EHR user dissatisfaction with an EHR that wasn’t designed for their specialty. The real decision these organizations are making is whether they want to put the burden on the IT staff (i.e., supporting multiple EHRs) or on the doctors (i.e., using an EHR that doesn’t meet their needs). Large organizations seem to be choosing to put the burden on the doctors rather than the IT staff, although I don’t think many of them realize that this is the choice they’re making.

Specialty EHR vendor gMed recently put out a whitepaper that offers an analysis and a kind of case study on the differences between an integrated GI practice and a non-integrated GI practice. In this case, they’re talking about an EHR that’s integrated with an ambulatory surgery center and one that’s not. That’s a big deal for a specialty like GI. You can download the free whitepaper to get all the juicy details and differences between an integrated GI practice and one that’s not.

I’ve been seeing more and more doctors starting to talk about their displeasure with their EHR. I think much of that displeasure comes thanks to meaningful use and reimbursement requirements, but I also think that many are suffering under an EHR that really doesn’t understand their specialty. In my experience, when an EHR vendor claims to support every specialty, that usually amounts to one support rep for the specialty and a few months of programming sprints to try to provide something special for it. That’s very different from a whole team of developers and every customer support person at the company devoted to a specialty.

I’m not saying that an EHR can’t do more than one specialty, but doing 5 somewhat related specialties is still very different from trying to do the 40+ medical specialties with one interface. One challenge with the best-of-breed approach is that some specialties don’t have an EHR focused just on them. In that case, you may have to use the every-specialty EHR.

What’s clear to me is that most large multi-specialty organizations are choosing the all-in-one EHR systems for their offices. I wonder if force-feeding an EHR into a specialty where it doesn’t fit will eventually lead to a physician revolt back to specialty-specific EHRs. Physician dissatisfaction, liability issues, and improved interoperability could make the best-of-breed approach much more attractive even to large organizations, even if it means they back into best of breed after trying the one-size-fits-all approach to EHR.

I’ll be interested to watch this dynamic play out. Plus, you have specialty doctors coming together in mega groups to combat this as well. What do you think is going to happen with specialty EHR? Should organizations take a best-of-breed approach or go with the one-size-fits-all EHR? What are the consequences (good and bad) of either direction?

Full Disclosure: gMed is an advertiser on this site.

Congress Asks ONC to Decertify EHRs That Proactively Block Information Sharing

Posted on December 22, 2014 | Written By

John Lynn

A big thanks to A. Akhter, MD for pointing out the 2014 Omnibus Appropriations bill (word in Washington is that they’re calling it the CRomnibus bill), which asks ONC to address the interoperability challenges. HIMSS highlighted the two sections that apply to ONC and healthcare interoperability:

Office of the National Coordinator for Information Technology – Information Blocking.

“The Office of the National Coordinator for Information Technology (ONC) is urged to use its certification program judiciously in order to ensure certified electronic health record technology provides value to eligible hospitals, eligible providers and taxpayers. ONC should use its authority to certify only those products that clearly meet current meaningful use program standards and that do not block health information exchange. ONC should take steps to decertify products that proactively block the sharing of information because those practices frustrate congressional intent, devalue taxpayer investments in CEHRT, and make CEHRT less valuable and more burdensome for eligible hospitals and eligible providers to use. The Committee requests a detailed report from ONC no later than 90 days after enactment of this act regarding the extent of the information blocking problem, including an estimate of the number of vendors or eligible hospitals or providers who block information. This detailed report should also include a comprehensive strategy on how to address the information blocking issue.”

Office of the National Coordinator for Information Technology – Interoperability.

“The agreement directs the Health IT Policy Committee to submit a report to the House and Senate Committees on Appropriations and the appropriate authorizing committees no later than 12 months after enactment of this act regarding the challenges and barriers to interoperability. The report should cover the technical, operational and financial barriers to interoperability, the role of certification in advancing or hindering interoperability across various providers, as well as any other barriers identified by the Policy Committee.”

Everyone is talking about the first section which talks about taking “steps to decertify products that proactively block the sharing of information.” This could be a really big deal. Unfortunately, I don’t see how this will have any impact.

First, it would be really hard to prove that an EHR vendor is proactively blocking the information sharing required by EHR certification. I believe it will be pretty easy for an EHR vendor to show that they meet the EHR certification criteria and can exchange information using those standards. From what I understand, the bigger problem is that you can pass EHR certification using various flavors of the standard.

It seems to me that Congress should have focused on why the meaningful use requirements were so open-ended that they didn’t actually get us to a proper standard for interoperability. They kind of get at this with the comment to “certify only those products that clearly meet current meaningful use program standards.” However, if the MU standards aren’t good, then it doesn’t do any good to make sure that EHR vendors are meeting them.

Of course, I imagine ONC wasn’t ready to admit that the MU standard wasn’t sufficiently defined for quality interoperability. Hopefully this is what will be discovered in the second piece of direction ONC received.

I could be wrong, but I don’t think the problem is EHR vendors failing to meet the MU certification criteria for interoperability. Instead, I think the problem is that the MU certification criteria aren’t good enough to achieve simple interoperability between EHR systems.

If you think otherwise, I’d love to be proven wrong. Does this really give ONC some power to go after bad actors?

As an extension to this discussion, Carl Bergman has a great post on EMR and EHR which talks about what’s been removed from this bill. It seems that the Unique Patient Identifier gag rule has been removed.

HL7 Backs Effort To Boost Patient Data Exchange

Posted on December 8, 2014 | Written By

Katherine Rourke is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Standards group Health Level Seven has kicked off a new project intended to increase the adoption of tech standards designed to improve electronic patient data exchange. The initiative, the Argonaut Project, includes just five EMR vendors and four provider organizations, but it seems to have some interesting and substantial goals.

Participating vendors include Athenahealth, Cerner, Epic, McKesson and MEDITECH, while providers include Beth Israel Deaconess Medical Center, Intermountain Healthcare, Mayo Clinic and Partners HealthCare. In an interesting twist, the group also includes SMART, the federally-funded mobile app development project from Boston Children’s Hospital Informatics Program. (How often does mobile get a seat at the table when interoperability is being discussed?) Consulting firm the Advisory Board Company is also involved.

Unlike the activity around the much-bruited CommonWell Alliance, which still feels like vaporware to industry watchers like myself, this project seems to have a solid technical footing. On the recommendation of a group of science advisors known as JASON, the group is working on creating a public API to advance EMR interoperability.

The springboard for its efforts is HL7’s Fast Healthcare Interoperability Resources (FHIR). FHIR is a RESTful API, an approach which, the standards group notes, makes it easier to share data not only across traditional networks and EMR-sharing modular components, but also with mobile devices, web-based applications and cloud communications.
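
For readers who haven’t seen it, here is a small illustration of what “a RESTful API” means in practice for FHIR: resources are read and searched with ordinary HTTP calls and returned as JSON. The base URL and patient id below are placeholders, not a real server.

  # Illustrative FHIR REST calls; the server URL and ids are placeholders.
  import requests

  FHIR_BASE = "https://fhir.example-hospital.org"  # assumed FHIR server base URL

  # Read a single Patient resource by its logical id.
  patient = requests.get(f"{FHIR_BASE}/Patient/12345",
                         headers={"Accept": "application/json"}, timeout=10).json()
  print(patient.get("name"))

  # Search for that patient's laboratory observations using query parameters.
  labs = requests.get(f"{FHIR_BASE}/Observation",
                      params={"patient": "12345", "category": "laboratory"},
                      headers={"Accept": "application/json"}, timeout=10).json()
  print(labs.get("total"))

Because the interface is just HTTP and JSON, the same calls work equally well from a web app, a mobile app, or a cloud service, which is the property the Argonaut participants are betting on.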

According to JASON task force member David McCallie, Cerner’s president of medical informatics, the group has an intriguing goal. Members’ intent is to develop a health IT operating system analogous to those used by Apple and Android mobile devices. Once that is created, providers could use both built-in apps resident in the OS and others created by independent developers. While the devices a “health IT OS” would have to embrace would be far more diverse than those running Android or iOS, the concept is still a fascinating one.

It’s also neat to hear that the collective has committed itself to a fairly aggressive timeline, promising to accelerate current FHIR development to provide hands-on FHIR profiles and implementation guides to the healthcare world by spring of next year.

Lest I seem too critical of CommonWell, which has been soldiering along for quite some time now, it’s only fair to note that its goals are, if anything, even more ambitious than the Argonauts’. CommonWell hopes to accomplish nothing less than managing a single identity for every person/patient, locating the person’s records in the network and managing consent. And CommonWell member Cerner recently announced that it would provide CommonWell services to its clients for free until Jan. 1, 2018.

But as things stand, I’d wager that the Argonauts (I love that name!) will get more done, more quickly. I’m truly eager to see what emerges from their efforts.

Are You A Sitting Duck for HIPAA Data Breaches? – Infographic

Posted on November 18, 2014 | Written By

John Lynn

The people at DataMotion, a cloud-based HISP provider, sent me the following infographic covering HIPAA data breaches. It’s a good reminder of the potential for data breaches in healthcare. As Marc Probst recently suggested, we should be focusing as much attention on things like security as we are on meaningful use, since the penalties for a HIPAA violation are larger than the meaningful use penalties.

Are You A Sitting Duck for HIPAA Data Breaches Infographic

Healthcare Interoperability Series Outline

Posted on November 7, 2014 | Written By

John Lynn

Interoperability is one of the major priorities of ONC. Plus, I hear many doctors complaining that their EHR doesn’t live up to its potential because the EHR is not interoperable. I personally believe that healthcare would benefit immeasurably from interoperable healthcare records. The problem is that healthcare interoperability is a really hard nut to crack.

With that in mind, I’ve decided to do a series of blog posts highlighting some of the many challenges and issues with healthcare interoperability. Hopefully this will provide a deeper dive into what’s really happening with healthcare interoperability, what’s holding us back from interoperability and some ideas for how we can finally achieve interoperable healthcare records.

As I started thinking through the subject of healthcare interoperability, here are some of the topics, challenges, issues, and discussions that are worth including in the series:

  • Interoperability Benefits
  • Interoperability Risks
  • Unique Identifier (Patient Identification)
  • Data Standards
  • Government vs Vendor vs Healthcare Organization Efforts and Motivations
  • When Should You Share The Data and When Not?
  • Major Complexities (Minors, Mental Health, etc)
  • Business Model

I think this is a good start, but I’m pretty sure this list is not comprehensive. I’d love to hear from readers about other issues, topics, questions, discussion points, barriers, etc. to healthcare interoperability that I should include in this discussion. If you have insights into any of these topics, I’d love to hear them as well. Hopefully we can contribute to a real understanding of healthcare interoperability.

Karen DeSalvo and Jacob Reider Leave ONC

Posted on October 24, 2014 | Written By

John Lynn

UPDATE: It seems that DeSalvo will still be National Coordinator of Healthcare IT along with her new position.

It’s been a tumultuous few months for ONC, and it’s just gotten even more tumultuous. We previously reported on the departures of Doug Fridsma, MD, ONC’s Chief Science Officer; Joy Pritts, the first Chief Privacy Officer at ONC; Lygeia Ricciardi, Director of the Office of Consumer eHealth; and Judy Murphy, ONC’s Chief Nursing Officer (CNO). Yesterday, the news dropped that Karen DeSalvo, ONC’s National Coordinator, and Jacob Reider, ONC’s Deputy National Coordinator, are both leaving ONC as well.

Karen DeSalvo has been tapped by HHS Secretary Sylvia Mathews Burwell to replace Wanda K. Jones as assistant secretary for health, a role that oversees the surgeon general’s office, and she will be working on Ebola and other pressing health issues. I think DeSalvo’s letter to staff describes it well:

As you know, I have deep roots and a belief in public health and its critical value in assuring the health of everyone, not only in crisis, but every day, and I am honored to be asked to step in to serve.

DeSalvo has always been a major public health advocate, and that’s where her passion lies; her passion isn’t healthcare technology. So this change isn’t surprising, although it is a little surprising that it comes only 10 months into her time at ONC.

The obvious choice as Acting National Coordinator would have been Jacob Reider who was previously Acting National Coordinator when Farzad Mostashari left. However, Reider also announced his decision to leave ONC:

In light of the events that led to Karen’s announcement today–it’s appropriate now to be clear about my plans, as well. With Jon White and Andy Gettinger on board, and a search for a new Deputy National Coordinator well underway, I am pleased that much of this has now fallen into place–with only a few loose ends yet to be completed. I’ll remain at ONC until late November, working closely with Lisa as she assumes her role as Acting National Coordinator.

As Reider mentions, Lisa Lewis, who is currently ONC’s COO, will be serving as Acting National Coordinator at ONC.

What’s All This Mean?
There’s a lot of speculation as to why all of these departures are happening at ONC. Many people believe that ONC is a sinking ship and people are doing everything they can to get off the ship before it sinks completely. Others have suggested that these people see an opportunity to make a lot more money working for a company. The government certainly doesn’t pay market wages for the skills these people have. Plus, their connections and experience at ONC give them some unique qualifications that many companies are willing to pay to get. Some have suggested that the meaningful use work is mostly done and so these people want to move on to something new.

My guess is that it’s a mix of all of these things. It’s always hard to make broad generalizations about topics like this. For example, I already alluded to the fact that I think Karen DeSalvo saw an opportunity to move to a position that was more in line with her passions. Hard to fault someone for making that move. We’d all do the same.

What is really unclear is the future of ONC. They still have a few years of meaningful use to administer, including the EHR penalties, which could carry meaningful use forward even longer than that. I expect ONC will still have money to work on things like interoperability. We’ll see if ONC can see through the patient safety initiative they started, or if that will get shut down because it’s outside their jurisdiction.

Beyond those things, what’s the future of ONC?