
Learning Health Care System

Posted on March 27, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

In a recent post on EMR and EHR titled “Exploring the Role of Clinical Documentation: a Step Toward EHRs for Learning,” Andy Oram introduced me to the idea of what he called a Learning Health Care System. Here’s his description:

Currently a popular buzzword, a learning health care system collects data from clinicians, patients, and the general population to look for evidence and correlations that can improve the delivery of health care. The learning system can determine the prevalence of health disorders in an area, pick out which people are most at risk, find out how well treatments work, etc. It is often called a “closed loop system” because it can draw on information generated from within the system to change course quickly.

I really love the concept and description of a learning healthcare system. Unfortunately, I see very little of this in our current EHR technology, and that’s a travesty. However, it’s absolutely the way we need to head. Andy adds this insight into why we don’t yet have a learning health care system:

“Vendors need to improve the ability of systems to capture and manage structured data.” We need structured data for our learning health care system, and we can’t wait for natural language processing to evolve to the point where it can reliably extract the necessary elements of a document.

While I agree that managed structured data would be helpful in reaching the vision of a learning healthcare system, I don’t think we have to wait for that to happen. We can already use the data that’s available to make our EHRs smarter than they are today. Certainly we can’t do everything that we’d like to do with them, but we can do something. We shouldn’t do nothing just because we can’t do everything.
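As a sketch of what “doing something with the data we already have” could look like, here is a minimal, hypothetical example of learning from existing structured records. The field names and the records themselves are invented for illustration; no real EHR schema is assumed.

```python
# Hypothetical sketch: even before perfect structured-data capture, a system
# can "learn" from the structured fields it already stores, e.g. by
# aggregating treatment outcomes. Field names are illustrative only.
from collections import defaultdict

def treatment_success_rates(records):
    """Aggregate outcome rates per treatment from existing structured records."""
    totals = defaultdict(lambda: {"success": 0, "count": 0})
    for rec in records:
        stats = totals[rec["treatment"]]
        stats["count"] += 1
        if rec["outcome"] == "improved":
            stats["success"] += 1
    return {t: s["success"] / s["count"] for t, s in totals.items()}

records = [
    {"treatment": "drug_a", "outcome": "improved"},
    {"treatment": "drug_a", "outcome": "no_change"},
    {"treatment": "drug_b", "outcome": "improved"},
]
print(treatment_success_rates(records))  # {'drug_a': 0.5, 'drug_b': 1.0}
```

Nothing here requires natural language processing; it only requires that the structured data already being collected gets fed back into the system, which is the "closed loop" idea in miniature.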

Plus, I’ve written about this a number of times before, but we need to create a means for the healthcare system to learn and for healthcare systems to be able to easily share that learning. This might be a different definition of learning than what Andy described. I think he was referencing a learning system that learns about the patient. I’m taking it one step further: when a healthcare system learns something about technology or data, it should be able to easily share that learning with other outside healthcare systems. That would be powerful.

What are your thoughts on what Andy calls a popular buzzword: A Learning Health Care System? Are we heading that direction? What’s holding us back?

Finding Simple Healthcare IT Solutions to Annoying Problems

Posted on March 23, 2015 | Written By


In my recent video interview with Lindy Benton, CEO of MEA|NEA, I came away with the feeling that there are a wide variety of simple healthcare IT solutions for many of the problems that annoy us in healthcare. In Lindy’s case, they work on solving the secure document transfer problem in healthcare. They work mostly with claims remediation and other billing related documentation, but the secure document transfer applies to a lot of areas of healthcare.

As a tech person, I was interested in how rather simple technology can solve such an important problem. However, Lindy and I talk about why many organizations still haven’t adopted these technologies in their office (Spoiler: The divide between billing organizations and IT). We also talk about why EHR vendors aren’t just providing these types of secure document transfer solutions.

You can watch my full video interview with Lindy Benton below:

Recorded Video from Dell Healthcare Think Tank Event – #DoMoreHIT

Posted on March 20, 2015 | Written By


I mentioned that I was going to be on the Dell Healthcare Think Tank event again this year. It was my 3rd time participating and it didn’t disappoint. In fact, this one dove into a number of insurance topics which we hadn’t ever covered before. I really learned a lot from the discussions and hopefully others learned from me.

Plus, in the first session I had the privilege to sit next to Dr. Eric Topol. He’s got such great insights into what’s happening in healthcare. Of course, I’m also always amazed by Mandi Bishop, who many of you may know from Twitter or her Eyes Wide Shut series here on EMR and HIPAA.

In case you missed the live stream of the event, you can find each of the three recorded sessions below. I also posted the 3 drawings that were created during the event on EMR and EHR. I look forward to hearing your thoughts on what was shared. Thanks Dell for hosting the conversation that brought together so many perspectives from across healthcare.

Session 1: Consumer Engagement & Social Media

Session 2: Bridging the Gap Between Providers, Payers and Patients

Session 3: Entrepreneurship & Innovation

The Future Of…Healthcare Big Data

Posted on March 12, 2015 | Written By


This post is part of the #HIMSS15 Blog Carnival which explores “The Future of…” across 5 different healthcare IT topics.

In yesterday’s post about The Future of…The Connected Healthcare System, I talked a lot about healthcare data and the importance of that data. So, I won’t rehash those topics in this post. However, that post will serve as background for why I believe healthcare has no clue about what big data really is and what it will mean for patients.

Healthcare Big Data History
If we take a quick look back in the history of big data in healthcare, most people will think about the massive enterprise data warehouses that hospitals invested in over the years. Sadly, I say they were massive because the cost of the project was massive and not because the amount of data was massive. In most cases it was a significant amount of data, but it wasn’t overwhelming. The other massive part was the massive amount of work that was required to acquire and store the data in a usable format.

This is what most people think about when they think of big data in healthcare. A massive store of a healthcare system’s data that’s been taken from a variety of disparate systems and normalized into one enterprise data warehouse. The next question we should be asking is, “what were the results of this effort?”

The result of this effort is a massive data store of health information. You might say, “Fantastic! Now we can leverage this massive data store to improve patient health, lower costs, improve revenue, and make our healthcare organization great.” That’s a lovely idea, but unfortunately it’s far from the reality of most enterprise data warehouses in healthcare.

The reality is that the only outcome was the enterprise data warehouse. Most project plans didn’t include any sort of guiding framework on how the enterprise data warehouse would be used once it was in place. Most didn’t include budget for someone (let alone a team of people) to mine the data for key organization and patient insights. Nope. Their funding was just to roll out the data warehouse. Organizations therefore got what they paid for.

So many organizations (and there might be a few exceptions out there) thought that by having this new resource at their fingertips, their staff would somehow magically do the work required to find meaning in all that data. It’s a wonderful thought, but we all know that it doesn’t work that way. If you don’t plan and pay for something, it rarely happens.

Focused Data Efforts
Back in 2013, I wrote about a new trend towards what one company called Skinny Data. No doubt that was a reaction to many people’s poor experiences spending massive amounts of money on an enterprise data warehouse without any significant results. Healthcare executives had no doubt grown weary of the “big data” pitch and were shifting to only want to know what results the data could produce.

I believe this was a really healthy shift in the use of data in a healthcare organization. By focusing on the end result, you can do a focused analysis and aggregation of the right data to be able to produce high quality results for an organization. Plus, if done right, that focused analysis and aggregation of data can serve as the basis for other future projects that will use some of the same data.

We’re still deep in the heart of this smart, focused healthcare data experience. The reality is that healthcare can still benefit so much from small slices of data that we don’t need to go after the big data analysis. Talk about low hanging fruit. It’s everywhere in healthcare data.

The Future of Big Data
In the future, big data will matter in healthcare. However, we’re still laying the foundation for that work. Many healthcare organizations are laying a great foundation for using their data. Brick by brick (data slice by data slice if you will), the data is being brought together and will build something amazingly beautiful.

This house analogy is a great one. There are very few people in the world that can build an entire house by themselves. Instead, you need some architects, framers, plumbers, electricians, carpenters, roofers, painters, designers, gardeners, etc. Each one contributes their expertise to build something that’s amazing. If any one of them is missing, the end result isn’t as great. Imagine a house without a plumber.

The same is true for big data. In most healthcare organizations they’ve only employed the architect and possibly bought some raw materials. However, the real value of leveraging big data in healthcare is going to require dozens of people across an organization to share their expertise and build something that’s amazing. That will require a serious commitment and visionary leadership to achieve.

Plus, we can’t be afraid to share our expertise with other healthcare organizations. Imagine if you had to invent cement every time you built a house. That’s what we’re still doing with big data in healthcare. Every organization that starts digging into their data is having to reinvent things that have already been solved in other organizations.

I believe we’ll solve this problem. Healthcare organizations I know are happy to share their findings. However, we need to make it easy for them to share, easy for other organizations to consume, and provide appropriate compensation (financial and non-financial). This is not an easy problem to solve, but most things worth doing aren’t easy.

The future of big data in healthcare is extraordinary. As of today, we’ve barely scraped the surface. While many may consider this a disappointment, I consider it an amazing opportunity.

The Future Of…The Connected Healthcare System

Posted on March 11, 2015 | Written By


This post is part of the #HIMSS15 Blog Carnival which explores “The Future of…” across 5 different healthcare IT topics.

As I think about the future of a connected healthcare system, I get very excited. That excitement is partially tempered, though, by the realization that many of these connections could have been happening for a long time. A connected healthcare system is not a technological challenge, but a major cultural challenge for healthcare.

The Data Connected Healthcare System
Implementation challenges aside, the future of healthcare absolutely revolves around a connected healthcare system. In the short term these connections will focus on sharing the right data with the right person at the right time. Most of that data will be limited to data inside the EHR. What’s shocking is that we’re not doing this already. I guess we are doing this already, but in a really disconnected fashion (see Fax machine). That’s what’s so shocking. We already have the policies in place that allow us to share healthcare data with other providers. We’re sharing that data across fax machines all day every day. Over the next 3-5 years we’ll see a continuous flow of this data across other electronic channels (Direct Project, FHIR, HIEs, etc).
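To make concrete what those electronic channels exchange instead of a fax image, here is a simplified illustration of parsing a FHIR-style Patient resource. The JSON below is a pared-down sketch, not a complete or validated FHIR document.

```python
# Minimal sketch of what a FHIR-style exchange replaces the fax with:
# a structured JSON resource that any receiving system can parse directly.
# This Patient resource is simplified for illustration.
import json

patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1980-04-01"
}
"""

patient = json.loads(patient_json)
full_name = " ".join(patient["name"][0]["given"]) + " " + patient["name"][0]["family"]
print(patient["resourceType"], full_name, patient["birthDate"])
# Patient Jane Smith 1980-04-01
```

The point of the structure is that the receiving system can pull out the birth date or allergy list programmatically, where a faxed chart requires a human (or OCR) to re-enter the same data.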

More exciting to consider is the future integration of consumer health device data into the healthcare system. I’m certain I’ll see a number of stories talking about this integration at HIMSS already. These “pilot” integrations will set the groundwork for much wider adoption of external consumer health data. The key tipping point to watch for in this is when EHR vendors start accepting this data and presenting the data to doctors in a really intuitive way. This integration will absolutely change the game when it comes to connecting patient collected data with the healthcare system.

What seems even more clear to me is that we all still have a very myopic view of how much data we’re going to have available to us about a person’s health. In my above two examples I talk about the EHR patient record (basically physicians’ charts) and consumer health devices. In the latter example I’m pretty sure you’re translating that to the simple examples of health tracking we have today: steps, heart rate, weight, blood pressure, etc. While all of this data is important, I think it’s a shortsighted view of the explosion of patient data we’ll have at our fingertips.

I still remember when I first heard the concept of an IP Address on Every Organ in your body reporting back health data that we would have never dreamed imaginable. The creativity in sensors that are detecting anything and everything that’s happening in your blood, sweat and tears is absolutely remarkable. All of that data will need to be connected, processed, and addressed. How amazing will it be for the healthcare system to automatically schedule you for heart surgery that will prevent a heart attack before you even experience any symptoms?

Of course, we haven’t even talked about genomic data, which will be infiltrating the healthcare system as well. Genomic data used to take years to process. Now it’s being done in weeks at a price point that’s doable for many. Genomic medicine is going to become a standard for healthcare, and in some areas it already is.

The connected healthcare system will have to process more data than we can even imagine today. Good luck processing genomic data, sensor data, device data, and medical chart data using paper.

It’s All About Communication
While I’ve focused on connecting the data in the healthcare system of the future, that doesn’t downplay the need for better communication tools in the future connected healthcare system. Healthcare data can discover engagement points, but communication with patients will cause the change in our healthcare system.

Do you feel connected to your doctor today? My guess is that most of you would be like me and say no (Although, I’m working to change that culture for me and my family). The future connected healthcare system is going to have to change that culture if we want to improve healthcare and lower healthcare costs. Plus, every healthcare reimbursement model of the future focuses on this type of engagement.

The future connected healthcare system actually connects the doctor’s office and the patient to treat even the healthy patient. In fact, I won’t be surprised if we stop talking about going for a doctor’s visit and start talking about a health check up or some health maintenance. Plus, who says the health check up or maintenance has to be in the doctor’s office? It might very well be over a video chat, email, instant message, social media, or even text.

This might concern many. However, I’d describe this as healthcare integration into your life. We’ll have some stumbles along the way. We’ll have some integrations that dig too deeply into your life. We’ll have some times when we rely too heavily on the system and it fails us. Sometimes we’ll fail to show the right amount of empathy in the communication. Sometimes we’ll fail to give you the needed kick in the pants. Sometimes, we’ll make mistakes. However, over time we’ll calibrate the system to integrate seamlessly into your life and improve your health based on your personalized needs.

The future Connected Healthcare System is a data driven system which facilitates the right communication when and where it’s needed in a seamless fashion.

Top Ten Reasons for EHRs to Use Middleware for Connectivity

Posted on March 10, 2015 | Written By

The following is a guest blog post by Thanh Tran, CEO, Zoeticx, Inc.
Where should CIOs and IT professionals look to address EHR interoperability?  Middleware!

A middleware architecture has been shown to be the best technological solution for addressing the problem of EHR interoperability. The middleware platform facilitates the transparent, yet secure, access of patient health data, directly from the various databases where it is stored. A server-based middleware framework supporting access to the various patient health data stores allows for a scalable, unified and standardized platform for applications to be developed upon. The middleware architectural design has been successfully used to link data from multiple databases, irrespective to the database platform or where the database is located.
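To illustrate the architectural idea, here is a hedged sketch of a middleware layer that normalizes records from two vendor-specific stores into one canonical shape. The adapter classes, methods, and field names are all hypothetical; it shows the pattern, not any real product.

```python
# Sketch of the middleware pattern: a thin server-side layer that presents one
# interface over several vendor-specific patient data stores. All names and
# record shapes below are invented for illustration.

class VendorAStore:
    def fetch(self, patient_id):
        return {"pt_id": patient_id, "dob": "1980-04-01"}  # vendor A's shape

class VendorBStore:
    def lookup(self, pid):
        return {"patientIdentifier": pid, "birthDate": "1975-09-12"}  # vendor B's shape

class Middleware:
    """Normalizes each store's records into one canonical shape for apps."""
    def __init__(self):
        self.adapters = {
            "vendor_a": lambda pid: self._from_a(VendorAStore().fetch(pid)),
            "vendor_b": lambda pid: self._from_b(VendorBStore().lookup(pid)),
        }

    @staticmethod
    def _from_a(rec):
        return {"patient_id": rec["pt_id"], "birth_date": rec["dob"]}

    @staticmethod
    def _from_b(rec):
        return {"patient_id": rec["patientIdentifier"], "birth_date": rec["birthDate"]}

    def get_patient(self, source, patient_id):
        return self.adapters[source](patient_id)

mw = Middleware()
print(mw.get_patient("vendor_a", "123"))  # {'patient_id': '123', 'birth_date': '1980-04-01'}
print(mw.get_patient("vendor_b", "456"))  # {'patient_id': '456', 'birth_date': '1975-09-12'}
```

An application built on top of this layer sees one record shape regardless of which backend holds the data, which is exactly the scalability and vendor-independence argument the list below makes.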

Don’t take my word for it.  Here are ten good reasons to consider middleware.

  • Application Developers Can Focus on Healthcare Apps—Enables medical record app developers to focus on their healthcare solution by freeing them from dealing with a diverse, complex EHR infrastructure.
  • Inspires the Next Generation of Innovative Healthcare Solutions—Expands the market for the next generation of healthcare applications rather than tying them to a stack approach that depends on a particular EHR vendor.
  • Improves Patient Care Outcomes—Patients will receive better healthcare outcomes from the next generation of applications, which will address providers’ specific needs in diverse operational care environments.
  • Saves Healthcare IT Dollars—Focuses the healthcare IT budget on addressing providers’ needs instead of building and re-building the patient record infrastructure.
  • Proven Technology—A proven technology used for decades in many industries such as financial, retail, manufacturing and other markets.
  • Easy Integration—Enables healthcare integration with diverse, deployed legacy systems, including EHR systems. It addresses EHR interoperability as part of overall integration challenges.
  • Passive to Active Healthcare IT Environment—It turns passive healthcare IT environments into active ones to enhance communication and collaboration among care providers.
  • Avoids Data Duplication—Cost efficient, simplified administration. Offers a better privacy protection solution than HIEs by addressing EHR interoperability while fulfilling the demand to support the patient care continuum in an operational care environment.
  • Eliminates Wastefulness—Addressing healthcare IT integration is much more cost efficient than the “Rip-and-Replace” approach.
  • Extends EHR Usefulness—Protects and extends healthcare IT investments in EHR and EMR systems.

About Thanh Tran
Thanh Tran is CEO of Zoeticx, Inc., a medical software company located in San Jose, CA. He is a 20 year veteran of Silicon Valley’s IT industry and has held executive positions at many leading software companies.

6 Healthcare Interoperability Myths

Posted on February 9, 2015 | Written By


With my new fascination with healthcare interoperability, I’m drawn to anything and everything which looks at the successes and challenges associated with it. So, it was no surprise that I was intrigued by this whitepaper that looks at the 6 Healthcare Interoperability Myths.

For those who don’t want to download the whitepaper for all the nitty gritty details, here are the 6 myths:

  1. One Size Fits All
  2. There Is One Standard to Live By
  3. I Can Only “Talk” to Providers on the Same EHR as Mine
  4. If I Give Up Control of My Data, I’ll Lose Patients
  5. Hospitals Lead in Interoperability
  6. Interoperability Doesn’t Really “Do” Anything. It’s Just a Fad like HMOs in the 90s

You can read the whole whitepaper if you want to read all the details about each myth.

The first two hit home to me and remind me of my post about achieving continuous healthcare interoperability. I really think that the idea of every health IT vendor “interpreting” the standard differently is an important concept that needs to be dealt with if we want to see healthcare interoperability happen.

Another concept I’ve been chewing on is whether everyone believes that healthcare interoperability is the right path forward. The above mentioned whitepaper starts off with a strong statement that, “It’s no tall tale. Yes. We need interoperability.” While this is something I believe strongly, I’m not sure that everyone in healthcare agrees.

I’d love to hear your thoughts. Do we all want healthcare interoperability, or are there a lot of people out there who aren’t sure if healthcare interoperability is the right way forward?

How Do We Achieve Continuous Healthcare Interoperability?

Posted on February 2, 2015 | Written By


Today I had a really interesting chat about healthcare interoperability with Mario Hyland, Founder of AEGIS. I’m looking at a number of ways that Mario and I can work together to move healthcare interoperability forward. We’ll probably start with a video hangout with Mario and then expand from there.

Until then, I was struck by something Mario said in our conversation: “Healthcare interoperability is not a point in time. You can be interoperable today and then not be tomorrow.”

This really resonated with me and no doubt resonates with doctors and hospitals who have an interface with some other medical organization. You know how easy it is for your interface to break. It’s never intentional, but these software systems are so large and complex that someone will make a change and not realize the impact that change will have across all your connections. As I wrote on Hospital EMR and EHR, API’s are Hard!

Currently we don’t even have a bunch of complex APIs with hundreds of companies connecting to the EHR. We’re lucky if an EHR has a lab interface, ePrescribing, maybe a radiology interface, and maybe a connection to a local hospital. Now imagine the issues that crop up when you’re connecting to hundreds of companies and systems. Mario was right when he told me, “Healthcare thinks we’re dealing with the complex challenges of healthcare interoperability. Healthcare doesn’t know the interoperability challenges that still wait for them and they’re so much more complex than what we’re dealing with today.”

I don’t say this to discourage anyone, but it should encourage us to be really thoughtful about how we handle healthcare interoperability so it can scale up. The title of this post asks a tough question that isn’t being solved by our current one-time approach to certification. How do we achieve continuous healthcare interoperability that won’t break on the next upgrade cycle?

I asked Mario why the current EHR certification process hasn’t been able to solve this problem, and he said that current EHR certification is more of a one-time visual inspection of interoperability. Unfortunately it doesn’t include a single testing platform that really tests an EHR against a specific interoperability standard, let alone ongoing tests to make sure that any changes to the EHR don’t affect future interoperability.

I’ve long chewed on why it is that we can have a “standard” for interoperability, but unfortunately that doesn’t mean that EHR systems can actually interoperate. I’ve heard people tell me that there are flavors of the standard and each organization has a different flavor. I’ve seen this, but what I’ve found is that there are different interpretations of the same standard. When you dig into the details of any standard, you can see how it’s easy for an organization to interpret a standard multiple ways.

In my post API’s are Hard, the article that is linked talks about the written promise and the behavioral promise of an API. The same thing applies to a healthcare interoperability standard. There’s the documented standard (written promise), and then there’s the way the EHR implements the standard (behavioral promise).

In the API world, one company creates the API, so you have one behavioral promise to those who use it. Even with one company, tracking the behavioral promise can be a challenge. In the EHR world, each EHR vendor has implemented interoperability according to their own interpretation of the standard, so there are 300+ behavioral promises that have to be tracked and considered. One from each company, and heaven help us if and when a company changes that behavioral promise. It’s impossible to keep up with and explains one reason why healthcare interoperability isn’t a reality today.

What’s the solution? One solution is to create a set of standard test scripts that can be tested against by any EHR vendor on an ongoing basis. This way any EHR vendor can test the interoperability functionality of their application throughout their development cycle. Ideally these test scripts would be offered in an open source manner which would allow multiple contributors to continue to provide feedback and improve the test scripts as errors in the test scripts are found. Yes, it’s just the nature of any standard and testing of that standard that exceptions and errors will be found that need to be addressed and improved.
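As a toy illustration of such a test script, here is a sketch of an executable conformance check a vendor could run on every build. The required-field list and message shape are invented for illustration; a real suite would encode the actual standard.

```python
# Illustrative sketch of a shared conformance test: it checks a vendor's
# exported message against the written standard, so it can run on every
# build rather than once at certification. The "standard" here is invented.

REQUIRED_FIELDS = {"patient_id", "sent_at", "allergies", "medications"}

def conformance_errors(message: dict) -> list:
    """Return a list of violations of the (hypothetical) standard."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - message.keys())]
    if "allergies" in message and not isinstance(message["allergies"], list):
        errors.append("allergies must be a list, even when empty")
    return errors

# A vendor whose "interpretation" sends allergies as a comma-separated string
# might pass a one-time visual inspection but fails the executable test:
vendor_message = {"patient_id": "123", "sent_at": "2015-02-02", "allergies": "peanuts"}
print(conformance_errors(vendor_message))
# ['missing field: medications', 'allergies must be a list, even when empty']
```

Because the test is code rather than a checklist, two vendors can no longer “interpret” the standard differently and both pass; the behavioral promise gets pinned down by the test suite, and regressions surface on the next run instead of at the next broken interface.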

I mentioned that I was really interested in diving deeper into healthcare interoperability. I still have a lot deeper to go, but consider this the first toe dip into the healthcare interoperability waters. I really want to see this problem solved.

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, an estimated 8 million people die from sepsis each year, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It leverages EHR data, automated surveillance, clinical content, and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate the specific needs of each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, that improve sensitivity and specificity and reduce alert fatigue when assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities mimicking sepsis, helping to ensure that alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly sensitive and highly specific in order to minimize alert fatigue. In this system, 95% sensitivity and specificity have been achieved by constructing hundreds of variations of the sepsis rules. For example, completely different rules run for patients with liver disease than for those with end-stage renal disease, ensuring clinicians get only alerts that are helpful.
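POC Advisor’s actual rules are proprietary, but the idea of comorbidity-aware rule variations can be sketched in a few lines. Everything below is a hypothetical illustration: the thresholds, field names, and the ESRD example are assumptions, not the product’s logic.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    temp_f: float      # temperature, degrees Fahrenheit
    heart_rate: int    # beats per minute
    wbc: float         # white blood cell count, 10^3/uL
    creatinine: float  # serum creatinine, mg/dL

def base_sirs_flags(v: Vitals) -> int:
    """Count generic SIRS-style screening criteria met (illustrative thresholds)."""
    flags = 0
    if v.temp_f > 100.4 or v.temp_f < 96.8:
        flags += 1
    if v.heart_rate > 90:
        flags += 1
    if v.wbc > 12.0 or v.wbc < 4.0:
        flags += 1
    return flags

def sepsis_alert(v: Vitals, comorbidities: set) -> bool:
    """Fire an alert only when the abnormality is not explained by a comorbidity."""
    flags = base_sirs_flags(v)
    # A patient with end-stage renal disease often has chronically elevated
    # creatinine, so do not count it as new organ dysfunction for that patient.
    organ_dysfunction = v.creatinine > 2.0 and "esrd" not in comorbidities
    return flags >= 2 and organ_dysfunction
```

The same vitals can thus produce an alert for one patient and silence for another, which is how condition-specific rule sets keep specificity high without losing sensitivity.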

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor in two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform: lab results, pharmacy orders, admit/discharge/transfer (ADT) messages, and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to generate the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alert based on their clinical training. The system prompted the nursing staff to respond to each alert, either by acknowledging or overriding it. If an alert was acknowledged, the system suggested next steps based on the facility’s specific protocols, such as notifying the physician or ordering a diagnostic lactate test. If an alert was overridden, a reason had to be entered; all responses were logged, monitored and reported. If no action was taken, a repeat alert fired, typically within 10 minutes. If repeat alerts went unaddressed, they were escalated to supervising personnel.
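The acknowledge/override/escalate flow described above amounts to a small state machine. The sketch below is a rough model of that flow, not POC Advisor’s implementation; the field names, suggested next steps, and repeat limit are all illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

REPEAT_LIMIT = 1  # re-fire once before escalating (illustrative value)

@dataclass
class Alert:
    patient_id: str
    status: str = "pending"  # pending -> acknowledged | overridden
    override_reason: Optional[str] = None
    repeats: int = 0
    escalated: bool = False

def acknowledge(alert: Alert) -> list:
    """Nurse agrees with the alert; return protocol-driven next steps."""
    alert.status = "acknowledged"
    # Hypothetical next steps modeled on the article's description.
    return ["notify physician", "order lactate test"]

def override(alert: Alert, reason: str) -> None:
    """Overriding an alert requires a logged reason."""
    if not reason:
        raise ValueError("an override reason must be entered")
    alert.status = "overridden"
    alert.override_reason = reason

def tick(alert: Alert) -> str:
    """Called when the response timer (e.g. 10 minutes) expires."""
    if alert.status != "pending":
        return "resolved"
    if alert.repeats < REPEAT_LIMIT:
        alert.repeats += 1
        return "repeat alert fired"
    alert.escalated = True
    return "escalated to supervisor"
```

Forcing every path to end in a logged acknowledgement, a reasoned override, or an escalation is what makes the workflow auditable after the fact.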

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed-upon protocols.

Within three months, John Muir saw significant improvement in key sepsis compliance metrics: 80% compliance with patient screening protocols, lactate tests ordered for 90% of patients who met screening criteria, and early goal-directed therapy initiated for 75% of patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

Defining the Legal Health Record, Ensuring Quality Health Data, and Managing a Part-Paper Part-Electronic Record – Healthcare Information Governance

Posted on January 20, 2015 | Written By John Lynn

This post is part of Iron Mountain’s Healthcare Information Governance: Big Picture Predictions and Perspectives Series which looks at the key trends impacting Healthcare Information Governance. Be sure to check out all the entries in this series.

Healthcare information governance (IG) has been important ever since doctors started tracking their patients in paper charts. However, over the past few years, adoption of EHR and other healthcare IT systems has exploded and provided a myriad of new opportunities and challenges associated with governance of a healthcare organization’s information.

Three of the most important health information governance challenges are:
1. Defining the legal health record
2. Ensuring quality health data
3. Managing a part-paper, part-electronic record

Defining the Legal Health Record
In the paper chart world, defining the legal health record was much easier. As we’ve shifted to an electronic world, the volume of data that’s stored in these electronic systems is so much greater. This has created a major need to define what your organization considers the legal health record.

The reality is that each organization now has to define its own legal health record based on CMS and accreditation guidelines, as well as the specifics of its operation (state laws, EHR options, number of health IT systems, etc.). The legal health record will be only a subset of the data stored by an EHR or other IT system, and you’ll need to involve a wide group of people from your organization to define it.

Doing so is going to become increasingly important. Without a clearly defined legal health record, your releases of information will be inconsistent. That inconsistency can create major liability in court cases, and consistency matters just as much when releasing health information to other doctors or even auditors.

One challenge we face in this regard is ensuring that EHR vendors provide consistent and usable data output. A lot of thought has been put into how data is entered into an EHR, but far less into how an EHR outputs that data. This is a major health information governance challenge that needs to be addressed. Similarly, most EHR vendors haven’t put much thought and effort into data retention. Retention policies are an important part of defining your legal health record, but your policy is subject to the capabilities of the EHR.

Working with your EHR and other healthcare IT vendors to ensure they can produce a consistent legal health record is one strategic imperative that every healthcare organization should have on their list.

Ensuring Quality Health Data
The future of healthcare is very much going to be data driven. Payments to accountable care organizations (ACOs) are going to depend on data. The quality of care you provide using clinical decision support (CDS) systems is going to rely on the quality of the data being used. Organizations are going to face new liability concerns that revolve around their data quality. Real-time data interoperability is going to become a reality, and everyone’s going to see everyone else’s data without a middleman first checking and verifying its quality before it’s sent.
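Once data flows between organizations without a human middleman, automated quality checks become the gatekeeper. As a minimal sketch of the kind of validation a data-quality program might run before a record is used or exchanged (the field names and plausibility ranges here are assumptions for illustration, not any standard):

```python
def validate_record(record: dict) -> list:
    """Return a list of data-quality problems found in a patient record.

    Required fields and plausibility ranges are illustrative only.
    """
    problems = []
    # Completeness: required identifiers must be present and non-empty.
    for name in ("patient_id", "dob", "encounter_date"):
        if not record.get(name):
            problems.append("missing required field: " + name)
    # Plausibility: flag values outside a believable physiological range.
    hr = record.get("heart_rate")
    if hr is not None and not (20 <= hr <= 300):
        problems.append("implausible heart_rate: " + str(hr))
    return problems
```

A CDI program builds on checks like these with human review, but even this simple completeness-and-plausibility pass catches the errors most likely to mislead a downstream CDS system.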

A great health information governance program led by a clinical documentation improvement (CDI) program is going to be a key first step for every organization. Quality data doesn’t happen overnight; it requires a concerted effort over time. Organizations need to start now if they want to be successful in the coming data-driven healthcare world.

Managing a Part-Paper Part-Electronic Record
The health information world is becoming infinitely more complex. Not only do organizations have new electronic systems that store massive amounts of data, they must still maintain legacy systems and old paper charts. Each of these requires time and attention to manage properly.

While we’d all love to just turn off legacy systems and dispose of old paper charts, data retention laws often mean that both of these will be part of every healthcare organization for many years to come. Unfortunately, most health IT project plans don’t account for ongoing management of these old but important data sources. This inattention often results in increased costs and risks associated with these legacy systems and paper charts.

It should be strategically important for every organization to have a sound governance plan for both legacy IT systems and paper charts. Ignorance is not bliss when one of these information sources is breached because your organization had “forgotten” about them.

The future of reimbursement, costs, quality of care, and liability in healthcare are all going to be linked to an organization’s data. Making sure your data governance house is in order is going to be a major component in the success or failure of your organization. A good place to start is defining the legal health record, ensuring quality health data, and managing a part-paper part-electronic record.

Join our Twitter Chat: “Healthcare IG Predictions & Perspectives”

On January 28th at 12:00 pm Eastern, @IronMtnHealth is hosting a Twitter chat using #InfoTalk to further the dialog. If you have been involved in governance-related projects, we’d love to have you join. What IG initiatives have shown success for you? How have you overcome any obstacles? What do you see as the future of IG? Keep the conversation going during our “Healthcare IG Predictions & Perspectives” #InfoTalk at 12pm Eastern on January 28th.