
Can Machine Learning Tame Healthcare’s Big Data?

Posted on September 20, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Big data is both a blessing and a curse. The blessing is that if we use it well, it will tell us important things we don’t know about patient care processes, clinical improvement, outcomes and more. The curse is that if we don’t use it, we’ve got a very expensive and labor-hungry boondoggle on our hands.

But there may be hope for progress. One article I read today suggests that another technology may hold the key to unlocking these blessings — that machine learning may be the tool which lets us harvest the big data fields. The piece, whose writer, oddly enough, was cited only as “Mauricio,” lead cloud expert at Cloudwards.net, argues that machine learning is “the most effective way to excavate buried patterns in the chunks of unstructured data.” While I am an HIT observer rather than techie, what limited tech knowledge I possess suggests that machine learning is going to play an important role in the future of taming big data in healthcare.

In the piece, Mauricio notes that big data is characterized by high volume (both structured and unstructured data), high velocity (data flowing into databases every working second), variety (everything from text and email to audio and financial transactions), complexity (data arriving from multiple incompatible sources) and variability in data flow rates.

Though his is a general analysis, I’m sure we can agree that healthcare big data specifically matches his description. I don’t know whether those of you reading this include wild cards like social media content or video in your big data repositories, but even if you don’t, you may well in the future.

Anyway, for the purposes of this discussion, let’s summarize by saying that in this context, big data isn’t just made of giant repositories of relatively normalized data, it’s a whirlwind of structured and unstructured data in a huge number of formats, flooding into databases in spurts, trickles and floods around the clock.

To Mauricio, an obvious choice for extracting value from this chaos is machine learning, which he defines as a data analysis method that automates the building of analytical models. In machine learning, systems adapt on their own with little human intervention, automatically applying customized algorithms and mathematical calculations to big data. “Machine learning offers a deeper insight into collected data and allows the computers to find hidden patterns which human analysts are bound to miss,” he writes.

According to the author, there are already machine learning models in place which help predict the appearance of genetically-influenced diseases such as diabetes and heart disease. Other possibilities for machine learning in healthcare – which he doesn’t mention but are referenced elsewhere – include getting a handle on population health. After all, an iterative learning technology could be a great choice for making predictions about population trends. You can probably think of several other possibilities.
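
To make that concrete, here is a minimal sketch of the kind of risk model he’s describing. It is my own illustration, not Mauricio’s model or any production system: the features, training data and patient values are invented, and the point is only the shape of the workflow, historical records in, a fitted model out, a risk score for new patients.

```python
# Minimal, hypothetical sketch of a disease-risk classifier.
# Features and training data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [age, BMI, fasting glucose, family history (0/1)]
X = np.array([
    [45, 31.0, 130, 1],
    [52, 28.5, 110, 0],
    [33, 24.0,  95, 0],
    [61, 35.2, 145, 1],
    [29, 22.1,  88, 0],
    [57, 30.4, 125, 1],
])
y = np.array([1, 0, 0, 1, 0, 1])  # 1 = patient later developed diabetes

model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimated probability that a new (hypothetical) patient develops the disease
new_patient = np.array([[50, 29.8, 120, 1]])
print(model.predict_proba(new_patient)[0][1])
```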

Now, like many other industries, healthcare suffers from a data silo problem, and we’ll have to address that issue before we create the kind of multi-source, multi-format data pool that Mauricio envisions. Leveraging big data effectively will also require people to cooperate across departmental and even organizational boundaries, as John Lynn noted in a post from last year.

Even so, it’s good to identify tools and models that can help get the technical work done, and machine learning seems promising. Have any of you experimented with it?

OCHIN Shows That Messy Data Should Not Hold Back Health Care

Posted on September 12, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The health care industry loves to complain about patient data. It’s full of errors, which can be equally the fault of patients or staff. And hanging over the whole system is lack of interoperability, which hampers research.

Well, it’s not as if the rest of the universe is a pristine source of well-formed statistics. Every field has to deal with messy data. And somehow retailers, financial managers, and even political campaign staff manage to extract useful information from the data soup. This doesn’t mean that predictions are infallible–after all, when I check a news site about the Mideast conflicts, why does the publisher think I’m interested in celebs from ten years ago whose bodies look awful now? But there is still no doubt that messy data can transform industry.

I’m all for standards and for more reliable means of collecting and vetting patient data. But for the foreseeable future, health care institutions are going to have to deal with suboptimal data. And OCHIN is one of the companies that shows how it can be done.

I recently had a chance to talk with CEO Abby Sears and Clayton Gillett, Vice President of Data Services and Integration, and to see a demo of OCHIN’s analytical tool, Acuere. Their basic offering is a no-nonsense interface that lets clinicians and administrators do predictions and hot-spotting.

Acuere is part of a trend in health care analytics that goes beyond clinical decision support and marshals large amounts of data to help with planning (see an example screen in Figure 1). For instance, a doctor can rank her patients by the number of alerts the system generates (a patient with diabetes whose glucose is getting out of control, or a smoker who hasn’t received counseling for smoking cessation). An administrator can rank a doctor against others in the practice. This summary just gives a flavor of the many services Acuere can perform; my real thrust in this article is to talk about how OCHIN obtains and processes its data. Sears and Gillett talked about the following challenges and how they’re dealing with them.

Figure 1. Acuere Provider Report Card

Patient identification
Difficulties in identifying patients and matching their records have repeatedly surfaced as the biggest barrier to information exchange and use in the US health care system. A 2014 ONC report cites it as a major problem (on pages 13 and 20). An article I cited earlier also blames patient identification for many of the problems of health care analytics. But the American public and Congress have been hostile to unique identifiers for some time, so health care institutions just have to get by without them.

OCHIN handles patient matching as other institutions, such as Health Information Exchanges, do. They compare numerous fields of records–not just obvious identifiers such as name and social security number, but address, demographic information, and perhaps a dozen other things. Sears and Gillett said it’s also hard to know which patients to attribute to each health care provider.
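
For readers curious what “comparing numerous fields” can look like in code, here is a toy sketch of weighted record matching. It is my own illustration of the general technique, not OCHIN’s algorithm; the fields, weights and threshold are invented, and real systems tune them against known matched pairs.

```python
# Toy sketch of weighted patient matching across several record fields.
# Fields, weights and threshold are invented for illustration only.
from difflib import SequenceMatcher

def similarity(a, b):
    """Rough string similarity in [0, 1]."""
    if not a or not b:
        return 0.0
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

WEIGHTS = {"name": 0.35, "dob": 0.30, "address": 0.20, "phone": 0.15}

def match_score(rec_a, rec_b):
    score = WEIGHTS["name"] * similarity(rec_a["name"], rec_b["name"])
    score += WEIGHTS["dob"] * (1.0 if rec_a["dob"] == rec_b["dob"] else 0.0)
    score += WEIGHTS["address"] * similarity(rec_a["address"], rec_b["address"])
    score += WEIGHTS["phone"] * (1.0 if rec_a["phone"] == rec_b["phone"] else 0.0)
    return score

a = {"name": "Jon Smith", "dob": "1970-03-02", "address": "12 Oak St", "phone": "555-0101"}
b = {"name": "John Smith", "dob": "1970-03-02", "address": "12 Oak Street", "phone": "555-0101"}

score = match_score(a, b)
print(score)         # high score despite the name and address variations
print(score > 0.85)  # above a tuned threshold, treat as the same patient
```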

Data sources
The recent Precision Medicine Initiative seeks to build “a national research cohort of one million or more U.S. participants.” But OCHIN already has a database on 7.6 million people and has signed more contracts to reach 10 million this fall. Certainly, there will be advantages to the Precision Medicine database. First, it will contain genetic information, which OCHIN’s data suppliers don’t have. Second, all the information on each person will be integrated, whereas OCHIN has to take de-identified records from many different suppliers and try to integrate them using the techniques described in the previous section, plus check for differences and errors in order to produce clean data.

Nevertheless, OCHIN’s data is impressive, and it took a lot of effort to accumulate it. They get not only medical data but information about the patient’s behavior and environment. Along with 200 different vital signs, they can map the patient’s location to elements of the neighborhood, such as income levels and whether healthy food is sold in local stores.

They get Medicare data from qualified entities who were granted access to it by CMS, Medicaid data from the states, patient data from commercial payers, and even data on the uninsured (a population that is luckily shrinking) from providers who treat them. Each institution exports data in a different way.

How do they harmonize the data from these different sources? Sears and Gillett said it takes a lot of manual translation. Data is divided into seven areas, such as medications and lab results. OCHIN uses standards whenever possible and participates in groups that set standards. There are still labs that don’t use LOINC codes to report results, as well as pharmacies and doctors who don’t use RxNorm for medications. Even ICD-10 changes yearly, as codes come and go.
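
As a rough picture of what that manual translation produces, here is a sketch of a local-code-to-LOINC crosswalk for a single lab feed. The local codes are invented for illustration; only the general idea of normalizing each supplier’s feed to standard vocabularies reflects what Sears and Gillett describe.

```python
# Hypothetical crosswalk from one supplier's local lab codes to LOINC.
# The local codes are invented; the LOINC codes are real published codes
# for HbA1c and serum glucose, used here only as examples.
LOCAL_TO_LOINC = {
    "LAB_HBA1C": "4548-4",   # Hemoglobin A1c/Hemoglobin.total in Blood
    "LAB_GLU":   "2345-7",   # Glucose [Mass/volume] in Serum or Plasma
}

def normalize_lab_result(raw):
    """Translate one supplier's lab record into a harmonized form."""
    loinc = LOCAL_TO_LOINC.get(raw["local_code"])
    if loinc is None:
        # Unmapped codes get queued for manual review, much as the
        # article describes.
        return {"status": "needs_manual_mapping", "raw": raw}
    return {"status": "ok", "loinc": loinc, "value": raw["value"], "units": raw["units"]}

print(normalize_lab_result({"local_code": "LAB_HBA1C", "value": 7.2, "units": "%"}))
print(normalize_lab_result({"local_code": "LAB_XYZ", "value": 10, "units": "mg/dL"}))
```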

Data handling
OCHIN isn’t like a public health agency that may be happy sharing data 18 months after it’s collected (as I was told at a conference). OCHIN wants physicians and their institutions to have the latest data on patients, so they carry out millions of transactions each day to keep their database updated as soon as data comes in. Their analytics run multiple times every day, to provide the fast results that users get from queries.

They are also exploring the popular “big data” forms of analytics that are sweeping other industries: machine learning, using feedback to improve algorithms, and so on. Currently, the guidance they offer clinicians is based on traditional clinical recommendations from randomized trials. But they are seeking to expand those sources with other insights from light-weight methods of data analysis.

So data can be useful in health care. Modern analytics should be available to every clinician. After all, OCHIN has made it work. And they don’t even serve up ads for chronic indigestion or 24-hour asthma relief.

Improving Clinical Workflow Can Boost Health IT Quality

Posted on August 18, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

At this point, the great majority of providers have made very substantial investments in EMRs and ancillary systems. Now, many are struggling to squeeze the most value out of those investments, and they’re not sure how to attack the problem.

However, according to at least one piece of research, there are a couple of approaches that are likely to pan out. According to a new survey by the American Society for Quality (ASQ), most healthcare quality experts believe that improving clinical workflow and supporting patients online can make a big difference.

As ASQ noted, providers are spending massive amounts of cash on IT, with the North American healthcare IT market forecast to hit $31.3 billion by 2017, up from $21.9 billion in 2012. But healthcare organizations are struggling to realize a return on their spending. The study data, however, suggests that providers may be able to make progress by looking at internal issues.

Researchers who conducted the survey, an online poll of about 170 ASQ members, found that 78% of respondents cited improving workflow efficiency as the top way for healthcare organizations to improve the quality of their technology implementations. Meanwhile, 71% said that providers can strengthen their health IT use by nurturing strong leaders who champion new HIT initiatives.

Survey participants also listed a handful of evolving health IT options which could have the most impact on patient experience and care coordination, including:

  • Incorporation of wearables, remote patient monitoring and caregiver collaboration tools (71%)
  • Leveraging smartphones, tablets and apps (69%)
  • Putting online tools in place that touch every step of patient processes like registration and payment (69%)

Despite their promise, there are a number of hurdles healthcare organizations must get over to implement new processes (such as better workflows) or new technologies. According to ASQ, these include:

  • Physician and staff resistance to change due to concerns about the impact on time and workflow, or unwillingness to learn new skills (70%)
  • High cost of rolling out IT infrastructure and services, and unproven ROI (64%)
  • Concerns that integrating complex new devices could lead to poor interfaces between multiple technologies, or that haphazard rollouts of new devices could cause patient errors (61%)

But if providers can get past these issues, there are several types of health IT that can boost ROI or cut cost, the ASQ respondents said. According to these participants, the following HIT tools can have the biggest impact:

  • Remote patient monitoring can cut down on the need for office visits, while improving patient outcomes (69%)
  • Patient engagement platforms that encourage patients to get more involved in the long-term management of their own health conditions (68%)
  • EMRs/EHRs that eliminate the need to perform some time-consuming tasks (68%)

Perhaps the most interesting part of the survey report outlined specific strategies to strengthen health IT use recommended by respondents, such as:

  • Embedding a quality expert in every department to learn user needs before deciding what IT tools to implement. This gives users a sense of investment in any changes made.
  • Improving available software with easier navigation, better organization of medical record types, more use of FTP servers for convenience, the ability to upload records to requesting facilities and a universal notification system offering updates on medical record status
  • Creating healthcare apps for professional use, such as medication calculators, med reconciliation tools and easy-to-use mobile apps which offer access to clinical pathways

Of course, most readers of this blog already know about these options, and if they’re not currently taking this advice they’re probably thinking about it. Heck, some of this should already be old hat – FTP servers? But it’s still good to be reminded that progress in boosting the value of health IT investments may be within reach. (To get some here-and-now advice on redesigning EMR workflow, check out this excellent piece by Chuck Webster – he gets it!)

ONC Announces Winners Of FHIR App Challenge

Posted on August 3, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

The ONC has announced the first wave of winners of two app challenges, both of which called for competitors to use FHIR standards and open APIs.

As I’ve noted previously, I’m skeptical that market forces can solve our industry’s broad interoperability problems, even if they’re supported and channeled by a neutral intermediary like ONC. But there’s little doubt that FHIR has the potential to provide some of the benefits of interoperability, as we’ll see below.

Winners of Phase 1 of the agency’s Consumer Health Data Aggregator Challenge, each of whom will receive a $15,000 award, included the following:

  • Green Circle Health’s platform is designed to provide a comprehensive family health dashboard covering the Common Clinical Data Set, using FHIR to transfer patient information. This app will also integrate patient-generated health data from connected devices such as wearables and sensors.
  • The Prevvy Family Health Assistant by HealthCentrix offers tools for managing a family’s health and wellness, as well as targeted data exchange. Prevvy uses both FHIR and Direct messaging with EMRs certified for Meaningful Use Stage 2.
  • Medyear’s mobile app uses FHIR to merge patient records from multiple sources, making them accessible through a single interface. It displays real-time EMR updates via a social media-style feed, as well as functions intended to make it simple to message or call clinicians.
  • The Locket app by MetroStar Systems pulls patient data from different EMRs together onto a single mobile device. Other Locket capabilities include paper-free check in and appointment scheduling and reminders.
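
All four winners lean on FHIR’s RESTful API to pull records out of EMRs. As a rough illustration of the plumbing involved (this is not any winner’s actual code, and the server URLs and patient identifier are hypothetical), aggregating one patient’s observations from two FHIR servers can look something like this:

```python
# Hypothetical sketch of pulling one patient's data from two FHIR servers
# and merging it, roughly what a consumer aggregator app does.
# The base URLs and patient identifier are invented.
import requests

FHIR_SERVERS = [
    "https://emr-one.example.com/fhir",
    "https://emr-two.example.com/fhir",
]
PATIENT_ID = "12345"  # hypothetical; real apps resolve this per server

def fetch_observations(base_url, patient_id):
    """Fetch a patient's Observation resources as a FHIR Bundle."""
    resp = requests.get(
        f"{base_url}/Observation",
        params={"patient": patient_id},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

merged = []
for server in FHIR_SERVERS:
    merged.extend(fetch_observations(server, PATIENT_ID))

print(f"Aggregated {len(merged)} observations from {len(FHIR_SERVERS)} EMRs")
```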

ONC also announced winners of the Provider User Experience Challenge, each of whom will also get a $15,000 award. This part of the contest was also dedicated to promoting the use of FHIR, but participants were asked to show how they could enhance providers’ EMR experience, specifically by making clinical workflows more intuitive, specialty-specific and actionable through data made accessible to apps via APIs. Winners include the following:

  • The Herald platform by Herald Health uses FHIR to highlight patient information most needed by clinicians. By integrating FHIR, Herald will offer alerts based on real-time EMR data.
  • PHRASE (Population Health Risk Assessment Support Engine) Health is creating a clinical decision support platform designed to better manage emerging illnesses, integrating more external data sources into the process of identifying at-risk patients and enabling the two-way exchange of information between providers and public health entities.
  • A partnership between the University of Utah Health Care, Intermountain Healthcare and Duke Health System is providing clinical decision support for timely diagnosis and management of newborn bilirubin according to evidence-based practice. The partners will integrate the app across each member’s EMR.
  • WellSheet has created a web application using machine learning and natural language processing to prioritize important information during a patient visit. Its algorithm simplifies workflows incorporating multiple data sources, including those enabled by FHIR. It then presents information in a single screen.

As I see it, the two contests don’t necessarily need to be run on separate tracks. After all, providers need aggregate data and consumers need prioritized, easy-to-navigate platforms. But either way, this effort seems to have been productive. I’m eager to see the winners of the next phase.

A Vision for Why and How We Make the Science of Health Care Shareable

Posted on October 30, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I recently heard Stan Huff, CMIO at Intermountain, talk at the Healthcare IT Transformation Assembly about the Healthcare Services Platform Consortium. As he presented what they’re working on, he highlighted so well the challenges that I’ve been seeing in healthcare IT. I’ve long been asking people how healthcare IT innovations that happen in one hospital or practice are going to get shared with all of healthcare. Turns out, Stan has been thinking a lot about this problem as well.

In his presentation, Stan framed the discussion perfectly when he said, “No matter what you do, you can’t teach people to be perfect information processors.” I’d also mentioned in a previous post that the human mind can’t detect the difference between something that causes errors 3 in 100 versus 4 in 100. However, with the right data, computers can tell the difference. Plus, computers can assist humans in the information processing.

These points illustrate why building and sharing clinical decision support is so important. The human mind is incredible, but medicine is so complex it’s impossible for the human mind to process it all. Ideally all of the work that Stan Huff and his team at Intermountain are doing on clinical decision support should be “plug n play interoperable” with the rest of the healthcare system. That seems to be the goal of the Healthcare Services Platform Consortium.

Many might wonder why Intermountain would want to share all the work they’ve been doing with the rest of healthcare. Isn’t that their proprietary intellectual property? It’s actually easy to see why. Stan described that Intermountain has implemented or is currently working on ~150 decision support rules or modules. Given their organization’s budget and staff constraints he could see how those 150 could be expanded to 300 or so, but likely not more. That sounds great until you think that there could be 5000+ decision support rules or modules if there was enough time and budget.

The problem is that there was no path for Intermountain to go from 150 to 5000 decision support rules or modules on their own. The only way to get where they need to go is for everyone in healthcare to work together and share their findings and workflows.

Stan and the Healthcare Services Platform Consortium are building the framework for creating and sharing interoperable clinical decision support apps on the back of FHIR and SMART apps. This diagram illustrates what they have in mind:
[Diagram: Healthcare Services Platform Consortium framework, 2015 Healthcare Transformation Assembly]
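
To give a feel for why FHIR and SMART matter here, the sketch below expresses a decision support check against standard FHIR Observation resources rather than any one vendor’s database schema, so in principle the same rule could run unchanged at any institution exposing a FHIR API. It is my own simplified illustration, not one of Intermountain’s modules, and the threshold is an arbitrary example.

```python
# Sketch of a "shareable" decision support rule: it depends only on
# standard FHIR Observation resources, not a vendor-specific schema.
# The rule and threshold are simplified illustrations.

HBA1C_LOINC = "4548-4"  # Hemoglobin A1c (a real LOINC code)

def hba1c_alert(observations, threshold=9.0):
    """Return an alert if the most recent HbA1c exceeds the threshold."""
    hba1c = [
        o for o in observations
        if any(c.get("code") == HBA1C_LOINC
               for c in o.get("code", {}).get("coding", []))
    ]
    if not hba1c:
        return None
    latest = max(hba1c, key=lambda o: o.get("effectiveDateTime", ""))
    value = latest.get("valueQuantity", {}).get("value")
    if value is not None and value > threshold:
        return f"HbA1c {value}% exceeds {threshold}%: consider therapy intensification"
    return None

# Any EMR exposing FHIR could feed this rule the same resource shape:
sample = [{
    "resourceType": "Observation",
    "code": {"coding": [{"system": "http://loinc.org", "code": "4548-4"}]},
    "effectiveDateTime": "2015-10-01",
    "valueQuantity": {"value": 10.2, "unit": "%"},
}]
print(hba1c_alert(sample))
```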
I think that Stan is spot on in his assessment of what needs to be done to get where we need to go with clinical decision support in health care. However, there are also plenty of reasons to remain only cautiously optimistic.

As Stan told us at the event, “If everyone says that their workflow is the only way, we won’t get very far.” Then Stan passionately argued that unchecked physician independence leaves room for doctors to take improper care of patients. “If we allow physicians to do whatever they want, we’re allowing them the right to take improper care of patients.”

Obviously Stan isn’t saying that there shouldn’t be rigorous debate about the best treatment. By putting these algorithms out to other organizations he’s actually inviting criticism and discussion of the work they’re doing. Plus, I have no doubt Stan understands where health care is an art and where it’s a science. However, I believe he rightly argues that where the science is clear, proclaiming the art of medicine is a poor excuse for doing something different.

In my mind, the Healthcare Services Platform Consortium should be focused on making the science of health care easily shareable and usable for all of health care regardless of EHR system. That’s a vision we should all get behind.

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, there are an estimated 8 million deaths from sepsis, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It works by leveraging EHR data and automated surveillance, clinical content and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate specific needs at each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, to improve sensitivity and specificity and reduce alert fatigue by assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities that may mimic sepsis. This helps to ensure alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly specific and highly sensitive in order to minimize alert fatigue. In the case of this specific system, a 95% specificity and sensitivity rating has been achieved by constructing hundreds of variations of sepsis rules. For example, completely different rules are run for patients with liver disease versus those with end-stage renal disease. Doing so ensures clinicians only get alerts that are helpful.
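
To illustrate the idea of comorbidity-specific rule variants, here is a deliberately simplified sketch. The thresholds and logic are invented for illustration and are not POC Advisor’s proprietary algorithms; the point is only how swapping in different criteria for, say, liver disease changes whether an alert fires.

```python
# Deliberately simplified sketch of comorbidity-aware sepsis screening rules.
# Thresholds are invented and are NOT POC Advisor's actual logic; the point
# is that rule variants keep alerts meaningful for complex patients.

BASELINE_RULES = {"min_wbc": 12.0, "min_lactate": 2.0, "min_heart_rate": 90}

# Certain conditions get adjusted criteria, since their baseline labs can
# mimic sepsis (for example, chronically elevated lactate in liver disease).
VARIANTS = {
    "liver_disease":   {"min_lactate": 4.0},
    "end_stage_renal": {"min_wbc": 14.0},
}

def rules_for(comorbidities):
    rules = dict(BASELINE_RULES)
    for condition in comorbidities:
        rules.update(VARIANTS.get(condition, {}))
    return rules

def sepsis_alert(labs, comorbidities):
    r = rules_for(comorbidities)
    triggers = [
        labs["wbc"] >= r["min_wbc"],
        labs["lactate"] >= r["min_lactate"],
        labs["heart_rate"] >= r["min_heart_rate"],
    ]
    # Require at least two criteria before paging a nurse.
    return sum(triggers) >= 2

patient = {"wbc": 13.5, "lactate": 2.6, "heart_rate": 85}
print(sepsis_alert(patient, []))                 # True: two baseline criteria met
print(sepsis_alert(patient, ["liver_disease"]))  # False: lactate re-interpreted, one trigger
```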

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor within two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform, including lab results, pharmacy orders, Admit Discharge Transfer (ADT) and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to create the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alerts based on their clinical training. The system prompted the nursing staff to respond to each one, either through acknowledgement or override. If acknowledged, suggested guidance regarding the appropriate next steps was provided, such as alerting the physician or ordering diagnostic lactate tests, based on the facility’s specific protocols. If alerts were overridden, a reason had to be entered, and all overrides were logged, monitored and reported. If action was not taken, repeat alerts were fired, typically within 10 minutes. If repeat alerts were not acted upon, they were escalated to supervising personnel.
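
A rough sketch of that acknowledge/override/escalate loop is below. The 10-minute repeat interval comes from the description above; everything else (class names, the escalation-after-two-intervals rule) is my own illustrative simplification rather than the actual system’s design.

```python
# Simplified sketch of the nurse alert workflow described above: acknowledge,
# override with a logged reason, repeat after ~10 minutes, then escalate if
# still unaddressed. Illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

REPEAT_INTERVAL = timedelta(minutes=10)

@dataclass
class SepsisAlert:
    patient_id: str
    created: datetime
    status: str = "pending"   # pending | acknowledged | overridden | escalated
    audit_log: list = field(default_factory=list)

    def acknowledge(self, now):
        self.status = "acknowledged"
        self.audit_log.append(("acknowledged", now))

    def override(self, reason, now):
        if not reason:
            raise ValueError("an override reason must be entered and logged")
        self.status = "overridden"
        self.audit_log.append(("overridden: " + reason, now))

    def check(self, now):
        """Re-fire, then escalate, if the alert is still pending."""
        if self.status != "pending":
            return None
        if now - self.created >= 2 * REPEAT_INTERVAL:
            self.status = "escalated"
            self.audit_log.append(("escalated to supervisor", now))
            return "escalated"
        if now - self.created >= REPEAT_INTERVAL:
            self.audit_log.append(("repeat alert fired", now))
            return "repeated"
        return None

start = datetime(2015, 1, 1, 8, 0)
alert = SepsisAlert(patient_id="A123", created=start)
print(alert.check(start + timedelta(minutes=11)))  # repeated
print(alert.check(start + timedelta(minutes=25)))  # escalated
```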

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed upon protocols.

Within three months, John Muir experienced significant improvements related to key sepsis compliance rate metrics. These included an 80% compliance with patient screening protocols, 90% lactate tests ordered for patients who met screening criteria and 75% initiation of early, goal-directed therapy for patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

Integrating Telemedicine And EMRs

Posted on May 17, 2013 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Have you considered what an EMR would look and feel like if it integrated telemedicine? Rashid Bashshur, director of telemedicine at the University of Michigan Health System, has given the idea a lot of thought.

In an interview with InformationWeek Healthcare, Bashshur tells IW’s Ken Terry that it’s critical to integrate HIEs, ACOs, Meaningful Use and electronic health records.

Makes sense in theory. How would it work?

To begin with, Bashshur said, healthcare providers who have virtual encounters with patients via a telehealth set-up should create an electronic health record for that patient.  The record could then be ported over to the patient’s PHR.  The physician can also share the health record via an HIE with other providers.

When providers attempt mobile and home monitoring, it steps the complexity up a notch, as such activities generate a large flow of data. The key, in this situation, is to use the EMR to sensitively filter incoming data.

Unfortunately, few EMRs today can easily pinpoint the information providers need to process, so most organizations have nurse care managers sift through incoming monitoring data. That’s the case at University of Michigan Health System, where care managers sift data manually to determine whether patients seem to be seeing changes in their conditions.

Unfortunately, even attentive care managers can’t catch everything a properly-designed system can, Bashshur notes.  To integrate EMRs and telemedicine/remote monitoring, it will be important for EMRs to have sophisticated filters in place which can pinpoint trouble spots in a patient’s condition, using a standard protocol which is applied uniformly.
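
The kind of filter Bashshur describes could start as something as simple as per-measure protocol thresholds applied uniformly to incoming readings, with only the exceptions routed to a care manager’s worklist. A minimal sketch follows; the measures and limits are invented for illustration and are not a clinical protocol.

```python
# Minimal sketch of filtering incoming home-monitoring data so that only
# out-of-protocol readings reach a care manager's worklist.
# Measures and limits are invented, not a clinical protocol.

PROTOCOL = {
    "systolic_bp":   {"low": 90, "high": 160},
    "blood_glucose": {"low": 70, "high": 180},
    "daily_weight_gain_lbs": {"low": None, "high": 2.0},
}

def flag_reading(measure, value):
    limits = PROTOCOL.get(measure)
    if limits is None:
        return None  # unrecognized measure: ignore, or queue for manual review
    if limits["low"] is not None and value < limits["low"]:
        return f"{measure} low: {value}"
    if limits["high"] is not None and value > limits["high"]:
        return f"{measure} high: {value}"
    return None

incoming = [
    ("systolic_bp", 152),
    ("systolic_bp", 171),
    ("blood_glucose", 110),
    ("daily_weight_gain_lbs", 2.4),
]

worklist = []
for measure, value in incoming:
    flag = flag_reading(measure, value)
    if flag:
        worklist.append(flag)

print(worklist)  # only the exceptions reach a human
```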

According to InformationWeek, vendor eClinicalWorks has promised a new feature which can pick out relevant data from a large data stream. But until eCW or another EMR vendor produces such a feature, it seems that remote monitoring will be labor-intensive and expensive.

Analytics-Driven Compassionate Healthcare at El Camino Hospital

Posted on March 25, 2013 | Written By

Mandi Bishop is a hardcore health data geek with a Master's in English and a passion for big data analytics, which she brings to her role as Dell Health’s Analytics Solutions Lead. She fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

Given its location in the heart of Silicon Valley, it may not be remarkable that El Camino Hospital was the first hospital in the US to implement EMR. What IS remarkable is that El Camino implemented EMR 51 years ago, leveraging an IBM mainframe system that Lockheed Martin refactored for healthcare from its original intended use for the space program.

Take a moment to process that. El Camino didn’t need PPACA, Meaningful Use, HITECH, or HIPAA to tell them health data is critical. El Camino saw the value in investing in healthcare IT for electronic data capture and communication without federal incentive programs or lobbyists. With that kind of track record of visionary leadership, it’s no wonder they became early analytics program adopters, and recently turned to Health Care DataWorks (HCD) as a trusted partner.

When I sat down with executive leadership from El Camino and HCD to discuss the journey up Tom Davenport‘s analytics maturity scale from rudimentary operational reporting to advanced analytics, I expected a familiar story of cost pressure, clinical informatics, quality measure incentives or alternative payment models as the business drivers for new insights development. Instead, I heard the burgeoning plan for a visionary approach to patient engagement and “analytics-driven compassionate care”.

Greg Walton, CIO of El Camino Hospital, admitted that initial efforts to implement an analytics program had resulted in “textbook errors”: “’Competing on Analytics’ was easier to write than execute,” he said. Their early efforts to adopt and conform to a commercially-available data model were hindered by the complexity of the solution and the philosophy of the vendor. “One of the messages I would give to anybody is: do NOT attempt this at home,” Greg laughed, and El Camino decided to change their approach. They sought a “different type of company…a real-life company with applicable lessons learned in this space.”

“The most important thing to remember in this sector: you’re investing in PEOPLE. This is a PEOPLE business,” Greg said. “And that if there’s any aspect of IT that’s the most people-oriented, it’s analytics. You have to triangulate between how much can the organization absorb, and how fast they can absorb it.” In HCD, El Camino found an analytics organization partner whose leadership and resources understand healthcare challenges first, and technology second.

To address El Camino’s need for aggregated data access across multiple operational systems, HCD is implementing its pioneering KnowledgeEdge Enterprise Data Warehouse solution, including its enterprise data model, analytic dashboards, applications and reports. HCD’s technology, implementation process, and culture are rooted in the company’s deep clinical and provider industry expertise.

“The people (at HCD) have all worked in hospitals, and many still work there occasionally. Laypersons do not have the same understanding; HCD’s exposure to the healthcare provider environment and their level of experience provides a differentiator,” Greg explained. HCD impressed with their willingness to roll up their sleeves and work with the hospital stakeholders to address macro and micro program issues, from driving the evaluation and prioritization of analytics projects to identifying the business rules defining discharge destination. And both the programmers and staff are “thrilled,” Greg says: “My programmers are so happy, they think they’ve died and gone to heaven!”

This collaborative approach to adopting analytics as a catalyst for organizational and cultural change has lit a fire to address the plight of the patient using data as a critical tool. Greg expounded upon his vision to achieve what Aggie Haslup, Vice President of Marketing for HCD, termed “analytics-driven compassionate care”:

We need to change the culture about data without losing, and in fact enhancing, our culture around compassion. People get into healthcare because they’re passionate about compassion. Data can help us be more compassionate. US Healthcare Satisfaction scores have been basically flat over the last 10 years. Lots of organizations have tried to adopt other service industry tools: Lean, 6S; none of those address the plight of the patient. We’ve got to learn that we have to go back to our roots of compassion. We need to get back to the patient, which means “one who suffers in pain.” We want (to use data) to help understand more about the person who’s suffering. My (recent) revelation: what do you do with guests in your house? Clean the house, put away the pets, get food, do everything you can to make guests comfortable. We want to know more about patients’ ethnicity, cultural heritage, the CONTEXT of their lives because when you’re in pain, what do you fall back on? Cultural values. We want a holistic view of the patient, because we can provide better, compassionate care through knowing more about patients. We want to deploy a contextual longitudinal view of the patient…and detect trends in satisfaction with demographics, clinical, medical data.

What a concept. Imagine the possibilities when a progressive healthcare provider teams with an innovative analytics provider to harness the power of data to better serve the patient population. I will definitely keep my eye on this pairing!

HIMSS Analytics Clinical & BI Maturity Model

Posted on March 14, 2013 | Written By

Mandi Bishop is a hardcore health data geek with a Master's in English and a passion for big data analytics, which she brings to her role as Dell Health’s Analytics Solutions Lead. She fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

While the theme of HIMSS 2013 may have been, “How Great Is Interoperability,” the effectiveness of the many facets of interoperability is only as good as the actionable value of the shared data. The clinical insights that should be enabled by Meaningful Use Stage 2+ are expected to drive market trends in myriad areas of the healthcare system: chronic disease management, targeted member interventions, quality measures. In order to assess organizational readiness to capitalize on the promise of Meaningful Use, HIMSS Analytics began measuring the implementation and adoption of EMR and clinical documentation using a maturity model called EMRAM.

EMRAM

But, in analytics terms, EMRAM’s results are simply targeted foundational reporting, answering the question, “WHAT happened with Meaningful Use EMR adoption criteria?” So, you’ve got your clinical data in an EMR. Now what are you able to DO with it?

In 2013, HIMSS Analytics is taking a broader approach with the introduction of a new Clinical Business Intelligence maturity model, creating a framework to benchmark participating providers’ analytics maturity level.

I’ve been fortunate to know James Gaston, Senior Director of HIMSS Analytics Clinical & Business Intelligence, for many years, going back to his days with Arkansas Blue Cross. His appreciation for BI initiatives is matched only by his enthusiasm for the first day of turkey hunting season. When I ran into him at TDWI’s BI World summit in Orlando in November, he acted like a kid on Christmas morning, telling me about the brave new world of clinical data management that he was about to tackle. The excitement continued to build in the months leading up to HIMSS. James was practically glowing when we spoke about the upcoming C&BI Maturity Model release.

“Our customers are interested in not just understanding how to deploy IT applications, but how effectively they’re using those applications to support clinical business intelligence, as well as analytical pursuits,” James said. “So, HIMSS Analytics partnered with IIA to create and present a Clinical & BI Maturity Model that helps healthcare organizations measure that level of effectiveness.”

Sarah Gates, the VP of Research for IIA (the International Institute of Analytics), elaborated. “The HIMSS Analytics C&BI Maturity Model leverages the Competing on Analytics DELTA model, developed by Tom Davenport, which measures not only how well you’re using data and technology, but how well you’re building an analytical organization.” There are 5 core competency measurements in the DELTA model that will inform the HIMSS Analytics C&BI analysis: Data, Enterprise, Leadership, Targets, and Analysts. The methodology is holistic, touching on the cultural aspects of the organization as well as the technical, allowing a longitudinal view of the organization’s analytics program. A yardstick value from 1-5 will be assigned to each respondent based on Davenport’s criteria for each core competency.

Although HIMSS Analytics will eventually offer a Level 1-5 certification program for those organizations with observed results for analytics, James and Sarah agreed that it is not appropriate for every provider to reach for the Level 5 gold star. Per Sarah, “Healthcare is an industry just starting to discover analytics. We’re expecting to see lots of practitioners that are emerging in use of analytics, so we believe it (survey results) will be heavy on the lower end of the maturity scale. Data warehouse capabilities and staffing career paths for data analysts will be key differentiators for mature programs.” Not all providers have the resources – financial, human, and/or technical – to attain advanced analytics nirvana, and James wants to ensure that these providers don’t feel as if they’ve “failed”; the goal is to baseline against the peer group, identify opportunities for improvement, and focus on what is possible for each individual organization, working within their constraints.

What can we expect to see at next year’s C&BI survey results presentation? James said, “We want to be able to talk about benchmarking the industry as a whole, helping healthcare find its way with clinical business intelligence and begin to understand how important it is, and where opportunities lie. Everyone’s talking about clinical and BI – it is the opportunity to realize savings in healthcare, to use information to empower people to make better decisions.”

So, it’s up to you, providers and technology partners. You’ve implemented your EMR, achieved a high adoption rate across your organization’s core clinical processes, attested to Meaningful Use Stage 2, achieved Stage 7 on the HIMSS EMRAM scale, perhaps even participated in multi-HIE CCD medical records sharing with other provider networks. You’ve got the data in-house and available. It’s time to see how ready you are to rise to the analytics challenge and maximize your return on those EMR and HIE investments.


Note: for the complete HIMSS 2013 Leadership Survey Results, please download PDF here.

What Would ONC’s Dr. Doug Fridsma Do? (THIS Geek Girl’s Guide to HIMSS)

Posted on March 2, 2013 | Written By

Mandi Bishop is a hardcore health data geek with a Master's in English and a passion for big data analytics, which she brings to her role as Dell Health’s Analytics Solutions Lead. She fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

I know you’ve all been wondering how I’m planning to spend my mad crazy week at HIMSS in New Orleans. Well, maybe not ALL of you, but perhaps at least one – who is most likely my blog boss, the master John Lynn. Given the array of exciting developments in healthcare IT across the spectrum, from mobile and telehealth to wearable vital sign monitoring devices, EMR consolidation to cloud-based analytics platforms, it’s been extraordinarily difficult to keep myself from acting like Dory in “Finding Nemo”: “Oooooh! Shiny!” I’ve had to remind myself daily that I will have an opportunity to play with everything that catches my eye, but that I am only qualified to write and speak intelligently on my particular areas of expertise. And so, I’m proud to say I’ve finally solidified my agenda for the entire week, and I cannot WAIT to go ubergeek fan girl on so many industry luminaries and fascinating up-and-comers making great strides towards interoperability, deriving the “meaning” in “Meaningful Use” from clinical data, and leveraging the power of big data analytics to improve quality of patient experience and outcomes.

On Sunday, I’m setting the stage for the rest of the week with a sit-down with ONC’s Director of Standards and Interoperability and Acting Chief Scientist, Dr. Doug Fridsma. His groundbreaking work in interoperability spans multiple initiatives, including the Nationwide Health Information Network (NwHIN), the CONNECT project and the Federal Health Architecture. For insight into his passion for transforming the healthcare system through health IT, check out his blog: From The Desk of the Chief Science Officer.

Through the rest of the week, I aspire to see the world through Dr. Fridsma’s eyes, focusing on how each of the organizations and individuals contribute to the standards-based processes and policies that form the foundation for actionable analytics – and improved health. I’ve selected interviews with key visionaries from companies large and small, who I feel are representative of positive forward movement:

Health Care DataWorks piques my interest as an up-and-comer to watch, empowering healthcare systems to improve outcomes and reduce medical costs by providing accelerated EDW design and implementation, whether on-premise or via SaaS solution. Embedded industry analytics models supporting alternative network models, population-based payment models, and value-based purchasing allow for rapid realization of positive ROI.

Emdeon is the single largest clinical, financial, and administrative network, connecting over 400,000 providers and executing more than seven billion health exchanges annually. And if that’s not enough to attract keen attention, they recently announced a partnership with Atigeo to provide intelligent analytics solutions with Emdeon’s PETABYTES of data.

Serving an area near and dear to my heart, Clinovations provides healthcare management consulting services to stakeholders at each link in the chain, from providers to payers and supporting trading partners – in areas from EMR implementation (and requisite clinical data standards) to market and vendor assessments, and data management activities throughout. With the dearth of qualified SME resources in the clinical data field, I look forward to learning about how Clinovations plans to manage their growth and retain key talent.

Who doesn’t love a great legacy decommissioning story? Mediquant purports that adopting its DataArk product can result in an 80% reduction in legacy system costs through increased interoperability across disparate source systems and consolidated access. The “active archiving” solution provides a centralized repository and consolidated accounting functions for legacy data without continuing to operate (and support) the legacy system. Longitudinal clinical records? Yes, please!

Those are just a few on my must-see list, and I think Dr. Doug Fridsma would be proud of their vision, and find alignment to his ONC program goals. But will he be proud of their execution?

Can’t wait to find out, on the exhibit hall floor – and in the hallway conversations, and the client case study sessions, and the general scuttlebutt – at HIMSS!