
Karen DeSalvo and Jacob Reider Leave ONC

Posted on October 24, 2014 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 13 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

It's been a tumultuous few months for ONC, and it just got even more tumultuous. We previously reported on the departures of Doug Fridsma, MD, ONC's Chief Science Officer; Joy Pritts, ONC's first Chief Privacy Officer; Lygeia Ricciardi, Director of the Office of Consumer eHealth; and Judy Murphy, ONC's Chief Nursing Officer (CNO). Yesterday, the news dropped that Karen DeSalvo, ONC's National Coordinator, and Jacob Reider, ONC's Deputy National Coordinator, are both leaving ONC as well.

Karen DeSalvo has been tapped by HHS Secretary Sylvia Mathews Burwell to replace Wanda K. Jones as assistant secretary for health, a role that oversees the surgeon general's office, and she will be working on Ebola and other pressing health issues. I think DeSalvo's letter to staff describes it well:

As you know, I have deep roots and a belief in public health and its critical value in assuring the health of everyone, not only in crisis, but every day, and I am honored to be asked to step in to serve.

DeSalvo has always been a major public health advocate, and that's where her passion lies, not in healthcare technology. So this change isn't surprising, although it is a little surprising that it comes only 10 months into her time at ONC.

The obvious choice as Acting National Coordinator would have been Jacob Reider who was previously Acting National Coordinator when Farzad Mostashari left. However, Reider also announced his decision to leave ONC:

In light of the events that led to Karen’s announcement today–it’s appropriate now to be clear about my plans, as well. With Jon White and Andy Gettinger on board, and a search for a new Deputy National Coordinator well underway, I am pleased that much of this has now fallen into place–with only a few loose ends yet to be completed. I’ll remain at ONC until late November, working closely with Lisa as she assumes her role as Acting National Coordinator.

As Reider mentions, Lisa Lewis, currently ONC's COO, will be serving as Acting National Coordinator at ONC.

What’s All This Mean?
There’s a lot of speculation as to why all of these departures are happening at ONC. Many people believe that ONC is a sinking ship and people are doing everything they can to get off the ship before it sinks completely. Others have suggested that these people see an opportunity to make a lot more money working for a company. The government certainly doesn’t pay market wages for the skills these people have. Plus, their connections and experience at ONC give them some unique qualifications that many companies are willing to pay to get. Some have suggested that the meaningful use work is mostly done and so these people want to move on to something new.

My guess is that it’s a mix of all of these things. It’s always hard to make broad generalizations about topics like this. For example, I already alluded to the fact that I think Karen DeSalvo saw an opportunity to move to a position that was more in line with her passions. Hard to fault someone for making that move. We’d all do the same.

What is really unclear is the future of ONC. It still has a few years of meaningful use to administer, including the EHR penalties, which could carry meaningful use forward even longer than that. I expect ONC will still have money to work on things like interoperability. We'll see whether ONC can follow through on the patient safety initiative it started, or whether that effort gets shut down as outside its jurisdiction.

Beyond those things, what’s the future of ONC?

The Medication List Said, “Raised toilet seat daily”

Posted on September 25, 2014 I Written By

The following is a guest blog post by Lisa Pike, CEO of Versio.
With over a third of healthcare organizations switching to a new EHR in 2014, there is a lot of data movement going on. With the vast amount of effort it took to create that data, it’s a valuable asset to the organization. It can mean life or death; it can keep a hospital out of the courtroom; and it can mean the difference between a smooth-running organization and an operational nightmare.

But when that important data needs to be converted and moved to a new EHR, you realize just how complex it really is.

During a recent conversion of legacy data over to a new EHR, we came across this entry in the Medication List:  Raised toilet seat, daily.

Uh, come again??

How about this one?  “Dignity Plus XXL [adult diapers]; take one by mouth daily.”  What does the patient have, potty mouth?

Now, while we may snicker at the visual, it’s really no joke. These are actual entries encountered in source systems during clinical data migration projects. Some entries are comical; some are just odd; and some are downright frightening. But all of them are a conversion nightmare when you are migrating data.

Patient clinical data is unlike any other kind of data, for many reasons. It’s massive. It requires near-perfect accuracy. It’s also extremely complex, especially when you are not just migrating, but also converting from one system “language” to another.

Automated conversion is a common choice for healthcare organizations when moving data from legacy systems to newly adopted EHRs. It can be a great choice for some of the data, but not all. If your source says “hypertension, uncontrolled,” but your target system only has “uncontrolled hypertension,” that’s a simple enough inconsistency to overcome, but how would you predict every non-standard or incorrect entry you will encounter?

Here are some more actual examples. If you’re considering automated conversion, consider how your software would tangle up over these:

Each entry shows what the source system says, with our comment in parentheses:

  • "346.71D  Chm gr wo ara w nt wo st" (???)
  • "levothyroxine 100 mg" (should be mcg. Yikes!)
  • "Proventil" (target system has 20 choices)
  • "NKDA (vomiting)" (NKDA = no known drug allergies. Having no allergies causes vomiting?)
  • "Massage Therapy, take one by mouth twice weekly" (???)
  • "Tylenol suppositories; take 1 by mouth daily" (maybe not life-threatening, but certainly unpleasant)
  • "PMD (Pelizaeus-Merzbacher disease)" (should have been PMDD, premenstrual dysphoric disorder)
  • "Allergy: Reglan 5 mg" (is the patient allergic only to that dosage, or should this have been in the med list? Confusing allergies and meds can be deadly.)
  • "Height 60" (centimeters or inches? Convert carefully!)

These just scratch the surface of the myriad complexities, entry errors, and inconsistencies that exist in medical records across the industry. No matter how diligent your staff is, I guarantee your charts contain entries like these!

When an automated conversion program encounters data it can’t convert, it falls out as an “exception.” If the exception can’t be resolved, the data is simply left behind. Even with admirable effort, almost no one in the industry can capture more than 80% of the data. Some report as low as 50%.
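
To make the conversion-and-exception mechanics concrete, here is a minimal Python sketch of an automated pass: entries that map cleanly to a target vocabulary are converted, and everything else falls out as an exception for human review. The mapping table and sample entries are purely illustrative, not Versio's actual tooling.

```python
# Minimal sketch of an automated conversion pass: entries that map cleanly to the
# target vocabulary are converted, everything else falls out as an "exception"
# for human review. Mapping table and sample entries are illustrative only.

TARGET_VOCABULARY = {
    "hypertension, uncontrolled": "uncontrolled hypertension",
    "levothyroxine 100 mcg": "levothyroxine 100 mcg oral tablet",
}

def convert(source_entries):
    converted, exceptions = [], []
    for entry in source_entries:
        key = entry.strip().lower()
        if key in TARGET_VOCABULARY:
            converted.append(TARGET_VOCABULARY[key])
        else:
            exceptions.append(entry)   # left behind unless a human resolves it
    return converted, exceptions

source = [
    "Hypertension, uncontrolled",
    "levothyroxine 100 mg",            # dose-unit typo: no safe automatic match
    "Massage Therapy, take one by mouth twice weekly",
]

converted, exceptions = convert(source)
capture_rate = len(converted) / len(source)
print(f"captured {capture_rate:.0%}, {len(exceptions)} exceptions for manual review")
```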

How safe would you feel if your doctor didn’t know about 20% of your allergies? What if one of those left behind was the one that could kill you? What if a medication left behind was one you absolutely shouldn’t take with a new medication your doctor prescribed? Consider the woman whose aneurysm history was omitted during a conversion to a new EHR, so her specialist was unaware of it. She later died during a procedure when her aneurysm burst. I would say her family considered that data left behind pretty important, as did the treating physician, who could be found liable.

Liable, you say?

That's right. The specialist could be found liable for the information in the legacy record because it was available, even if it was archived in an old EHR or paper chart.

You can begin to see the enormity of the problem and the potentially dangerous ramifications. Certainly every patient deserves an accurate record, and healthcare providers’ effectiveness, if not their very livelihood, depends on it. But maintaining the integrity of the data, especially during an EHR conversion, is no trivial task. Unfortunately, too many healthcare organizations underestimate it, and clearly it deserves more attention.

There is good news, however. With a well-planned conversion, using a system that combines robust technology with human expertise, it is possible to achieve 100% data capture with 99.8% accuracy. We've done it with well over a million patient charts. It isn't easy, but the results are worth it. Patients and doctors deserve no less.

Lisa Pike is the CEO of Versio, a healthcare technology company specializing in legacy data migration, with a proven track record of 100% data capture and 99.8% quality. We call it “No Data Left Behind.” For more information on Versio’s services or to schedule an introductory conversation, please visit us at www.MyVersio.com or email sales@myversio.com.

Comprehensive Patient View, Social Media Time, and Linking Millions of EMR

Posted on August 10, 2014 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).


You don’t really need to click on the link above. The answer is no. The answer is that it probably won’t ever happen. There are just too many source systems where our health data is stored and it’s getting more complicated, not less.


If the social media maven Mandi has a challenge getting her social media on, now you can understand why many others “don’t have the time.” It takes a commitment and many don’t want to make that commitment. It doesn’t make them bad people. We all only have so many hours in a day.


No need to read this link either, although I found it apt that they described the challenge as linking millions of EMRs. Let's be generous and say there are 700 EHR vendors. Counting vendors, though, doesn't describe what it takes to make EMRs interoperable. To use a cliché: if you've connected with one Epic installation, you've connected with one Epic installation. I know it's getting better, but it's not there yet. If you want interoperable EMR data, you need to connect a lot of different installs.

Eyes Wide Shut – Patient Engagement Pitfalls Prior to Meaningful Use Reporting Period

Posted on June 30, 2014 I Written By

Mandi Bishop is a healthcare IT consultant and a hardcore data geek with a Master's in English and a passion for big data analytics, who fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

July 1, 2014 – the start of the Meaningful Use Stage 1 Year 2 reporting period for the hospital facilities within this provider integrated delivery network (IDN). The day the 50% online access measure gets real. The day the inpatient summary CCDA MUST be made available online within 36 hours of discharge. The day we must overcome a steady 65% patient portal decline rate.
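
For readers who like to see the arithmetic, here is a simplified sketch of the two checks that deadline implies: whether each inpatient summary CCDA was posted within 36 hours of discharge, and what share of discharges therefore counts toward the 50% online access measure. The field names and sample records are hypothetical, not this IDN's actual data model.

```python
from datetime import datetime, timedelta

# Simplified sketch of the two checks described above. Field names are hypothetical.
DEADLINE = timedelta(hours=36)

discharges = [
    # (patient_id, discharge_time, ccda_posted_time or None)
    ("A001", datetime(2014, 7, 1, 10, 0), datetime(2014, 7, 2, 8, 0)),
    ("A002", datetime(2014, 7, 1, 14, 0), None),                        # never posted
    ("A003", datetime(2014, 7, 2, 9, 0), datetime(2014, 7, 4, 9, 0)),   # posted late
]

def met_36_hour_rule(discharge_time, posted_time):
    return posted_time is not None and posted_time - discharge_time <= DEADLINE

timely = [d for d in discharges if met_36_hour_rule(d[1], d[2])]
measure = len(timely) / len(discharges)
print(f"online access measure: {measure:.0%} (threshold 50%)")
```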

A quick recap for those who haven’t followed this series (and refresher for those who have): this IDN has multiple hospital facilities, primary care, and specialty practices, on disparate EMRs, all connecting to an HIE and one enterprise patient portal. There are 8 primary EMRs and more than 20 distinct patient identification (MRN) pools. And many entities within this IDN are attempting to attest to Meaningful Use Stage 2 this year.

For the purposes of this post, I’m ignoring CMS and the ONC’s new proposed rule that would, if adopted, allow entities to attest to Meaningful Use Stage 1 OR 2 measures, using 2011 OR 2014 CEHRT (or some combination thereof). Even if the proposed rule were sensible, it came too late for the hospitals which must start their reporting period in the third calendar quarter of 2014 in order to complete before the start of the fiscal year on October 1. For this IDN, the proposed rule isn’t changing anything.

Believe me, I would have welcomed change.

The purpose of the so-called “patient engagement” core measures is just that: engage patients in their healthcare, and liberate the data so that patients are empowered to have meaningful conversations with their providers, and to make informed health decisions. The intent is a good one. The result of releasing the EMR’s compilation of chart data to recently-discharged patients may not be.

I answered the phone on a Saturday, while standing in the middle of a shopping mall with my 12 year-old daughter, to discover a distraught man and one of my help desk representatives on the line. The man’s wife had been recently released from the hospital; they had been provided patient portal access to receive and review her records, and they were bewildered by the information given. The medications listed on the document were not the same as those his wife regularly takes, the lab section did not have any context provided for why the tests were ordered or what the results mean, there were a number of lab results missing that he knew had been performed, and the problems list did not seem to have any correlation to the diagnoses provided for the encounter.

Just the kind of call an IT geek wants to receive.

How do you explain to an 84 year-old man that his wife’s inpatient summary record contains only a snapshot of the information that was captured during that specific hospital encounter, by resources at each point in the patient experience, with widely-varied roles and educational backgrounds, with varied attention to detail, and only a vague awareness of how that information would then be pulled together and presented by technology that was built to meet the bare minimum standards for perfect-world test scenarios required by government mandates?

How do you tell him that the lab results are only what was available at time of discharge, not the pathology reports that had to be sent out for analysis and would not come back in time to meet the 36-hour deadline?

How do you tell him that the reasons there are so many discrepancies between what he sees on the document and what is available on the full chart are data entry errors, new workflow processes that have not yet been widely adopted by each member of the care team, and technical differences between EMRs in the interpretation of the IHE’s XML standards for how these CCDA documents were to be created?

EMR vendors have responded to that last question with, “If you use our tethered portal, you won’t have that problem. Our portal can present the data from our CCDA just fine.” But this doesn’t take into account the patient experience. As a consumer, I ask you: would you use online banking if you had to sign on to a different website, with a different username and password, for each account within the same bank? Why should it be acceptable for managing health information online to be less convenient than managing financial information?
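
One practical way to see the "interpretation of the XML standards" problem mentioned above is simply to list which sections each vendor's CCDA actually contains and compare. Here is a small sketch that does that; it assumes a local file named ccda.xml (a hypothetical name) and uses only the standard HL7 v3 namespace.

```python
import xml.etree.ElementTree as ET

# List the sections a CCDA document actually contains. Running this against CCDAs
# produced by different EMRs is a quick way to see where their interpretations of
# the standard diverge. Assumes a local file named "ccda.xml" (hypothetical).
NS = {"hl7": "urn:hl7-org:v3"}

tree = ET.parse("ccda.xml")
for section in tree.getroot().findall(".//hl7:section", NS):
    code = section.find("hl7:code", NS)
    title = section.find("hl7:title", NS)
    loinc = code.get("code") if code is not None else "(no code)"
    name = title.text if title is not None else "(untitled)"
    print(loinc, "-", name)
```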

How do hospital clinical and IT staff navigate this increasingly-frequent scenario that is occurring: explaining the data that patients now see?

I’m working hard to establish a clear delineation between answering technical and clinical questions, because I am not – by any stretch of the imagination – a clinician. I can explain deviations in the records presentation, I can explain the data that is and is not available – and why (which is NOT generally well-received), and I can explain the logical processes for patients to get their clinical questions answered.

Solving the other half of this equation – clinicians who understand the technical nuances that have become patient-facing, and who incorporate that knowledge into regular patient engagement to ensure patients understand the limitations of their newly liberated data – proves more challenging. In order to engage patients the way the CMS Meaningful Use program mandates, have we effectively created a new hybrid role for our healthcare providers?

And what fresh new hell have we created for some patients who seek wisdom from all this information they’ve been given?

Caveat – if you’re reading this, it’s likely you’re not the kind of patient who needs much explaining. You’re likely to do your own research on the data that’s presented on your CCDA outputs, and you have the context of the entire Meaningful Use initiative to understand why information is presented the way it is. But think – can your grandma read it and understand it on HER own?

Not So Open: Redefining Goals for Sharing Health Data in Research

Posted on June 24, 2014 I Written By

The following is a guest blog post by Andy Oram, writer and editor at O’Reilly Media.

One couldn’t come away with more enthusiasm for open data than at this month’s Health Datapalooza, the largest conference focused on using data in health care. The whole 2000-strong conference unfolds from the simple concept that releasing data publicly can lead to wonderful things, like discovering new cancer drugs or intervening with patients before they have to go to the emergency room.

But look more closely at the health care field, and open data is far from the norm. The demonstrated benefits of open data sets in other fields–they permit innovation from any corner and are easy to combine or “mash up” to uncover new relationships–may turn into risks in health care. There may be better ways to share data.

Let’s momentarily leave the heady atmosphere of the Datapalooza and take a subway a few stops downtown to the Health Privacy Summit, where fine points of patient consent, deidentification, and the data map of health information exchange were discussed the following day. Participants here agree that highly sensitive information is traveling far and wide for marketing purposes, and perhaps even for more nefarious uses to uncover patient secrets and discriminate against them.

In addition to outright breaches–which seem to be reported at least once a week now, and can involve thousands of patients in one fell swoop–data is shared in many ways that arguably should be up to patients to decide. It flows from hospitals, doctors, and pharmacies to health information exchanges, researchers in both academia and business, marketers, and others.

Debate has raged for years between those who trust deidentification and those who claim that reidentification is too easy. This is not an arcane technicality–the whole industry of analytics represented at the Datapalooza rests on the result. Those who defend deidentification tend to be researchers in health care and the institutions who use their results. In contrast, many computer scientists outside the health care field cite instances where people have been reidentified, usually by combining data from various public sources.

Latanya Sweeney of Harvard and MIT, who won a privacy award this year at the summit, can be credited both with a historic reidentification of the records of Massachusetts Governor William Weld in 1997 and a more recent exposé of state practices. The first research led to the current HIPAA regime for deidentification, while the second showed that states had not learned the lessons of anonymization. No successful reidentifications have been reported against data sets that use recommended deidentification techniques.
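
For readers unfamiliar with what "recommended deidentification techniques" look like in practice, here is a toy illustration of Safe-Harbor-style generalization: direct identifiers are dropped, ZIP codes are truncated, and ages over 89 are binned. It is a teaching sketch, not a certified deidentification routine.

```python
# Toy illustration of Safe-Harbor-style generalization: strip direct identifiers,
# keep only the first three digits of the ZIP code, and cap reported age at 90.
# This is not a certified deidentification routine.

DIRECT_IDENTIFIERS = {"name", "mrn", "phone", "email", "street_address"}

def deidentify(record):
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "**"
    if "age" in out and out["age"] > 89:
        out["age"] = "90+"
    return out

patient = {"name": "Jane Doe", "mrn": "12345", "zip": "02139",
           "age": 93, "diagnosis": "I10"}
print(deidentify(patient))   # {'zip': '021**', 'age': '90+', 'diagnosis': 'I10'}
```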

I am somewhat perplexed by the disagreement, but have concluded that it cannot be resolved on technical grounds. Those who look at the current state of reidentification are satisfied that health data can be secured. Those who look toward an unspecified future with improved algorithms find reasons to worry. In a summit lunchtime keynote, Adam Tanner reported his own efforts as a non-expert to identify people online–a fascinating and sometimes amusing tale he has written up in a new book, What Stays in Vegas. So deidentification is like encryption–we all use encryption even though we expect that future computers will be able to break current techniques.

But another approach has risen from the ashes left by the "privacy is dead" naysayers: regulating the use of data instead of its collection and dissemination. The idea has been around for years, most recently in a federal PCAST report on big data privacy. One of the authors of that report, Craig Mundie of Microsoft, also published an article making that argument in the March/April issue of Foreign Affairs.

A simple application of this doctrine in health care is the Genetic Information Nondiscrimination Act of 2008. A more nuanced interpretation of the doctrine could let each individual determine who gets to use his or her data, and for what purpose.

Several proposals have been aired to make it easier for patients to grant blanket permission for certain data uses, one being the "patient privacy bundles" in a recent report commissioned by AHRQ. Many people look forward to economies of data, where patients can make money by selling their data (how much is my blood pressure reading worth to you?).

Medyear treats personal health data like Twitter feeds, letting you control the dissemination of individual data fields through hash tags. You could choose to share certain data with your family, some with your professional care team, and some with members of your patient advocacy network. This offers an alternative to using services such as PatientsLikeMe, which use participants’ data behind the scenes.
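
As a rough mental model of that kind of field-level, tag-based sharing, here is a conceptual sketch in Python. The tags, audiences, and data fields are invented for illustration; this is not Medyear's actual design.

```python
# Conceptual sketch of field-level sharing: each data element carries tags, and each
# audience is granted the set of tags it may see. Tags and audiences are invented.

record = {
    "blood_pressure": {"value": "128/82", "tags": {"#vitals"}},
    "a1c":            {"value": 6.9,      "tags": {"#vitals", "#diabetes"}},
    "psych_note":     {"value": "...",    "tags": {"#behavioral"}},
}

grants = {
    "family":         {"#vitals"},
    "care_team":      {"#vitals", "#diabetes", "#behavioral"},
    "advocacy_group": {"#diabetes"},
}

def visible_to(audience):
    allowed = grants.get(audience, set())
    return {k: v["value"] for k, v in record.items() if v["tags"] & allowed}

print(visible_to("family"))          # vitals only
print(visible_to("advocacy_group"))  # diabetes-tagged fields only
```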

Open data can be simulated by semi-open data sets that researchers can use under license, as with the Genetic Association Information Network that controls the Database of Genotypes and Phenotypes (dbGaP). Many CMS data sets are actually not totally open, but require a license to use.

And many data owners create relationships with third-party developers that allow them access to data. Thus, the More Disruption Please program run by athenahealth allows third-party developers to write apps accessing patient data through an API, once the developers sign a nondisclosure agreement and a Code of Conduct promising to use the data for legitimate purposes and respect privacy. These apps can then be offered to athenahealth’s clinician clients to extend the system’s capabilities.

Some speakers at the Datapalooza went even further, asking whether raw data needs to be shared at all. Adriana Lukas of London Quantified Self and Stephen Friend of Sage Bionetworks suggested that patients hold on to all their data and share just the "meanings" or "methods" they've found useful. The future of health analytics, it seems to me, will use relatively few open data sets, and lots of data obtained through patient consent or under license.

Understanding Apple Health

Posted on June 17, 2014 I Written By

Kyle is Founder and CEO of Pristine, a company in Austin, TX that develops telehealth communication tools optimized for Google Glass in healthcare environments. Prior to founding Pristine, Kyle spent years developing, selling, and implementing electronic medical records (EMRs) into hospitals. He also writes for EMR and HIPAA, TechZulu, and Svbtle about the intersections of healthcare, technology, and business. All of his writing is reproduced at kylesamani.com

Apple recently announced Health and Healthkit as part of iOS 8, and initial responses have been mixed.

At one extreme, the (highly biased) CEO of Mayo Clinic called Apple Health “revolutionary.” At the other, cynical health IT pundits claim that Apple Health is a consumer novelty and won’t crack the enigmatic healthcare system. As a cynical health IT pundit myself, I’m more inclined towards the latter, but have some optimism about Apple’s first steps into healthcare.

For the uninitiated, Apple Health is a central dashboard for health-related information, packaged for consumers as an iOS app. Consumers open the app and see a broad array of clinical indicators (e.g. physical activity, blood pressure, blood glucose, sleep data). You can learn more about Health and Healthkit from Apple.

The rest of this post assumes significant understanding of modern health IT challenges such as data silos, EMPIs, HIEs, and an understanding of what Health and Healthkit can and can’t do. I’ll address what Apple Health does well, ask some questions, and then provide some commentary.

Apple Health does a few things well:

1) Apple Health acts as a central dashboard for consumers. Rather than switching between five different apps, Health provides a central view of all clinical indicators. In time, Health could help patients understand the nuances of their own data. By removing friction to seeing a variety of indicators in a single view, patients may discover correlations that they wouldn’t have observed before. With that information, consumers should be able to adjust behaviors to lead healthier lifestyles.

2) Apple Health provides a robust mechanism for health apps to share data with one another. Until now, health app developers needed to form partnerships with one another and develop custom code to share information; now they can do this in a standardized way with minimal technical or administrative overhead. This reduces app lock-in by enabling data liquidity, empowering consumers to switch to the best health app or device and carry data between apps. This is a big win for consumers.

Unanswered questions:

1) How does Apple Health actually work? Apple provided virtually no details. Does the patient need the Epic MyChart app on their phone? Is there custom code integrating iOS to Epic MyChart? Is there a Mayo Clinic app that is separate from Epic MyChart? If not, how does Apple Health know that the consumer is a Mayo patient? Or a Kaiser Permanente patient? Or a Sutter Health patient?

2) Does the patient give consent per data value, or is it all or nothing? How long does consent last? Must consent be taken at the hospital, or can the patient opt in or out any time on their phone? Who within the health system can access the consented data?

3) Given that there are hundreds of EpicCare silos and dozens of CareEverywhere silos, how does Apple Health decide which silo(s) to interface with? Does data go to an HIE or to an EMR? If to an HIE, can all eligible connected providers access the data with consent? If a patient has records in multiple HIEs and EMRs (which they likely do), how does Apple Health determine which HIE(s) to push and pull data from?

4) Does Apple Health support non-numerical data such as CCDAs? What about unstandardized data? For example, PatientIO allows providers to develop customized care plans for patients that can include almost any behavioral prescription. Examples include water intake, exercising at a certain time of the day, taper schedules, etc.

5) Can providers write back to a patient’s Health profile? Given that open.epic doesn’t allow Epic to send data out, how could Apple Health receive data from Epic?

6) How will Apple handle competing health apps installed on the same consumer's phone? For example, if I tap "more diabetes info" in Apple Health, will it open Mayo Clinic's app (and if so, to the right place in the Mayo Clinic app?), the blood glucose tracking app that came with my blood glucose meter, or my iTriage or WebMD app?

7) Is Apple Health intended to function as a patient-centric HIE? If so, what standards does it support? CCDA? FHIR? Direct?

Comments:

1) The Apple-Epic partnership is obviously built on open.epic, which Epic announced in September of 2013. It’s likely that Apple and Epic reached an agreement around that time, and asked the public for ideas on how to shape the program to get a sense of what developers wanted.

2) There are two ways to succeed in health IT: force the industry to conform to your standards, or support the messy hybrid of approaches the industry already uses. Early indicators show Apple (predictably) trending toward the former. Unfortunately, Apple's perennially Apple-centric approach inhibits the level of interoperability necessary to power an effective consumer health strategy. Although Apple provides a great foundation for some basic functions, the long-term potential of the current offering is limited. What Apple has produced to date makes for sexy screenshots, but appears to fall short of addressing the core interoperability and connectivity issues that plague chronic disease management and coordination of care.

3) In a hypothetical world at some indeterminate point in the future, there would be a patient-facing, DNS-like lookup system for provider organizations (Direct, eventually?). Patients could look up provider organizations and share their data with providers selectively. Apple Health provides a great first step toward that dream world by empowering patients to see and, to some extent, control their own data.

Population Health Management and Business Process Management

Posted on June 13, 2014 I Written By

Chuck Webster, MD, MSIE, MSIS has degrees in Accountancy, Industrial Engineering, Intelligent Systems, and Medicine (from the University of Chicago). He designed the first undergraduate program in medical informatics, was a software architect in a hospital MIS department, and also VP and CMIO for an EHR vendor for over a decade. Dr. Webster helped three healthcare organizations win the HIMSS Davies Award and is a judge for the annual Workflow Management Coalition Awards for Excellence in BPM and Workflow and Awards for Case Management. Chuck is a ceaseless evangelist for process-aware technologies in healthcare, including workflow management systems, Business Process Management, and dynamic and adaptive case management. Dr. Webster tweets from @wareFLO and maintains numerous websites, including EHR Workflow Management Systems (http://chuckwebster.com), Healthcare Business Process Management (http://HCBPM.com) and the People and Organizations improving Healthcare with Health Information Technology (http://EHRworkflow.com). Please join with Chuck to spread the message: Viva la workflow!

This is my fifth and final of five guest blog posts covering Health IT and EHR Workflow.

Way back in 2009 I penned a research paper with a long and complicated title that could also have been, simply, Population Health Management and Business Process Management. In 2010 I presented it at MedInfo10 in Cape Town, South Africa. Check out my travelogue!

Since then, some of what I wrote has become reality, and much of the rest is on the way. Before I dive into the weeds, let me set the stage. The Affordable Care Act added tens of millions of new patients to an already creaky and dysfunctional healthcare and health IT system. Accountable Care Organizations were conceived as virtual enterprises to be paid to manage the clinical outcome and costs of care of specific populations of individuals. Population Health Management has become the dominant conceptual framework for proceeding.

I looked at a bunch of definitions of population health management and created the following as a synthesis: “Proactive management of clinical and financial risks of a defined patient group to improve clinical outcomes and reduce cost via targeted, coordinated engagement of providers and patients across all care settings.”

You can see obvious places in this definition to apply trendy SMAC tech (social, mobile, analytics, and cloud): social in patient settings; mobile in provider and patient settings; analytics for cost and outcomes; cloud for working across settings. But here I want to focus on the "targeted, coordinated" part. Increasingly, it is self-developed and vendor-supplied care coordination platforms that do the targeting and coordinating, filling a gap between EHRs and day-to-day provider and patient workflows.

The best technology on which to build care coordination platforms is workflow technology, AKA business process management and adaptive/dynamic case management software. In fact, when I drill down on the most sophisticated, scalable population health management and care coordination solutions, I usually find one of two things: either the health IT organization or vendor is, in essence, reinventing the workflow tech wheel, or it embeds or builds on third-party BPM technology.

Let me direct you to the section Patient Class Event Hierarchy Intermediates Patient Event Stream and Automated Workflow in that MedInfo10 paper. First of all, you have to target the right patients for intervention. Increasingly, ideas from Complex Event Processing are used to react quickly and appropriately to patient events. A Patient Class Event Hierarchy is a decision tree mediating between low-level events (patient state changes) and higher-level clinical concepts such as "on-protocol," "compliant," "measured," and "controlled."

Examples include patients who aren’t on protocol but should be, aren’t being measured but should be, or whose clinical values are not controlled. Execution of appropriate automatic policy-based workflows (in effect, intervention plans) moves patients from off-protocol to on-protocol, non-compliance to compliance, unmeasured to measured, and from uncontrolled to controlled state categories.
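
Here is a compressed sketch of that decision logic: classify a patient into a state category from low-level data, and queue an intervention workflow when the patient lands in an off-target category. The thresholds, category names, and workflow-engine interface are illustrative assumptions, not any particular vendor's product.

```python
# Compressed sketch of a patient-class event hierarchy: low-level patient state
# changes are classified into categories, and off-target categories queue an
# intervention workflow. Thresholds and workflow names are illustrative.

def classify(patient):
    if not patient["on_protocol"]:
        return "off-protocol"
    if patient["last_a1c"] is None:
        return "unmeasured"
    if patient["last_a1c"] > 8.0:
        return "uncontrolled"
    return "controlled"

INTERVENTIONS = {
    "off-protocol": "enroll_in_protocol_workflow",
    "unmeasured":   "order_lab_outreach_workflow",
    "uncontrolled": "medication_review_workflow",
}

def on_patient_event(patient, workflow_engine):
    # workflow_engine.start() stands in for whatever API the BPM suite exposes.
    category = classify(patient)
    if category in INTERVENTIONS:
        workflow_engine.start(INTERVENTIONS[category], patient["id"])
    return category

print(classify({"id": "A001", "on_protocol": True, "last_a1c": 9.1}))  # uncontrolled
```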

Population health management and care coordination products and services may use different categories, terminology, and so on, but they all tend to focus on sensing and reacting to untoward changes in patient state. Simply detecting these changes is insufficient, though. These systems need to cause actions, and those actions need to be monitored, managed, and improved, all of which are classic strengths of business process management software systems and suites.

I'm reminded of several tweets about Accountable Care Organization IT systems I display during presentations. One summarizes an article about ACOs. The other paraphrases an ACO expert speaking at a conference. The former says ACOs must tie together many disparate IT systems. The latter says ACOs boil down to lists: actionable lists of items delivered to the right person at the right time. If you put these requirements together with system-wide care pathways delivered safely and conveniently to the point of care, you get my three previous blog posts on interoperability, usability, and safety.

I’ll close here with my seven advantages of BPM-based care coordination technology. It…

  • More granularly distinguishes workflow steps
  • Captures more meaningful time-stamped task data
  • More actively influences point-of-care workflow
  • Helps model and understand workflow
  • Better coordinates patient care task handoffs
  • Monitors patient care task execution in real-time
  • Systematically improves workflow effectiveness & efficiency

Distinguishing among workflow steps is important for collecting data about which steps provide value to providers and patients, as well as the time-stamps necessary to estimate true costs. Further, since these steps are executed, or at least monitored, at the point-of-care, there's more opportunity to facilitate and influence work at the point-of-care. Modeling workflow contributes to understanding workflow, in my view an intrinsically valuable state of affairs. These workflow models can represent and compensate for interruptions to necessary care task handoffs. During workflow execution, "enactment" in BPM parlance, workflow state is made transparently visible. Finally, workflow data "exhaust" (particularly time-stamped, evidence-based process maps) can be used to systematically find bottlenecks and plug care gaps.
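
As a small illustration of what that time-stamped "exhaust" makes possible, here is a sketch that estimates how long each workflow step takes across visits and surfaces the slowest step as a likely bottleneck. The event log and step names are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Sketch: use time-stamped task data (workflow "exhaust") to estimate how long each
# step takes and flag the slowest step as a likely bottleneck. Times are minutes
# from visit start; the event log is invented for illustration.

event_log = [
    # (visit_id, step, start_min, end_min)
    (1, "rooming",        0,  6),
    (1, "vitals",         6, 10),
    (1, "physician_exam", 25, 40),   # note the gap after vitals: waiting time
    (2, "rooming",        0,  5),
    (2, "vitals",         5,  9),
    (2, "physician_exam", 18, 30),
]

durations = defaultdict(list)
for _, step, start, end in event_log:
    durations[step].append(end - start)

for step, mins in sorted(durations.items(), key=lambda kv: -mean(kv[1])):
    print(f"{step}: mean {mean(mins):.1f} min over {len(mins)} visits")
```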

In light of the fit between complex event processing detecting changes in patient state, and BPM’s automated, managed workflow at the point-of-care, I see no alternative to what I predicted in 2010. Regardless of whether it’s rebranded as care or healthcare process management, business process management is the most mature, practical, and scalable way to create the care coordination and population health management IT systems required by Accountable Care Organizations and the Affordable Care Act. A bit dramatically, I’d even say business process management’s royal road to healthcare runs through care coordination.

This was my fifth and final blog post in this series on healthcare and workflow technology, solicited by John Lynn for the week he's on vacation. Here was the outline:

If you missed one of my previous posts, I hope you'll still check it out. Finally, thank you, John, for allowing me to temporarily share your bully pulpit.


Patient Safety And Process-Aware Information Systems: Interruptions, Interruptions, Interruptions!

Posted on June 12, 2014 I Written By

Chuck Webster, MD, MSIE, MSIS (full bio above).

This is my fourth of five guest blog posts covering Health IT and EHR Workflow.

When you took a driver's education class, do you remember the importance of mental "awareness" to traffic safety? Continually monitor your environment, your car, and yourself. As in traffic flow, healthcare is full of workflow, and awareness of workflow is the key to patient safety.

First of all, the very act of creating a model of work to be done forces designers and users to think carefully about and work through workflow "happy paths" and what to do when work falls off them. A happy path is the sequence of events that's intended to happen and, if all goes well, actually does happen most of the time. Departures from the happy path are called "exceptions" in computer programming parlance. Exceptions are "thrown," "caught," and "handled." At the level of computer programming, an exception may occur when data is requested from a network resource but the network is down. At the level of workflow, an exception might be a patient no-show, an abnormal lab value, or suddenly being called away by an emergency or higher-priority circumstance.
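
The parallel is easy to see in code. Here is a tiny sketch in which the same throw/catch/handle pattern appears at both levels: a workflow-level exception (a patient no-show) is modeled as a programming exception, and the handler is the off-happy-path workflow. The class and task names are invented.

```python
# The same throw/catch/handle pattern at two levels: a code-level exception and a
# workflow-level exception (a patient no-show). Both are departures from the happy
# path that must be handled explicitly. Names are illustrative.

class PatientNoShow(Exception):
    """Workflow-level exception: the scheduled patient never arrived."""

def run_visit_happy_path(patient_arrived: bool):
    if not patient_arrived:
        raise PatientNoShow()
    return "visit documented"          # the happy path

def handle_visit(patient_arrived: bool):
    try:
        return run_visit_happy_path(patient_arrived)
    except PatientNoShow:
        # the exception handler IS the off-happy-path workflow
        return "reschedule task queued; no-show letter generated"

print(handle_visit(True))    # visit documented
print(handle_visit(False))   # reschedule task queued; no-show letter generated
```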

Developing a model of work, variously called a workflow/process definition or work plan, forces workflow designers and workflow users to communicate at a level of abstraction that is much more natural and productive than either computer code or screen mockups.

Once a workflow model is created, it can be automatically analyzed for completeness and consistency. Similar to how a compiler can detect problems in code before it’s released, problems in workflow can be prevented. This sort of formal analysis is in its infancy, and is perhaps most advanced in healthcare in the design of medical devices.
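
Here is a minimal sketch of the kind of static check a workflow model makes possible: finding steps that can never be reached from the start step, analogous to a compiler's dead-code warning. The model format (each step mapped to its possible next steps) is an assumption made for illustration.

```python
# Minimal sketch of a static check on a workflow model: find steps that can never
# be reached from the start step (analogous to a compiler's dead-code warning).
# The model format (step -> list of next steps) is invented for illustration.

workflow = {
    "check_in":       ["rooming"],
    "rooming":        ["vitals"],
    "vitals":         ["physician_exam"],
    "physician_exam": ["check_out"],
    "check_out":      [],
    "lab_draw":       ["check_out"],   # defined but never referenced: unreachable
}

def unreachable_steps(model, start):
    seen, stack = set(), [start]
    while stack:
        step = stack.pop()
        if step in seen:
            continue
        seen.add(step)
        stack.extend(model.get(step, []))
    return set(model) - seen

print(unreachable_steps(workflow, "check_in"))   # {'lab_draw'}
```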

When workflow engines execute models of work, work is performed. If that work would otherwise have been done by humans, user workload is reduced. Recent research estimates a 7 percent increase in patient mortality for every one-patient increase in a nurse's workload. Decreasing workload should reduce patient mortality by a similar amount.

Another area of workflow technology that can increase patient safety is process mining. Process mining is analogous to data mining, but the patterns it extracts from time-stamped data are workflow models. These "process maps" are evidence-based representations of what really happens during use of an EHR or health IT system. They can be quite different, and more eye-opening, than process maps generated by asking participants questions about their workflows. Process maps can show what happens that shouldn't, what doesn't happen that should, and time delays due to workflow bottlenecks. They are ideal tools for understanding what happened when analyzing a possibly system-precipitated medical error.
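
Here is a toy version of process mining to make the idea tangible: group a time-stamped event log by case, order each case's events, and count the observed step-to-step transitions. The counted edges are, in effect, an evidence-based process map, and in this invented medication-administration log they immediately reveal a case where verification was skipped.

```python
from collections import Counter, defaultdict

# Toy process mining: group a time-stamped event log by case, order events by time,
# and count the observed step-to-step transitions. The counted edges are, in effect,
# an evidence-based process map. The log below is invented for illustration.

event_log = [
    # (case_id, timestamp, activity)
    (1, 1, "order_entered"), (1, 2, "order_verified"), (1, 3, "med_administered"),
    (2, 1, "order_entered"), (2, 3, "med_administered"),   # verification skipped!
    (3, 1, "order_entered"), (3, 2, "order_verified"), (3, 4, "med_administered"),
]

by_case = defaultdict(list)
for case_id, ts, activity in event_log:
    by_case[case_id].append((ts, activity))

edges = Counter()
for events in by_case.values():
    ordered = [activity for _, activity in sorted(events)]
    edges.update(zip(ordered, ordered[1:]))

for (a, b), n in edges.most_common():
    print(f"{a} -> {b}: {n}")
```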

Yet another area where workflow tech is particularly relevant to patient safety is the fascinating relationship between clinical pathways, guidelines, and the like, and the workflow and process definitions executed by workflow tech's workflow engines. Clinical decision support, bringing the best evidence-based medical knowledge to the point-of-care, must be seamless with clinical workflow. Otherwise, alert fatigue greatly reduces the benefit actually realized.

There's considerable research into how to leverage and combine representations of clinical knowledge with clinical workflow. However, you really need a workflow system to take advantage of this intricate relationship. Hardcoded, workflow-oblivious systems? There's no way to tweak alerts to workflow context: the who, what, why, when, where, and how of what the clinician is doing. Clinical decision support will not achieve widespread success and acceptance until it can be intelligently customized and managed during real-time clinical workflow execution. This, again, requires workflow tech at the point-of-care.

I’ve saved workflow tech’s most important contribution to patient safety until last: Interruptions.

An interruption: is there anything more dreaded than a higher-priority task breaking your concentration just as you're beginning to experience optimal mental flow? The irony is that so much of work-a-day ambulatory medicine is essentially interrupt-driven (to borrow from computer terminology). Unexpected higher-priority tasks and emergencies *should* interrupt lower-priority scheduled tasks. Though at the end of the day, ideally, you've accomplished all your tasks.

In one research study, over 50% of all healthcare errors were due to slips and lapses, such as not executing an intended action. In other words, good clinical intentions derailed by interruptions.

Workflow management systems provide environmental cues to remind clinical staff to resume interrupted tasks. They represent “stacks” of tasks so the entire care team works together to make sure that interrupted tasks are eventually and appropriately resumed. Workflow management technology can bring to clinical care many of the innovations we admire in the aviation domain, including well-defined steps, checklists, and workflow tools.
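
Here is a bare-bones sketch of that "stack" idea: when a higher-priority task arrives, the in-progress task is pushed onto a stack and resurfaces as a reminder once the interruption is handled, so intended actions aren't silently dropped. The task names and class are invented for illustration.

```python
# Bare-bones sketch of an interrupted-task stack: when a higher-priority task
# arrives, the current task is pushed and resurfaces once the interruption is
# handled, so intended actions are not silently dropped. Names are illustrative.

class TaskStack:
    def __init__(self):
        self._stack = []

    def interrupt(self, current_task, urgent_task):
        self._stack.append(current_task)          # remember what was in progress
        print(f"INTERRUPT: handling '{urgent_task}' first")

    def resume(self):
        if self._stack:
            task = self._stack.pop()
            print(f"REMINDER: resume '{task}'")
            return task
        return None

tasks = TaskStack()
tasks.interrupt("finish discharge med reconciliation", "code blue in room 4")
tasks.resume()    # REMINDER: resume 'finish discharge med reconciliation'
```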

Stay tuned for my fifth, and final, guest blog post, in which I tackle Population Health Management with Business Process Management.


Usable EHR Workflow Is Natural, Consistent, Relevant, Supportive and Flexible

Posted on June 11, 2014 I Written By

Chuck Webster, MD, MSIE, MSIS (full bio above).

This is my third of five guest blog posts covering Health IT and EHR Workflow.

Workflow technology has a reputation, fortunately out of date, for trying to get rid of humans altogether. Early on it was used for straight-through processing, in which human stockbrokers were bypassed so stock trades happened in seconds instead of days. Business Process Management (BPM) can still do this. It can automate the logic and workflow that would normally require a human to download something, check a value, and, based on that value, do something else useful, such as putting an item in a to-do list. By automating low-level routine workflows, humans are freed to do more useful things that even workflow automation can't automate.

But much of healthcare workflow requires human intervention. It is here that modern workflow technology really shines, by becoming an intelligent assistant proactively cooperating with human users to make their jobs easier. A decade ago, at MedInfo04 in San Francisco, I listed the five workflow usability principles that beg for workflow tech at the point-of-care.

Consider these major dimensions of workflow usability: naturalness, consistency, relevance, supportiveness, and flexibility. Workflow management concepts provide a useful bridge from usability concepts applied to single users to usability applied to users in teams. Each concept, realized correctly, contributes to shorter cycle time (encounter length) and increased throughput (patient volume).

Naturalness is the degree to which an application’s behavior matches task structure. In the case of workflow management, multiple task structures stretch across multiple EHR users in multiple roles. A patient visit to a medical practice office involves multiple interactions among patients, nurses, technicians, and physicians. Task analysis must therefore span all of these users and roles. Creation of a patient encounter process definition is an example of this kind of task analysis, and results in a machine executable (by the BPM workflow engine) representation of task structure.

Consistency is the degree to which an application reinforces and relies on user expectations. Process definitions enforce (and therefore reinforce) consistency of EHR user interactions with each other with respect to task goals and context. Over time, team members rely on this consistency to achieve highly automated and interleaved behavior. Consistent repetition leads to increased speed and accuracy.

Relevance is the degree to which extraneous input and output, which may confuse a user, is eliminated. Too much information can be as bad as not enough. Here, process definitions rely on EHR user roles (related sets of activities, responsibilities, and skills) to select appropriate screens, screen contents, and interaction behavior.

Supportiveness is the degree to which enough information is provided to a user to accomplish tasks. An application can support users by contributing to the shared mental model of system state that allows users to coordinate their activities with respect to each other. For example, since an EMR workflow system represents and updates task status and responsibility in real time, this data can drive a display that gives all EHR users the big picture of who is waiting for what, for how long, and who is responsible.
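
A sketch of that shared status view can be as simple as the following; the pending tasks, owners, and wait times are invented for illustration.

```python
from datetime import datetime, timedelta

# Sketch of the shared status view described above: who is waiting for what, for
# how long, and who is responsible. The task records below are invented.

now = datetime(2014, 6, 11, 10, 30)
pending_tasks = [
    {"patient": "Room 3", "task": "vitals",     "owner": "nurse",     "since": now - timedelta(minutes=4)},
    {"patient": "Room 1", "task": "exam",       "owner": "physician", "since": now - timedelta(minutes=22)},
    {"patient": "Lab",    "task": "blood draw", "owner": "tech",      "since": now - timedelta(minutes=9)},
]

for t in sorted(pending_tasks, key=lambda t: t["since"]):
    waited = int((now - t["since"]).total_seconds() // 60)
    print(f"{t['patient']:6} waiting {waited:3d} min for {t['task']:10} (owner: {t['owner']})")
```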

Flexibility is the degree to which an application can accommodate user requirements, competencies, and preferences. This obviously relates back to each of the previous usability principles. Unnatural, inconsistent, irrelevant, and unsupportive behaviors (from the perspective of a specific user, task, and context) need to be flexibly changed to become natural, consistent, relevant, and supportive. Plus, different EHR users may require different BPM process definitions, or shared process definitions that can be parameterized to behave differently in different user task-contexts.

The ideal EHR/EMR should make the simple easy and fast, and the complex possible and practical. Then the majority/minority rule applies. The majority of the time, processing is simple, easy, and fast (generating the greatest output for the least input, thereby greatly increasing productivity). In the remaining minority of the time, the productivity increase may be smaller, but at least there are no showstoppers.

So, to summarize my five principles of workflow usability…

Workflow tech can more naturally match the task structure of a physician’s office through execution of workflow definitions. It can more consistently reinforce user expectations. Over time this leads to highly automated and interleaved team behavior. On a screen-by-screen basis, users encounter more relevant data and order entry options. Workflow tech can track pending tasks–which patients are waiting where, how long, for what, and who is responsible–and this data can be used to support a continually updated shared mental model among users. Finally, to the degree to which an EHR or health IT system is not natural, consistent, relevant, and supportive, the underlying flexibility of the workflow engine and process definitions can be used to mold workflow system behavior until it becomes natural, consistent, relevant, and supportive.

Tomorrow I’ll discuss workflow technology and patient safety.


Interoperable Health IT and Business Process Management: The Spider In The Web

Posted on June 10, 2014 I Written By

Chuck Webster, MD, MSIE, MSIS (full bio above).

This is my second of five guest blog posts covering Health IT and EHR Workflow.

If you pay any attention at all to the interoperability discussion in healthcare and health IT, I'm sure you've heard of syntactic vs. semantic interoperability. Syntax and semantics are ideas from linguistics. Syntax is the structure of a message; semantics is its meaning. Think HL7's pipes and hats (the characters "|" and "^" used as separators) vs. the codes referring to drugs and lab results (the stuff between the pipes and hats). What you hardly ever hear about is pragmatic interoperability, sometimes called workflow interoperability. We need not just syntactic and semantic interop, but pragmatic workflow interop too. In fact, interoperability based on workflow technology can strategically compensate for deficiencies in syntactic and semantic interoperability. By workflow technology, I mean Business Process Management (BPM).
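
To make "pipes and hats" concrete, here is a small sketch that splits a simplified HL7 v2 OBX segment into fields ("|") and components ("^"). Getting that split right is the syntactic layer; knowing that the code in OBX-3 identifies a glucose result is the semantic layer. The sample segment is illustrative.

```python
# "Pipes and hats" made concrete: splitting an HL7 v2 segment into fields ("|") and
# components ("^") is syntactic interoperability; knowing that the coded value in
# OBX-3 identifies a glucose result is semantic. The segment is a simplified example.

obx = "OBX|1|NM|2345-7^Glucose^LN||182|mg/dL|70-99|H"

fields = obx.split("|")
observation_id = fields[3].split("^")      # ['2345-7', 'Glucose', 'LN']
value, units = fields[5], fields[6]

print(f"code {observation_id[0]} ({observation_id[1]}, {observation_id[2]}): {value} {units}")
# code 2345-7 (Glucose, LN): 182 mg/dL
```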

Why do I highlight BPM’s relevance to health information interoperability? Take a look at this quote from Business Process Management: A Comprehensive Survey:

“WFM/BPM systems are often the “spider in the web” connecting different technologies. For example, the BPM system invokes applications to execute particular tasks, stores process-related information in a database, and integrates different legacy and web-based systems…. Business processes need to be executed in a partly uncontrollable environment where people and organizations may deviate and software components and communication infrastructures may malfunction. Therefore, the BPM system needs to be able to deal with failures and missing data.”

“Partly uncontrollable environment where people and organizations may deviate and software components and communication infrastructures may malfunction”? Sound familiar? That’s right. It should sound a lot like health IT.

What's the solution? A "spider in the web" connecting different technologies: invoking applications to execute particular tasks, storing process-related information in a database, integrating different legacy and web-based systems, and dealing with failures and missing data. Yes, healthcare needs a spider in the complicated web of information systems that is today's health information management infrastructure. Business process management is that spider in the technological web.

Let me show you now how BPM makes pragmatic interoperability possible.

I’ll start with another quote:

“Pragmatic interoperability (PI) is the compatibility between the intended versus the actual effect of message exchange.”

That’s a surprisingly simple definition for what you may have feared would be a tediously arcane topic. Pragmatic interoperability is simply whether the message you send achieves the goal you intended. That’s why it’s “pragmatic” interoperability. Linguistics pragmatics is the study of how we use language to achieve goals.

“Pragmatic interoperability is concerned with ensuring that the exchanged messages cause their intended effect. Often, the intended effect is achieved by sending and receiving multiple messages in specific order, defined in an interaction protocol.”

So, how does workflow technology tie into pragmatic interoperability? The key phrases linking workflow and pragmatics are “intended effect” and “specific order”.

A sequence of actions and messages — send a request to a specialist, track request status, ask about request status, receive result and do the right thing with it — that’s the “specific order” of conversation required to ensure the “intended effect” (the result). Interactions among EHR workflow systems, explicitly defined internal and cross-EHR workflows, hierarchies of automated and human handlers, and rules and schedules for escalation and expiration are necessary to achieve seamless coordination among EHR workflow systems. In other words, we need workflow management system technology to enable self-repairing conversations among EHR and other health IT systems. This is pragmatic interoperability. By the way, some early workflow systems were explicitly based on speech act theory, an area of pragmatics.
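
Here is a compressed sketch of such a "self-repairing conversation": a referral request tracks its own state, and if no response arrives by a deadline the workflow escalates instead of letting the request silently stall. The states, deadline, and names are illustrative assumptions, not a specific product's design.

```python
from datetime import datetime, timedelta

# Compressed sketch of a "self-repairing conversation": a referral request tracks
# its own state, and if no response arrives by the deadline the workflow escalates
# rather than letting the request silently stall. States and deadlines are illustrative.

class ReferralConversation:
    def __init__(self, patient_id, specialist, sent_at, respond_within=timedelta(days=7)):
        self.patient_id = patient_id
        self.specialist = specialist
        self.state = "REQUEST_SENT"
        self.deadline = sent_at + respond_within

    def on_result_received(self, result):
        self.state = "RESULT_RECEIVED"
        return f"route result for {self.patient_id} to ordering provider: {result}"

    def check_escalation(self, now):
        if self.state == "REQUEST_SENT" and now > self.deadline:
            self.state = "ESCALATED"
            return f"no response from {self.specialist}; task queued for referral coordinator"
        return None

conv = ReferralConversation("A001", "Dr. Ortiz", sent_at=datetime(2014, 6, 1))
print(conv.check_escalation(now=datetime(2014, 6, 10)))   # escalates after the deadline
```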

That's my call to use workflow technology, especially Business Process Management, to help solve our healthcare information interoperability problems. Syntactic and semantic interoperability aren't enough. Cool-looking "marketectures" dissecting healthcare interoperability issues aren't enough. Even APIs (Application Programming Interfaces) aren't enough. Something has to combine all this stuff, in scalable and flexible ways (by which I mean, not "hardcoded"), into usable workflows.

Which brings me to usability, tomorrow’s guest blog post topic.

Tune in!