
What Do You Think Of Data Lakes?

Posted on October 4, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Since I am not a high-end technologist, I'm not always up on the latest trends in database management, so the following may not be news to everyone who reads this. For me, though, the notion of a "data lake" is a new one, and I think it's a valuable idea that could hold a lot of promise for managing unruly healthcare data.

The following is a definition of the term appearing on a site called KDnuggets which focuses on data mining, analytics, big data and data science:

A data lake is a storage repository that holds a vast amount of raw data in its native format, including structured, semi-structured and unstructured data. The data structure and requirements are not defined until the data is needed.

According to article author Tamara Dull, a data warehouse contains data that is structured and processed, is expensive to store, relies on a fixed configuration and is used by business professionals. A data lake, by contrast, contains everything from raw to structured data, is designed for low-cost storage (made possible largely because it relies on the open source software Hadoop, which can be installed on cheaper commodity hardware), can be configured and reconfigured as needed, and is typically used by data scientists. It's no secret where she comes down as to which model is more exciting.
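The "structure is not defined until the data is needed" idea is easy to see in miniature. Here is a small, hypothetical Python sketch of schema-on-read: raw records are stored exactly as they arrive, and a schema is imposed only when an analysis asks for it. The directory layout, field names, and records are all invented for illustration.

```python
import json
from pathlib import Path

# Hypothetical landing zone for raw, unprocessed records.
lake = Path("datalake/wearables/raw")
lake.mkdir(parents=True, exist_ok=True)

# Ingest: write each record exactly as received -- no schema enforced yet.
raw_records = [
    '{"patient_id": "p1", "steps": 4200, "device": "watch-a"}',
    '{"patient_id": "p2", "hr": 71}',  # different fields are fine in a lake
]
for i, rec in enumerate(raw_records):
    (lake / f"rec_{i}.json").write_text(rec)

# Analysis time: the structure is defined only when the data is needed.
def read_steps(directory):
    """Project just the fields this particular analysis cares about."""
    for f in sorted(directory.glob("*.json")):
        doc = json.loads(f.read_text())
        if "steps" in doc:
            yield doc["patient_id"], doc["steps"]

print(list(read_steps(lake)))  # [('p1', 4200)]
```

A data warehouse would instead reject or transform the second record at load time; here it simply sits in the lake until some future analysis defines a use for it.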

Perhaps the only downside she identifies with data lakes is that security may still be a concern, at least when compared to data warehouses. "Data warehouse technologies have been around for decades," Dull notes. "Thus, the ability to secure data in a data warehouse is much more mature than securing data in a data lake." But this issue is likely to be resolved in the near future, as the big data industry has been focused tightly on security of late; to her it's not a question of if security will mature but when.

It doesn’t take much to envision how the data lake model might benefit healthcare organizations. After all, it may make sense to collect data for which we don’t yet have a well-developed idea of its use. Wearables data comes to mind, as does video from telemedicine consults, but there are probably many other examples you could supply.

On the other hand, one could always counter that there’s not much value in storing data for which you don’t have an immediate use, and which isn’t structured for handy analysis by business analysts on the fly. So even if data lake technology is less costly than data warehousing, it may or may not be worth the investment.

For what it's worth, I'd come down on the side of the data-lake boosters. Given the growing volume of heterogeneous data being generated by healthcare organizations, it's worth asking whether deploying a healthcare data lake makes sense. With a data lake in place, healthcare leaders can at least catalog and store large volumes of un-normalized data, and that's probably a good thing. After all, it seems inevitable that we will have to wring value out of such data at some point.

The Value of Machine Learning in Value-based Care

Posted on August 4, 2016 | Written By

The following is a guest blog post by Mary Hardy, Vice President of Healthcare for Ayasdi.

Variation is a natural element in most healthcare delivery. After all, every patient is unique. But unwarranted clinical variation—the kind that results from a lack of systems and collaboration or the inappropriate use of care and services—is another issue altogether.

Healthcare industry thought leaders have called for the reduction of such unwarranted variation as the key to improving the quality and decreasing the cost of care. They have declared, quite rightly, that the quality of care an individual receives should not depend on geography. In response, hospitals throughout the United States are taking on the significant challenge of understanding and managing this variation.

Most hospitals recognize that the ability to distill the right insights from patient data is the catalyst for eliminating unwarranted clinical variation and is essential to implementing care models based on value. However, the complexity of patient data—a complexity that will only increase with the impending onslaught of data from biometric and personal fitness devices—can be overwhelming to even the most advanced organizations. There aren’t enough data scientists or analysts to make sense of the exponentially growing data sets within each organization.

Enter machine learning. Machine learning applications combine algorithms from computational biology and other disciplines to find patterns within billions of data points. The power of these algorithms enables organizations to uncover the evidence-based insights required for success in the value-based care environment.

Machine Learning and the Evolutionary Leap in Clinical Pathway Development
Since the 1990s, provider organizations have attempted to curb unwarranted variation by developing clinical pathways. A multi-disciplinary team of providers uses peer-reviewed literature and patient population data to develop and validate best-practice protocols and guidance for specific conditions, treatments, and outcomes.

However, the process is burdened by significant limitations. Pathways often require months or years to research, build, and validate. Additionally, today’s clinical pathways are typically one-size-fits-all. Health systems that have the resources to do so often employ their own experts, who review research, pull data, run tables and come to a consensus on the ideal clinical pathway, but are still constrained by the experts’ inability to make sense of billions of data points.

Additionally, once the clinical pathway has been established, hospitals have few resources for tracking the care team’s adherence to the agreed-upon protocol. This alone is enough to derail years of efforts to reduce unwarranted variation.

Machine learning is the evolutionary leap in clinical pathway development and adherence. Acceleration is certainly a positive. High-performance machines and algorithms can examine complex, continuously growing data elements far faster, and capture insights more comprehensively, than traditional or homegrown analytics tools. (Imagine reducing the development of a clinical pathway from months or years to weeks or days.)

But the true value of machine learning is enabling provider organizations to leverage patient population data from their own systems of record to develop clinical pathways that are customized to the organization’s processes, demographics, and clinicians.

Additionally, machine learning applications empower organizations to precisely track care team adherence, improving communication and organization effectiveness. By guiding clinicians to follow best practices through each step of care delivery, clinical pathways that are rooted in machine learning ensure that all patients receive the same level of high-quality care at the lowest possible cost.

Machine Learning Proves its Value
St. Louis-based Mercy, one of the most innovative health systems in the world, used a machine-learning application to recreate and improve upon a clinical pathway for total knee replacement surgery.

Drawing from Mercy’s integrated electronic medical record (EMR), the application grouped data from a highly complex series of events related to the procedure and segmented it. It was then possible to adapt other methods from biology and signals processing to the problem of determining the optimal way to perform the procedure—which drugs, tests, implants and other processes contribute to that optimal outcome. It also was possible to link predictive machine learning methods like regression or classification to perform real-time pathway editing.

The application revealed that Mercy’s patients naturally divided into clusters or groups with similar outcomes. The primary metric of interest to Mercy as an indicator of high quality was length of stay (LOS). The system highlighted clusters of patients with the shortest LOS and quickly discerned what distinguished this cluster from patients with the longest LOS.
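To make the clustering idea concrete, here is a toy Python sketch (emphatically not Mercy's or Ayasdi's actual method, which is far more sophisticated) that groups patients by length of stay with a tiny one-dimensional k-means and then inspects what distinguishes the short-stay cluster. All patient records, field names, and values are invented.

```python
# Illustrative only: cluster patients by length of stay (LOS, in days),
# then compare a candidate differentiator across the resulting clusters.
patients = [
    {"id": "a", "los": 1.9, "pregabalin": True},
    {"id": "b", "los": 2.1, "pregabalin": True},
    {"id": "c", "los": 2.0, "pregabalin": True},
    {"id": "d", "los": 4.8, "pregabalin": False},
    {"id": "e", "los": 5.2, "pregabalin": False},
]

def kmeans_1d(values, centers, iters=20):
    """A minimal 1-D k-means: assign to nearest center, recompute means."""
    for _ in range(iters):
        groups = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(c - v))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

low, high = kmeans_1d([p["los"] for p in patients], centers=[1.0, 6.0])
short_stay = [p for p in patients if abs(p["los"] - low) < abs(p["los"] - high)]

# What distinguishes the short-stay cluster from the rest?
share = sum(p["pregabalin"] for p in short_stay) / len(short_stay)
print(f"short-stay cluster: {[p['id'] for p in short_stay]}, pregabalin rate {share:.0%}")
```

With real data the differentiator is not known in advance; the point of the Mercy example is that the clusters surfaced it without anyone having to ask the question first.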

What this analysis revealed was an unforeseen and groundbreaking care pathway for high-quality total knee replacement. The common denominator between all patients with the shortest LOS and best outcomes was administration of pregabalin—a drug generally prescribed for shingles. A group of four physicians had seen something in the medical literature that led them to believe that administering the drug prior to surgery would inhibit postoperative pain, reduce opiate usage and produce faster ambulation. It did.

This innovation was happening in Mercy’s own backyard, and it was undeniably a best practice—the data revealed that each of the best outcomes included administration of this drug. Using traditional approaches, it is highly unlikely that Mercy would have asked the question, “What if we use a shingles drug to improve total knee replacement?” The superior outcomes of four physicians would have remained hidden in a sea of complex data.

This single procedure was worth over $1 million per year for Mercy in direct costs.

What Mercy’s experience demonstrates is that the most difficult, persistent and complex problems in healthcare can resolve themselves through data. The key lies in having the right tools to navigate that data’s complexity. The ability to determine at a glance what differentiates good outcomes from bad outcomes is incredibly powerful—and will transform care delivery.

Mary Hardy is the Vice President of Healthcare for Ayasdi, a developer of machine intelligent applications for health systems and payer organizations.

Applying Geospatial Analysis to Population Health

Posted on June 28, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This post is sponsored by Samsung Business. All thoughts and opinions are my own.

Megan Williams wrote a very interesting piece called “Geospatial Analysis: The Next Era of Population Health” in which she highlighted Kaiser’s efforts to use geospatial analysis as part of their population health efforts. Here’s her description of their project:

This means using data to inform policy adjustments and create intervention programs that lead to meaningful change. One of the best examples of this lies with healthcare giant Kaiser Permanente. In April, they launched a database that gave researchers the ability to examine patient DNA and bump it against behavioral and environmental health factors. The goal of the project is to pull information from half a million patients and use it to build one of the most “diverse repositories of environmental, genetic and health data in the world,” which could then be used to inform research around conditions including diabetes and cancer and their relationships to issues including localized violence, pollution, access to quality food and other factors.

This type of effort from Kaiser is quite incredible, and I believe it will truly be part of how we shift the healthcare cost curve. One challenge to this effort is that Kaiser has a very different business model than the rest of the healthcare system. They're in a unique position where their business benefits from these types of population health efforts. Plus, Kaiser is very geographically oriented.

While Kaiser's business model is currently very different, one could argue that the rest of healthcare is moving towards the Kaiser model. The shift to value based care and accountable care organizations is going to require the same geospatial analysis that Kaiser is building out today. Plus, hospital consolidation is providing real geographic dominance that wasn't previously available. Will these shifting reimbursement models motivate all healthcare systems to care about the 99% of time patients spend outside of our care? I think they will, and large healthcare organizations won't have any choice in the matter.

There are a number of publicly and privately available data stores that are going to help in the geospatial analysis of a population’s health, but I don’t believe that’s going to be enough. In order to discover the real golden insights into a population, we’re going to have to look at the crossroads of data stores (behavioral, environmental, genomic, etc) combined together with personal health data. Some of that personal health data will come from things like EHR software, but I believe that the most powerful geospatial personal health data is going to come from an individual’s cell phone.

This isn't a hard vision to see. Most of us now carry around a cell phone that knows a lot more about our health than we realize. Plus, it has a GPS, so all of those actions can be plotted geospatially. Combine this personally collected health data with these large data stores and we're likely to get a dramatically different understanding of an individual's health.
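As a rough illustration of what plotting phone-collected health data geospatially might look like, the hypothetical Python sketch below bins step-count readings into coarse latitude/longitude grid cells so activity levels can be compared across neighborhoods. All coordinates, values, and field names are made up.

```python
from collections import defaultdict

# Invented phone-collected readings: (location, daily step count).
readings = [
    {"lat": 36.171, "lon": -115.139, "steps": 2000},  # downtown
    {"lat": 36.172, "lon": -115.140, "steps": 2500},
    {"lat": 36.281, "lon": -115.312, "steps": 9000},  # suburb
]

def cell(lat, lon, precision=1):
    """Snap a point to a coarse grid cell (~11 km at one decimal place)."""
    return (round(lat, precision), round(lon, precision))

by_cell = defaultdict(list)
for r in readings:
    by_cell[cell(r["lat"], r["lon"])].append(r["steps"])

# Average activity level per grid cell -- a crude population health signal.
averages = {c: sum(v) / len(v) for c, v in by_cell.items()}
print(averages)
```

A real analysis would use proper geohashing or GIS tooling and join these cells against environmental and behavioral data stores, but the core move is the same: collapse individual GPS points into geographic units you can act on.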

While this is an exciting area of healthcare, I think we’d be wise to take a lesson from “big data” in healthcare. Far too many health systems spent millions of dollars building up these massive data warehouses of enterprise health data. Once they were built, they had no idea how to get value from them. Since then, we’ve seen a shift to “skinny data” as one vendor called it. Minimum viable data sets with specific action items tied to that data.

We should likely do the same with geospatial data and population health and focus on the minimum set of data that will provide actual results. We should start with the skinny data that delivers an improvement in health. Over time, those skinny data sets will combine into a population health platform that truly leverages big data in healthcare.

Where do you see geospatial data being used in healthcare? Where would you like to see it being used? What are the untapped opportunities that are waiting for us?

For more content like this, follow Samsung on Insights, Twitter, LinkedIn, YouTube and SlideShare.

Time To Leverage EHR Data Analytics

Posted on May 5, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

For many healthcare organizations, implementing an EHR has been one of the largest IT projects they’ve ever undertaken. And during that implementation, most have decided to focus on meeting Meaningful Use requirements, while keeping their projects on time and on budget.

But it’s not good to stay in emergency mode forever. So at least for providers that have finished the bulk of their initial implementation, it may be time to pay attention to issues that were left behind in the rush to complete the EHR rollout.

According to a recent report by PricewaterhouseCoopers’ Advanced Risk & Compliance Analytics practice, it’s time for healthcare organizations to focus on a new set of EHR data analytics approaches. PwC argues that there is significant opportunity to boost the value of EHR implementations by using advanced analytics for pre-live testing and post-live monitoring. Steps it suggests include the following:

  • Go beyond sample testing: While typical EHR implementation testing strategies look at the underlying system build and sampled records, that may not be enough, as build efforts may remain incomplete. Also, end-user workflow-specific testing may be occurring simultaneously. Consider using new data mining and visualization analytics tools to conduct more thorough tests and spot trends.
  • Conduct real-time surveillance: Use data analytics programs to review upstream and downstream EHR workflows to find gaps, inefficiencies and other issues. This allows providers to design analytic programs using existing technology architecture.
  • Find RCM inefficiencies: Rather than relying on static EHR revenue cycle reports, which make it hard to identify root causes of trends and concerns, conduct interactive assessment of RCM issues. By creating dashboards with drill-down capabilities, providers can increase collections by scoring patients invoices, prioritizing patient invoices with the highest scores and calculating the bottom-line impact of missing payments.
  • Build a continuously monitored compliance program: Use a risk-based approach to data sampling and drill-down testing. Analytics tools can allow providers to review multiple data sources under one dashboard to identify high-risk patterns in critical areas such as billing.
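As a rough sketch of the invoice-scoring idea in the RCM bullet above, the hypothetical Python below scores open patient invoices, works the highest scores first, and totals the bottom-line exposure. The fields, weights, and threshold are invented for illustration and are not from the PwC report.

```python
# Invented open invoices -- in practice these would come from the EHR's
# revenue cycle module.
invoices = [
    {"id": "inv-1", "balance": 1200.0, "days_overdue": 95, "denied_before": True},
    {"id": "inv-2", "balance": 300.0,  "days_overdue": 10, "denied_before": False},
    {"id": "inv-3", "balance": 5400.0, "days_overdue": 40, "denied_before": False},
]

def score(inv):
    """Higher score = higher collection priority (hypothetical weights)."""
    s = inv["balance"] / 100.0          # bigger balances matter more
    s += inv["days_overdue"] * 0.5      # aging invoices escalate
    if inv["denied_before"]:
        s += 20.0                       # prior denials flag extra risk
    return s

# Prioritize the worklist and estimate the impact of missing payments.
worklist = sorted(invoices, key=score, reverse=True)
at_risk = sum(i["balance"] for i in worklist if score(i) > 50)
print([i["id"] for i in worklist], f"${at_risk:,.0f} at risk")
```

A dashboard with drill-down would sit on top of exactly this kind of scoring: the ranking drives the daily worklist, and the at-risk total quantifies what inaction costs.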

It's worth noting, at this point, that while these goals seem worthy, only a small percentage of providers have the resources to create and manage such programs. Sure, vendors will probably tell you that they can pop a solution in place that will get all the work done, but that's seldom the case in reality. Not only that, a surprising number of providers are still unhappy with their existing EHR and are now replacing those systems despite the cost. So we're hardly at the "stop and take a breath" stage in most cases.

That being said, it's certainly time for providers to get out of whatever defensive crouch they've been in and get proactive. For example, it certainly would be great to leverage EHRs as tools for revenue cycle enhancement, rather than the absolute revenue drain they've been in the past. PwC's suggestions certainly offer a useful look at where to go from here. That is, if providers' efforts don't get hijacked by MACRA.

Healthcare Data Quality and The Complexity of Healthcare Analytics

Posted on March 2, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

The other day I had a really great chat with Khaled El Emam, PhD, CEO and Founder of Privacy Analytics. We had a wide ranging discussion about healthcare data analytics and healthcare data privacy. These are two of the most important topics in the healthcare industry right now and no doubt will be extremely important topics at healthcare conferences happening all through the year.

In our discussion, Khaled talked about what I think are the three most important challenges with healthcare data:

  1. Data Integrity
  2. Data Security
  3. Data Quality

I thought this was a most fantastic way to frame the discussion around data and I think healthcare is lacking in all 3 areas. If we don’t get our heads around all 3 pillars of good data, we’ll never realize the benefits associated with healthcare data.

Khaled also commented to me that 80% of healthcare analytics today is simple analytics. That means that only 20% of our current analysis requires complex analytics. I’m sure he was just giving a ballpark number to illustrate the point that we’re still extremely early on in the application of analytics to healthcare.

One side of me says that maybe we're lacking a bit of ambition when it comes to leveraging the very best analytics to benefit healthcare. However, I also realize it means that there's still a lot of low hanging fruit out there that can benefit healthcare with even just simple analytics. Why should we go after the complex analytics when there's still so much value to healthcare in simple analytics?

All of this is more of a framework for discussion around analytics. I'm sure I'll be evaluating every healthcare analytics effort I see based on the challenges of data integrity, security and quality.

Are You a Healthcare Data Hoarder?

Posted on October 16, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

I’m thinking I need to start a new healthcare reality TV show called “Healthcare Data Hoarders.” We’ll go into healthcare institutions (after signing our HIPAA lives away), and take a look through all the data a healthcare organization is storing away.

My guess is that we wouldn't have to look very far to find some really amazing healthcare data hoarders. The healthcare data hoarding I see happening comes in two forms: legacy systems and data warehouses.

Legacy Systems – You know the systems I'm talking about. They're the ones stored under a desk in the back of radiology. The software is no longer being updated. In fact, the software vendor is often not even around anymore. However, for some reason you think you're going to need the data off that 30-year-old system, even though only one person in your entire organization knows how to access the legacy software. Yes, I realize there are laws that require healthcare organizations to "hoard" data to some extent. However, many of these legacy systems are well past those legal data retention requirements.

Data Warehouses – These come in all shapes and sizes, and for this hoarding article let me suggest that an EHR is kind of a data warehouse (yes, I'm using a really broad definition). Much like a physical hoarder, I see a lot of organizations in healthcare that are gathering virtual piles of data for which they have no use and will likely never find a way to use. Historically, a data warehouse manager's job is to try to collect, normalize, and aggregate all of the healthcare organization's data into one repository. Yes, the data warehouse manager is really the Chief Healthcare Data Hoarder. Gather and protect any and all data you can find.

While I love the idea that we’re collecting data that can hopefully make healthcare better, just collecting data doesn’t do anything to improve healthcare. In fact, it can often retard efforts to leverage healthcare data to improve health. The problem is that the healthcare data that can be leveraged for good is buried under all of this useless data. It takes so much effort to sift through the junk data that people just stop before they even get started.

Are you collecting data and not doing anything with it? I challenge you to remedy that situation.

Is your healthcare organization a healthcare data hoarder?

Population Health Management and Business Process Management

Posted on June 13, 2014 | Written By

Chuck Webster, MD, MSIE, MSIS has degrees in Accountancy, Industrial Engineering, Intelligent Systems, and Medicine (from the University of Chicago). He designed the first undergraduate program in medical informatics, was a software architect in a hospital MIS department, and also VP and CMIO for an EHR vendor for over a decade. Dr. Webster helped three healthcare organizations win the HIMSS Davies Award and is a judge for the annual Workflow Management Coalition Awards for Excellence in BPM and Workflow and Awards for Case Management. Chuck is a ceaseless evangelist for process-aware technologies in healthcare, including workflow management systems, Business Process Management, and dynamic and adaptive case management. Dr. Webster tweets from @wareFLO and maintains numerous websites, including EHR Workflow Management Systems (http://chuckwebster.com), Healthcare Business Process Management (http://HCBPM.com) and the People and Organizations improving Healthcare with Health Information Technology (http://EHRworkflow.com). Please join with Chuck to spread the message: Viva la workflow!

This is my fifth and final of five guest blog posts covering Health IT and EHR Workflow.

Way back in 2009 I penned a research paper with a long and complicated title that could also have been, simply, Population Health Management and Business Process Management. In 2010 I presented it at MedInfo10 in Cape Town, South Africa. Check out my travelogue!

Since then, some of what I wrote has become reality, and much of the rest is on the way. Before I dive into the weeds, let me set the stage. The Affordable Care Act added tens of millions of new patients to an already creaky and dysfunctional healthcare and health IT system. Accountable Care Organizations were conceived as virtual enterprises to be paid to manage the clinical outcome and costs of care of specific populations of individuals. Population Health Management has become the dominant conceptual framework for proceeding.

I looked at a bunch of definitions of population health management and created the following as a synthesis: “Proactive management of clinical and financial risks of a defined patient group to improve clinical outcomes and reduce cost via targeted, coordinated engagement of providers and patients across all care settings.”

You can see obvious places in this definition to apply trendy SMAC tech — social, mobile, analytics, and cloud: social in patient settings; mobile in provider and patient settings; analytics for cost and outcomes; cloud across settings. But here I want to focus on the "targeted, coordinated" part. Increasingly, it is self-developed and vendor-supplied care coordination platforms that target and coordinate, filling a gap between EHRs and day-to-day provider and patient workflows.

The best technology on which, from which, to create care coordination platforms is workflow technology, AKA business process management and adaptive/dynamic case management software. In fact, when I drill down on most sophisticated, scalable population health management and care coordination solutions, I usually find a combination of a couple things. Either the health IT organization or vendor is, in essence, reinventing the workflow tech wheel, or they embed or build on third-party BPM technology.

Let me direct you to my section Patient Class Event Hierarchy Intermediates Patient Event Stream and Automated Workflow in that MedInfo10 paper. First of all, you have to target the right patients for intervention. Increasingly, ideas from Complex Event Processing are used to quickly and appropriately react to patient events. A Patient Class Event Hierarchy is a decision tree mediating between low-level events (patient state changes) and higher-level clinical concepts such as "on-protocol," "compliant," "measured," and "controlled."

Examples include patients who aren’t on protocol but should be, aren’t being measured but should be, or whose clinical values are not controlled. Execution of appropriate automatic policy-based workflows (in effect, intervention plans) moves patients from off-protocol to on-protocol, non-compliance to compliance, unmeasured to measured, and from uncontrolled to controlled state categories.
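A minimal Python sketch of that decision-tree idea: classify a patient's low-level state into one of the higher-level categories the post names, then look up the policy-based workflow that moves the patient toward "controlled." The category names follow the post; the fields, threshold, and workflow names are invented for illustration.

```python
# Map low-level patient state onto the higher-level clinical categories.
def classify(patient):
    if not patient["on_protocol"]:
        return "off-protocol"
    if patient.get("last_a1c") is None:   # no recent measurement on file
        return "unmeasured"
    if patient["last_a1c"] > 7.0:         # hypothetical control threshold
        return "uncontrolled"
    return "controlled"

# Each category triggers an automatic intervention workflow (names invented).
INTERVENTIONS = {
    "off-protocol": "enroll_in_protocol",
    "unmeasured": "schedule_lab_draw",
    "uncontrolled": "escalate_to_care_manager",
    "controlled": None,  # no workflow needed
}

patient = {"on_protocol": True, "last_a1c": 8.2}
state = classify(patient)
print(state, "->", INTERVENTIONS[state])  # uncontrolled -> escalate_to_care_manager
```

In a real system the classification would be driven by streams of state-change events rather than a single lookup, but the mediation step is the same: low-level events in, category out, workflow triggered.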

Population health management and care coordination products and services may use different categories, terminology, etc. But they all tend to focus on sensing and reacting to untoward changes in patient state. But simply detecting these changes is insufficient. These systems need to cause actions. And these actions need to be monitored, managed, and improved, all of which are classic sterling qualities of business process management software systems and suites.

I'm reminded of several tweets about Accountable Care Organization IT systems I display during presentations. One summarizes an article about ACOs. The other paraphrases an ACO expert speaking at a conference. The former says ACOs must tie together many disparate IT systems. The latter says ACOs boil down to lists: actionable lists of items delivered to the right person at the right time. If you put these requirements together with system-wide care pathways delivered safely and conveniently to the point of care, you get my three previous blog posts on interoperability, usability, and safety.

I’ll close here with my seven advantages of BPM-based care coordination technology. It…

  • More granularly distinguishes workflow steps
  • Captures more meaningful time-stamped task data
  • More actively influences point-of-care workflow
  • Helps model and understand workflow
  • Better coordinates patient care task handoffs
  • Monitors patient care task execution in real-time
  • Systematically improves workflow effectiveness & efficiency

Distinguishing among workflow steps is important to collecting data about which steps provide value to providers and patients, as well as the time-stamps necessary to estimate true costs. Further, since these steps are executed, or at least monitored, at the point-of-care, there's more opportunity to facilitate and influence at the point-of-care. Modeling workflow contributes to understanding workflow, in my view an intrinsically valuable state of affairs. These workflow models can represent and compensate for interruptions to necessary care task handoffs. During workflow execution, "enactment" in BPM parlance, workflow state is made transparently visible. Finally, workflow data "exhaust" (particularly time-stamped, evidence-based process maps) can be used to systematically find bottlenecks and plug care gaps.
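The "workflow exhaust" point can be sketched in a few lines: given time-stamped task data of the kind a BPM engine emits, compute how long each step takes and rank steps to surface bottleneck candidates. The event-log format, step names, and timestamps below are hypothetical.

```python
from collections import defaultdict
from datetime import datetime

# Invented event log: (case, step, start, end) -- the kind of time-stamped
# task data a workflow engine records as a side effect of enactment.
log = [
    ("case1", "triage",   "2014-06-13T08:00", "2014-06-13T08:10"),
    ("case1", "lab_draw", "2014-06-13T08:10", "2014-06-13T09:40"),
    ("case2", "triage",   "2014-06-13T08:05", "2014-06-13T08:20"),
    ("case2", "lab_draw", "2014-06-13T08:20", "2014-06-13T09:35"),
]

fmt = "%Y-%m-%dT%H:%M"
durations = defaultdict(list)
for case, step, start, end in log:
    minutes = (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).seconds / 60
    durations[step].append(minutes)

# Rank steps by mean duration -- the top entry is the bottleneck candidate.
ranked = sorted(durations, key=lambda s: sum(durations[s]) / len(durations[s]), reverse=True)
print(ranked)  # ['lab_draw', 'triage']
```

This is the simplest possible process-mining move; real BPM suites add queue time, handoff delays, and path analysis on top of the same raw exhaust.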

In light of the fit between complex event processing detecting changes in patient state, and BPM’s automated, managed workflow at the point-of-care, I see no alternative to what I predicted in 2010. Regardless of whether it’s rebranded as care or healthcare process management, business process management is the most mature, practical, and scalable way to create the care coordination and population health management IT systems required by Accountable Care Organizations and the Affordable Care Act. A bit dramatically, I’d even say business process management’s royal road to healthcare runs through care coordination.

This was my fifth and final blog post in this series on healthcare and workflow technology, solicited by John Lynn for this week while he's on vacation.

If you missed one of my previous posts, I hope you'll still check it out. Finally, thank you, John, for allowing me to temporarily share your bully pulpit.


Patient Safety And Process-Aware Information Systems: Interruptions, Interruptions, Interruptions!

Posted on June 12, 2014 | Written By

Chuck Webster, MD, MSIE, MSIS is a ceaseless evangelist for process-aware technologies in healthcare. Dr. Webster tweets from @wareFLO.

This is my fourth of five guest blog posts covering Health IT and EHR Workflow.

When you took a driver's education class, do you remember the importance of mental "awareness" to traffic safety? Continually monitor your environment, your car, and yourself. As in traffic flow, healthcare is full of work flow, and awareness of workflow is the key to patient safety.

First of all, the very act of creating a model of work to be done forces designers and users to think very carefully through workflow "happy paths" and what to do when users fall off them. A happy path is the sequence of events that's intended to happen and, if all goes well, actually does happen most of the time. Departures from the happy path are called "exceptions" in computer programming parlance. Exceptions are "thrown," "caught," and "handled." At the level of computer programming, an exception may occur when data is requested from a network resource, but the network is down. At the level of workflow, an exception might be a patient no-show, an abnormal lab value, or suddenly being called away by an emergency or higher-priority circumstance.
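The throw/catch/handle pattern above can be lifted from the programming level to the workflow level. Here is a minimal sketch in Python; the exception class, step names, and visit structure are all illustrative assumptions, not taken from any real EHR:

```python
# Sketch: a workflow-level exception (patient no-show) alongside the happy
# path of a clinic visit. All names here are hypothetical.

class PatientNoShow(Exception):
    """Workflow exception: the scheduled patient did not arrive."""

def run_visit(patient_arrived):
    steps = []
    try:
        steps.append("check_in")
        if not patient_arrived:
            # Departure from the happy path: "throw" a workflow exception.
            raise PatientNoShow()
        steps.append("rooming")
        steps.append("exam")
        steps.append("check_out")
    except PatientNoShow:
        # "Catch" and "handle": reschedule instead of abandoning the workflow.
        steps.append("reschedule")
    return steps

print(run_visit(True))   # → ['check_in', 'rooming', 'exam', 'check_out']
print(run_visit(False))  # → ['check_in', 'reschedule']
```

The point is that the exception path is designed deliberately, not left as an afterthought, which is exactly what modeling the workflow forces.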

Developing a model of work, variously called a workflow definition, process definition, or work plan, forces workflow designers and workflow users to communicate at a level of abstraction that is much more natural and productive than either computer code or screen mockups.

Once a workflow model is created, it can be automatically analyzed for completeness and consistency. Similar to how a compiler can detect problems in code before it’s released, problems in workflow can be prevented. This sort of formal analysis is in its infancy, and is perhaps most advanced in healthcare in the design of medical devices.
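To make the compiler analogy concrete, here is a small sketch, entirely an assumed illustration, of statically checking a workflow definition for consistency (no transitions to undefined steps) and completeness (every step reachable, end state attainable) before it ever executes:

```python
# Assumed sketch: validate a step -> next-steps graph before execution,
# the way a compiler flags problems before code is released.

def check_workflow(steps, start, end):
    """Return a list of problems found in a workflow definition."""
    problems = []
    # Consistency: every transition must point at a defined step.
    for step, nexts in steps.items():
        for nxt in nexts:
            if nxt not in steps:
                problems.append(f"{step} -> {nxt}: undefined step")
    # Completeness: every step must be reachable from the start.
    seen, frontier = {start}, [start]
    while frontier:
        for nxt in steps.get(frontier.pop(), []):
            if nxt in steps and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    for step in steps:
        if step not in seen:
            problems.append(f"{step}: unreachable from {start}")
    if end not in seen:
        problems.append(f"{end}: end state cannot be reached")
    return problems

visit = {
    "check_in": ["rooming"],
    "rooming": ["exam"],
    "exam": ["check_out", "lab_order"],  # lab_order is a dangling reference
    "check_out": [],
    "vitals": ["exam"],                  # defined but never reached
}
print(check_workflow(visit, "check_in", "check_out"))
```

Both defects are caught on paper, before any patient encounters the broken workflow.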

When workflow engines execute models of work, work is performed. If that work would otherwise have to be accomplished by humans, user workload is reduced. Recent research estimates a 7 percent increase in patient mortality for every one-patient increase in nurse workload. Decreasing workload should reduce patient mortality by a similar amount.

Another area of workflow technology that can increase patient safety is process mining. Process mining is similar, by analogy, to data mining, but the patterns it extracts from time-stamped data are workflow models. These "process maps" are evidence-based representations of what really happens during use of an EHR or health IT system. Mined process maps can be quite different from, and more eye-opening than, process maps generated by asking participants questions about their workflows. Process maps can show what happens that shouldn't, what doesn't happen that should, and time delays due to workflow bottlenecks. They are ideal tools for understanding what happened when analyzing a possibly system-precipitated medical error.
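The core of process mining can be sketched in a few lines. The following is an assumed, simplified illustration (real process-mining tools are far richer): it recovers a "directly-follows" process map, transition counts between activities, from a timestamped event log:

```python
# Sketch of directly-follows process discovery from a timestamped event log.
# Case IDs, timestamps, and activity names are invented for the example.

from collections import Counter

def mine_process_map(log):
    """log: list of (case_id, timestamp, activity) tuples.
    Returns a Counter of (activity, next_activity) transition counts."""
    cases = {}
    for case_id, ts, activity in sorted(log, key=lambda e: (e[0], e[1])):
        cases.setdefault(case_id, []).append(activity)
    transitions = Counter()
    for trace in cases.values():
        for a, b in zip(trace, trace[1:]):
            transitions[(a, b)] += 1
    return transitions

log = [
    ("visit1", 1, "check_in"), ("visit1", 2, "exam"), ("visit1", 3, "check_out"),
    ("visit2", 1, "check_in"), ("visit2", 2, "exam"),
    ("visit2", 3, "exam"),  # a repeated step: the kind of surprise mining reveals
    ("visit2", 4, "check_out"),
]
print(mine_process_map(log))
```

The unexpected exam-to-exam transition is exactly the sort of evidence-based finding that interviews about "how we work" tend to miss.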

Yet another area where workflow tech is particularly relevant to patient safety is the fascinating relationship between clinical pathways and guidelines on the one hand, and the workflow and process definitions executed by workflow engines on the other. Clinical decision support, bringing the best, evidence-based medical knowledge to the point-of-care, must be seamless with clinical workflow. Otherwise, alert fatigue greatly reduces its potential.

There’s considerable research into how to leverage and combine representations of clinical knowledge with clinical workflow. However, you really need a workflow system to take advantage of this intricate relationship. Hardcoded, workflow-oblivious systems? There’s no way to tweak alerts to workflow context: the who, what, why, when, where, and how of what the clinician is doing. Clinical decision support will not achieve widespread success and acceptance until it can be intelligently customized and managed during real-time clinical workflow execution. This, again, requires workflow tech at the point-of-care.

I’ve saved workflow tech’s most important contribution to patient safety until last: Interruptions.

An interruption: is there anything more dreaded than a higher-priority task breaking your concentration just when you are beginning to experience optimal mental flow? This is ironic, since so much of work-a-day ambulatory medicine is essentially interrupt-driven (to borrow from computer terminology). Unexpected higher-priority tasks and emergencies *should* interrupt lower-priority scheduled tasks. Still, at the end of the day, ideally, you’ve accomplished all your tasks.

In one research study, over 50% of all healthcare errors were due to slips and lapses, such as not executing an intended action. In other words, good clinical intentions were derailed by interruptions.

Workflow management systems provide environmental cues to remind clinical staff to resume interrupted tasks. They represent “stacks” of tasks so the entire care team works together to make sure that interrupted tasks are eventually and appropriately resumed. Workflow management technology can bring to clinical care many of the innovations we admire in the aviation domain, including well-defined steps, checklists, and workflow tools.
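The "stack" of interrupted tasks mentioned above works much like a call stack. Here is a minimal sketch, with invented task names, of how a workflow system can suspend the current task when a higher-priority one arrives and cue its resumption afterward, so nothing is silently dropped:

```python
# Assumed sketch: a last-in, first-out stack of interrupted tasks, so an
# emergency preempts current work but the interrupted task is resumed, not lost.

class TaskStack:
    def __init__(self):
        self._stack = []     # suspended (interrupted) tasks
        self.current = None

    def start(self, task):
        if self.current is not None:
            self._stack.append(self.current)   # suspend, don't forget
        self.current = task

    def finish(self):
        done = self.current
        # Environmental cue: the most recently interrupted task resumes.
        self.current = self._stack.pop() if self._stack else None
        return done

team = TaskStack()
team.start("medication reconciliation")
team.start("emergency in room 3")      # interruption preempts current task
print(team.finish())                   # → emergency in room 3
print(team.current)                    # → medication reconciliation (resumed)
```

Because the stack is shared system state rather than one nurse's memory, the whole care team can see which tasks are suspended and ensure they are appropriately resumed.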

Stay tuned for my fifth, and final, guest blog post, in which I tackle Population Health Management with Business Process Management.


Usable EHR Workflow Is Natural, Consistent, Relevant, Supportive and Flexible

Posted on June 11, 2014 | Written By

Chuck Webster, MD, MSIE, MSIS

This is my third of five guest blog posts covering Health IT and EHR Workflow.

Workflow technology has a reputation, fortunately out of date, for trying to get rid of humans altogether. Early on it was used for straight-through processing, in which human stockbrokers were bypassed so stock trades happened in seconds instead of days. Business Process Management (BPM) can still do this. It can automate the logic and workflow that’d normally require a human to download something, check a value, and, based on that value, do something else useful, such as putting an item in a to-do list. By automating low-level routine workflows, humans are freed to do more useful things that even workflow automation can’t automate.

But much of healthcare workflow requires human intervention. It is here that modern workflow technology really shines, by becoming an intelligent assistant proactively cooperating with human users to make their jobs easier. A decade ago, at MedInfo04 in San Francisco, I listed the five workflow usability principles that beg for workflow tech at the point-of-care.

Consider these major dimensions of workflow usability: naturalness, consistency, relevance, supportiveness, and flexibility. Workflow management concepts provide a useful bridge from usability concepts applied to single users to usability applied to users in teams. Each concept, realized correctly, contributes to shorter cycle time (encounter length) and increased throughput (patient volume).

Naturalness is the degree to which an application’s behavior matches task structure. In the case of workflow management, multiple task structures stretch across multiple EHR users in multiple roles. A patient visit to a medical practice office involves multiple interactions among patients, nurses, technicians, and physicians. Task analysis must therefore span all of these users and roles. Creation of a patient encounter process definition is an example of this kind of task analysis, and results in a machine executable (by the BPM workflow engine) representation of task structure.

Consistency is the degree to which an application reinforces and relies on user expectations. Process definitions enforce (and therefore reinforce) consistency of EHR user interactions with each other with respect to task goals and context. Over time, team members rely on this consistency to achieve highly automated and interleaved behavior. Consistent repetition leads to increased speed and accuracy.

Relevance is the degree to which extraneous input and output, which may confuse a user, is eliminated. Too much information can be as bad as not enough. Here, process definitions rely on EHR user roles (related sets of activities, responsibilities, and skills) to select appropriate screens, screen contents, and interaction behavior.
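The relevance principle, selecting screen content by user role, can be sketched very simply. The roles and field names below are invented for illustration; a real process definition would drive this from the executing workflow's context:

```python
# Assumed sketch: role-based relevance. Each role sees only the fields
# related to its own activities; everything extraneous is filtered out.

SCREENS_BY_ROLE = {
    "nurse": ["vitals", "allergies", "chief_complaint"],
    "physician": ["assessment", "orders", "prescriptions"],
    "technician": ["specimen_label", "collection_checklist"],
}

def screen_for(role):
    # Unknown roles see nothing rather than everything.
    return SCREENS_BY_ROLE.get(role, [])

print(screen_for("nurse"))  # → ['vitals', 'allergies', 'chief_complaint']
```

The design choice worth noting is the default: when in doubt, show less, since too much information can be as bad as not enough.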

Supportiveness is the degree to which enough information is provided to a user to accomplish tasks. An application can support users by contributing to the shared mental model of system state that allows users to coordinate their activities with respect to each other. For example, since an EMR workflow system represents and updates task status and responsibility in real time, this data can drive a display that gives all EHR users the big picture of who is waiting for what, for how long, and who is responsible.
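That shared big-picture display can be sketched as a simple status board computed from live task records. Field names and data are hypothetical; the point is that the display is derived from workflow state the engine already tracks:

```python
# Assumed sketch: render "who is waiting for what, for how long, and who is
# responsible" from task records a workflow engine updates in real time.

def status_board(tasks, now):
    """tasks: list of dicts with patient, task, owner, started (minutes).
    Returns display rows, longest-waiting first."""
    rows = []
    for t in sorted(tasks, key=lambda t: t["started"]):
        rows.append(f'{t["patient"]}: {t["task"]} '
                    f'({now - t["started"]} min, owner: {t["owner"]})')
    return rows

tasks = [
    {"patient": "Room 2", "task": "awaiting labs", "owner": "lab", "started": 10},
    {"patient": "Room 1", "task": "awaiting exam", "owner": "Dr. A", "started": 25},
]
for row in status_board(tasks, now=40):
    print(row)
```

No one has to enter this data; it falls out of the task status the workflow system maintains anyway, which is what makes the shared mental model cheap to sustain.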

Flexibility is the degree to which an application can accommodate user requirements, competencies, and preferences. This obviously relates back to each of the previous usability principles. Unnatural, inconsistent, irrelevant, and unsupportive behaviors (from the perspective of a specific user, task, and context) need to be flexibly changed to become natural, consistent, relevant, and supportive. Plus, different EHR users may require different BPM process definitions, or shared process definitions that can be parameterized to behave differently in different user task-contexts.

The ideal EHR/EMR should make the simple easy and fast, and the complex possible and practical. Then the majority/minority rule applies. A majority of the time, processing is simple, easy, and fast (generating the greatest output for the least input, thereby greatly increasing productivity). In the remaining minority of the time, the productivity increase may be less, but at least there are no showstoppers.

So, to summarize my five principles of workflow usability…

Workflow tech can more naturally match the task structure of a physician’s office through execution of workflow definitions. It can more consistently reinforce user expectations. Over time this leads to highly automated and interleaved team behavior. On a screen-by-screen basis, users encounter more relevant data and order entry options. Workflow tech can track pending tasks–which patients are waiting where, how long, for what, and who is responsible–and this data can be used to support a continually updated shared mental model among users. Finally, to the degree to which an EHR or health IT system is not natural, consistent, relevant, and supportive, the underlying flexibility of the workflow engine and process definitions can be used to mold workflow system behavior until it becomes natural, consistent, relevant, and supportive.

Tomorrow I’ll discuss workflow technology and patient safety.


Interoperable Health IT and Business Process Management: The Spider In The Web

Posted on June 10, 2014 | Written By

Chuck Webster, MD, MSIE, MSIS

This is my second of five guest blog posts covering Health IT and EHR Workflow.

If you pay any attention at all to interoperability discussion in healthcare and health IT, I’m sure you’ve heard of syntactic vs. semantic interoperability. Syntax and semantics are ideas from linguistics. Syntax is the structure of a message. Semantics is its meaning. Think HL7’s pipes and hats (the characters “|” and “^” used as separators) vs. codes referring to drugs and lab results (the stuff between the pipes and hats). What you hardly ever hear about is pragmatic interoperability, sometimes called workflow interoperability. We need not just syntactic and semantic interop, but pragmatic workflow interop too. In fact, interoperability based on workflow technology can strategically compensate for deficiencies in syntactic and semantic interoperability. By workflow technology, I mean Business Process Management (BPM).

Why do I highlight BPM’s relevance to health information interoperability? Take a look at this quote from Business Process Management: A Comprehensive Survey:

“WFM/BPM systems are often the “spider in the web” connecting different technologies. For example, the BPM system invokes applications to execute particular tasks, stores process-related information in a database, and integrates different legacy and web-based systems…. Business processes need to be executed in a partly uncontrollable environment where people and organizations may deviate and software components and communication infrastructures may malfunction. Therefore, the BPM system needs to be able to deal with failures and missing data.”

“Partly uncontrollable environment where people and organizations may deviate and software components and communication infrastructures may malfunction”? Sound familiar? That’s right. It should sound a lot like health IT.

What’s the solution? A “spider in the web” connecting different technologies: invoking applications to execute particular tasks, storing process-related information in a database, integrating different legacy and web-based systems, and dealing with failures and missing data. Yes, healthcare needs a spider in the complicated web of information systems that is today’s health information management infrastructure. Business process management is that spider in a technological web.

Let me show you now how BPM makes pragmatic interoperability possible.

I’ll start with another quote:

“Pragmatic interoperability (PI) is the compatibility between the intended versus the actual effect of message exchange.”

That’s a surprisingly simple definition for what you may have feared would be a tediously arcane topic. Pragmatic interoperability is simply whether the message you send achieves the goal you intended. That’s why it’s “pragmatic” interoperability. Linguistic pragmatics is the study of how we use language to achieve goals.

“Pragmatic interoperability is concerned with ensuring that the exchanged messages cause their intended effect. Often, the intended effect is achieved by sending and receiving multiple messages in specific order, defined in an interaction protocol.”

So, how does workflow technology tie into pragmatic interoperability? The key phrases linking workflow and pragmatics are “intended effect” and “specific order”.

A sequence of actions and messages — send a request to a specialist, track request status, ask about request status, receive result and do the right thing with it — that’s the “specific order” of conversation required to ensure the “intended effect” (the result). Interactions among EHR workflow systems, explicitly defined internal and cross-EHR workflows, hierarchies of automated and human handlers, and rules and schedules for escalation and expiration are necessary to achieve seamless coordination among EHR workflow systems. In other words, we need workflow management system technology to enable self-repairing conversations among EHR and other health IT systems. This is pragmatic interoperability. By the way, some early workflow systems were explicitly based on speech act theory, an area of pragmatics.
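The referral conversation above, with its escalation rule, can be sketched as a tiny interaction protocol. Everything here is a hypothetical illustration (message names, the escalation policy, the query limit), not any real interop standard:

```python
# Assumed sketch of a "self-repairing conversation": send a referral request,
# query its status, and escalate if the intended effect doesn't arrive in time.

def referral_conversation(replies, max_queries=2):
    """replies: responses to successive status queries ('pending' or 'result').
    Returns the message trace, ending in the intended effect or an escalation."""
    trace = ["send_request"]
    for reply in replies[:max_queries]:
        trace.append("query_status")
        if reply == "result":
            trace.append("receive_result")   # intended effect achieved
            return trace
    trace.append("escalate")                 # the protocol repairs itself
    return trace

# Conversation that achieves its intended effect:
print(referral_conversation(["pending", "result"]))
# Conversation that stalls and escalates instead of silently failing:
print(referral_conversation(["pending", "pending"]))
```

The pragmatic-interoperability point is the second trace: when messages alone fail to achieve the intended effect, the defined interaction protocol, not a human noticing weeks later, triggers the repair.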

That’s my call to use workflow technology, especially Business Process Management, to help solve our healthcare information interoperability problems. Syntactic and semantic interoperability aren’t enough. Cool-looking “marketectures” dissecting healthcare interoperability issues aren’t enough. Even APIs (Application Programming Interfaces) aren’t enough. Something has to combine all this stuff, in a scalable and flexible way (by which I mean, not “hardcoded”), into usable workflows.

Which brings me to usability, tomorrow’s guest blog post topic.

Tune in!