Eyes Wide Shut – Patient Engagement Pitfalls Prior to Meaningful Use Reporting Period

July 1, 2014 – the start of the Meaningful Use Stage 1 Year 2 reporting period for the hospital facilities within this provider integrated delivery network (IDN). The day the 50% online access measure gets real. The day the inpatient summary CCDA MUST be made available online within 36 hours of discharge. The day we must overcome a steady 65% patient portal decline rate.
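
For readers who want to see what those two thresholds actually mean in practice, here's a minimal sketch, in Python, of the timeliness and online-access checks the measure implies. The field names, data structures, and example values are mine, not any vendor's or CMS's.

```python
# Minimal sketch of the two patient-engagement checks described above.
# Field names and example data are illustrative; the 36-hour and 50%
# figures come from the measures discussed in this post.
from datetime import datetime, timedelta

THIRTY_SIX_HOURS = timedelta(hours=36)

def ccda_available_on_time(discharged_at, ccda_posted_at):
    """True if the inpatient summary CCDA reached the portal within 36 hours."""
    return ccda_posted_at - discharged_at <= THIRTY_SIX_HOURS

def online_access_rate(discharges):
    """Percent of discharged patients whose records were made available online.
    `discharges` is a list of dicts with a boolean 'online_access_provided' flag."""
    if not discharges:
        return 0.0
    provided = sum(1 for d in discharges if d["online_access_provided"])
    return 100.0 * provided / len(discharges)

# Example: a CCDA posted 30 hours after discharge passes the timeliness check,
# and two discharges with one record available online yields exactly 50%.
discharge = datetime(2014, 7, 1, 8, 0)
posted = discharge + timedelta(hours=30)
print(ccda_available_on_time(discharge, posted))                     # True
print(online_access_rate([{"online_access_provided": True},
                          {"online_access_provided": False}]))       # 50.0
```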

A quick recap for those who haven’t followed this series (and refresher for those who have): this IDN has multiple hospital facilities, primary care, and specialty practices, on disparate EMRs, all connecting to an HIE and one enterprise patient portal. There are 8 primary EMRs and more than 20 distinct patient identification (MRN) pools. And many entities within this IDN are attempting to attest to Meaningful Use Stage 2 this year.

For the purposes of this post, I’m ignoring CMS and the ONC’s new proposed rule that would, if adopted, allow entities to attest to Meaningful Use Stage 1 OR 2 measures, using 2011 OR 2014 CEHRT (or some combination thereof). Even if the proposed rule were sensible, it came too late for the hospitals which must start their reporting period in the third calendar quarter of 2014 in order to complete before the start of the fiscal year on October 1. For this IDN, the proposed rule isn’t changing anything.

Believe me, I would have welcomed change.

The purpose of the so-called “patient engagement” core measures is just that: engage patients in their healthcare, and liberate the data so that patients are empowered to have meaningful conversations with their providers, and to make informed health decisions. The intent is a good one. The result of releasing the EMR’s compilation of chart data to recently-discharged patients may not be.

I answered the phone on a Saturday, while standing in the middle of a shopping mall with my 12 year-old daughter, to discover a distraught man and one of my help desk representatives on the line. The man's wife had recently been released from the hospital; they had been given patient portal access to receive and review her records, and they were bewildered by the information presented. The medications listed on the document were not the same as those his wife regularly takes, the lab section provided no context for why the tests were ordered or what the results mean, a number of lab results he knew had been performed were missing, and the problems list did not seem to have any correlation to the diagnoses provided for the encounter.

Just the kind of call an IT geek wants to receive.

How do you explain to an 84 year-old man that his wife’s inpatient summary record contains only a snapshot of the information that was captured during that specific hospital encounter, by resources at each point in the patient experience, with widely-varied roles and educational backgrounds, with varied attention to detail, and only a vague awareness of how that information would then be pulled together and presented by technology that was built to meet the bare minimum standards for perfect-world test scenarios required by government mandates?

How do you tell him that the lab results are only what was available at time of discharge, not the pathology reports that had to be sent out for analysis and would not come back in time to meet the 36-hour deadline?

How do you tell him that the reasons there are so many discrepancies between what he sees on the document and what is available on the full chart are data entry errors, new workflow processes that have not yet been widely adopted by each member of the care team, and technical differences between EMRs in the interpretation of the IHE’s XML standards for how these CCDA documents were to be created?
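
To make that last point concrete, here is a deliberately simplified illustration (in Python, with invented entries that are not any vendor's actual output) of how two CCDA medication entries can both be structurally valid and yet display very differently, depending on which optional pieces of the standard each EMR chose to populate.

```python
# Simplified illustration (not any vendor's actual output) of how two CCDA
# medication entries that both pass schema validation can render differently:
# one carries a human-readable name, the other only a code that a portal
# reading displayName can't resolve. Code values are illustrative.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}

entry_a = """<substanceAdministration xmlns="urn:hl7-org:v3">
  <consumable><manufacturedProduct><manufacturedMaterial>
    <code code="197361" codeSystem="2.16.840.1.113883.6.88"
          displayName="Amlodipine 5 MG Oral Tablet"/>
  </manufacturedMaterial></manufacturedProduct></consumable>
</substanceAdministration>"""

entry_b = """<substanceAdministration xmlns="urn:hl7-org:v3">
  <consumable><manufacturedProduct><manufacturedMaterial>
    <code code="197361" codeSystem="2.16.840.1.113883.6.88"/>
  </manufacturedMaterial></manufacturedProduct></consumable>
</substanceAdministration>"""

def display_name(xml_text):
    root = ET.fromstring(xml_text)
    code = root.find(".//hl7:manufacturedMaterial/hl7:code", NS)
    # A portal that only reads displayName shows a blank row for entry_b.
    return code.get("displayName", "(no displayable name)")

print(display_name(entry_a))  # Amlodipine 5 MG Oral Tablet
print(display_name(entry_b))  # (no displayable name)
```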

EMR vendors have responded to that last question with, “If you use our tethered portal, you won’t have that problem. Our portal can present the data from our CCDA just fine.” But this doesn’t take into account the patient experience. As a consumer, I ask you: would you use online banking if you had to sign on to a different website, with a different username and password, for each account within the same bank? Why should it be acceptable for managing health information online to be less convenient than managing financial information?

How do hospital clinical and IT staff navigate this increasingly frequent scenario: explaining the data that patients now see?

I’m working hard to establish a clear delineation between answering technical and clinical questions, because I am not – by any stretch of the imagination – a clinician. I can explain deviations in the records presentation, I can explain the data that is and is not available – and why (which is NOT generally well-received), and I can explain the logical processes for patients to get their clinical questions answered.

Solving the other half of this equation – clinicians who understand the technical nuances that have become patient-facing, and who incorporate that knowledge into regular patient engagement to ensure patients understand the limitations of their newly-liberated data – proves more challenging. In order to engage patients in the way the CMS Meaningful Use program mandates, have we effectively created a new hybrid role requirement for our healthcare providers?

And what fresh new hell have we created for some patients who seek wisdom from all this information they’ve been given?

Caveat – if you’re reading this, it’s likely you’re not the kind of patient who needs much explaining. You’re likely to do your own research on the data that’s presented on your CCDA outputs, and you have the context of the entire Meaningful Use initiative to understand why information is presented the way it is. But think – can your grandma read it and understand it on HER own?

June 30, 2014 | Written By

Mandi Bishop is a healthcare IT consultant and a hardcore data geek with a Master's in English and a passion for big data analytics, who fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

Vendor Creates EMR For Google Glass

Well, here’s an interesting development. An EMR company has created an app allowing doctors using Google Glass to store patient data on a cloud-based storage and collaboration site.

The vendor, California-based Drchrono, is claiming that the application is the first “wearable health record.”  Whether or not that’s the case, this is clearly a step forward in the development of Google Glass as a practical tool for doctors.

According to a Reuters report, Drchrono worked closely with cloud-based storage and collaboration service Box along with Google Glass to create the app.

The new Google Glass app allows doctors — with the patient's permission — to use Google Glass to record a consultation or surgery. Once the work is done, the physician can store the video, as well as photographs and notes, in the patient's EMR or in Box. The app also allows the data to be shared with the patient.

The app is still in its infancy — so far, just 300 of the 60,000 doctors using Drchrono’s EMR platform have opted to use the Google Glass app, which is currently available at no cost to users.

But Google Glass apps and options are clearly on the rise, and not just among providers. A recent study by Accenture found that consumers are very interested in wearable technology; they're particularly interested in wearable smart glasses like Google Glass as well as smart watches.

As things stand, devices like Google Glass are in the very early adoption stage, so it’s not surprising that few of Drchrono’s physician users have opted to try out the new app. But things are likely to change over the next year or two.

I believe Google Glass will follow the same trajectory the iPad did in medicine. First it was a toy for the well-financed, curious and tech savvy, then an option for early adopters in medicine, then eventually a tool that made sense for nearly every provider.

For the next year or two, most Google Glass announcements will be like this one, reports of experiments whose only uptake will come from leading-edge experimenters in medical technology. But within the next two years or so, Google Glass uses will proliferate, as will the apps that make them a worthwhile investment.

This level of success isn’t inevitable, but it is likely. I’d bet good money that two years from now you’ll be reading this blog on a Google Glass app and managing your EMR through one as well. It’s just a matter of time.

June 20, 2014 | Written By

Katherine Rourke is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Patient Safety And Process-Aware Information Systems: Interruptions, Interruptions, Interruptions!

This is my fourth of five guest blog posts covering Health IT and EHR Workflow.

Do you remember, from driver’s education class, the importance of mental “awareness” to traffic safety? Continually monitor your environment, your car, and yourself. As with traffic flow, healthcare is full of workflow, and awareness of workflow is the key to patient safety.

First of all, the very act of creating a model of work to be done forces designers and users to think very carefully about workflow “happy paths” and what to do when users fall off them. A happy path is the sequence of events that’s intended to happen and, if all goes well, actually does happen most of the time. Departures from the happy path are called “exceptions” in computer programming parlance. Exceptions are “thrown”, “caught”, and “handled.” At the level of computer programming, an exception may occur when data is requested from a network resource but the network is down. At the level of workflow, an exception might be a patient no-show, an abnormal lab value, or suddenly being called away by an emergency or higher-priority circumstance.
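
For the programming half of that analogy, here's a small Python sketch (the URL and the recovery step are invented) showing a happy path, an exception being thrown when the network is down, and the handler that decides what to do next.

```python
# The programming half of the analogy: a request to a network resource throws
# an exception when the happy path fails, and the caller decides how to handle
# it -- much as a clinic needs a defined response to a no-show or a critical
# lab value. The URL and handling here are illustrative.
from urllib.request import urlopen
from urllib.error import URLError

def fetch_lab_results(url):
    try:
        with urlopen(url, timeout=5) as response:   # happy path
            return response.read()
    except URLError as exc:                         # exception thrown: network is down
        # Handle the exception: log it and fall back to a defined recovery step,
        # the workflow equivalent of deciding what to do off the happy path.
        print(f"Lab interface unreachable ({exc.reason}); queueing for retry.")
        return None

fetch_lab_results("http://lab.example.internal/results/12345")
```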

Developing a model of work, variously called a workflow definition, process definition, or work plan, forces workflow designers and workflow users to communicate at a level of abstraction that is much more natural and productive than either computer code or screen mockups.

Once a workflow model is created, it can be automatically analyzed for completeness and consistency. Similar to how a compiler can detect problems in code before it’s released, problems in workflow can be prevented. This sort of formal analysis is in its infancy, and is perhaps most advanced in healthcare in the design of medical devices.
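
As a rough illustration of that kind of automated analysis, the following Python sketch treats a hypothetical workflow model as a directed graph and flags steps that can never be reached and steps that can never reach the end state, much as a compiler flags unreachable code.

```python
# A minimal sketch of automated workflow analysis: given a workflow model as a
# directed graph of steps, flag steps that can never be reached and steps that
# dead-end before the finish. The model below is hypothetical.
WORKFLOW = {
    "check_in":       ["triage"],
    "triage":         ["exam"],
    "exam":           ["order_labs", "check_out"],
    "order_labs":     ["review_results"],
    "review_results": [],            # dead end: never reaches check_out
    "check_out":      [],
    "billing_audit":  ["check_out"], # unreachable: nothing points to it
}
START, END = "check_in", "check_out"

def reachable_from(start, graph):
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

forward = reachable_from(START, WORKFLOW)
unreachable = set(WORKFLOW) - forward

# Steps reachable from the start that can never go on to reach the end state.
reverse = {n: [m for m in WORKFLOW if n in WORKFLOW[m]] for n in WORKFLOW}
dead_ends = forward - reachable_from(END, reverse)

print("Unreachable steps:", unreachable)                 # {'billing_audit'}
print("Steps that never reach check-out:", dead_ends)    # {'order_labs', 'review_results'}
```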

When workflow engines execute models of work, work is performed. If this work would otherwise have had to be done by humans, user workload is reduced. Recent research estimates a 7 percent increase in patient mortality for every one-patient increase in nurse workload. Decreasing workload should reduce patient mortality by a similar amount.

Another area of workflow technology that can increase patient safety is process mining. Process mining is analogous to data mining, but the patterns it extracts from time-stamped data are workflow models. These “process maps” are evidence-based representations of what really happens during use of an EHR or health IT system. They can be quite different from, and more eye-opening than, process maps generated by asking participants questions about their workflows. Process maps can show what happens that shouldn’t, what doesn’t happen that should, and time delays due to workflow bottlenecks. They are ideal tools for understanding what happened when analyzing a possibly system-precipitated medical error.
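
A toy example of the idea, using an invented event log: given time-stamped (case, activity, timestamp) rows, the Python sketch below derives the transitions that actually occurred, including the paths an interview-based process map would likely miss.

```python
# A toy version of the process-mining step described above: take time-stamped
# event-log rows (case id, activity, timestamp) and derive the transitions that
# actually occurred, with counts -- the raw material of an evidence-based
# process map. The log rows are invented for illustration.
from collections import Counter, defaultdict

event_log = [
    ("visit-1", "check_in",  "08:00"), ("visit-1", "triage",    "08:10"),
    ("visit-1", "exam",      "08:40"), ("visit-1", "check_out", "09:00"),
    ("visit-2", "check_in",  "08:05"), ("visit-2", "exam",      "08:20"),  # triage skipped
    ("visit-2", "check_out", "08:50"),
]

# Group events into per-case traces, ordered by timestamp.
traces = defaultdict(list)
for case, activity, ts in sorted(event_log, key=lambda row: (row[0], row[2])):
    traces[case].append(activity)

# Count the observed step-to-step transitions across all traces.
transitions = Counter()
for steps in traces.values():
    transitions.update(zip(steps, steps[1:]))

for (a, b), n in transitions.items():
    print(f"{a} -> {b}: {n}")
# The skipped-triage path shows up as a check_in -> exam edge that the
# "official" workflow diagram would not contain.
```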

Yet another area where workflow tech is particularly relevant to patient safety is the fascinating relationship between clinical pathways, guidelines, and the like, and the workflow and process definitions executed by workflow tech’s workflow engines. Clinical decision support, which brings the best evidence-based medical knowledge to the point of care, must be seamless with clinical workflow. Otherwise, alert fatigue greatly reduces how much of its potential is realized.

There’s considerable research into how to leverage and combine representations of clinical knowledge with clinical workflow. However, you really need a workflow system to take advantage of this intricate relationship. Hardcoded, workflow-oblivious systems? There’s no way to tweak alerts to workflow context: the who, what, why, when, where, and how of what the clinician is doing. Clinical decision support will not achieve widespread success and acceptance until it can be intelligently customized and managed during real-time clinical workflow execution. This, again, requires workflow tech at the point of care.

I’ve saved workflow tech’s most important contribution to patient safety until last: Interruptions.

An interruption: is there anything more dreaded than a higher-priority task breaking your concentration just when you are beginning to experience optimal mental flow? This is ironic, since so much of workaday ambulatory medicine is essentially interrupt-driven (to borrow from computer terminology). Unexpected higher-priority tasks and emergencies *should* interrupt lower-priority scheduled tasks. Still, at the end of the day, ideally, you’ve accomplished all your tasks.

In one research study, over 50% of all healthcare errors were due to slips and lapses, such as not executing an intended action. In other words, good clinical intentions derailed by interruptions.

Workflow management systems provide environmental cues to remind clinical staff to resume interrupted tasks. They represent “stacks” of tasks so the entire care team works together to make sure that interrupted tasks are eventually and appropriately resumed. Workflow management technology can bring to clinical care many of the innovations we admire in the aviation domain, including well-defined steps, checklists, and workflow tools.
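
Here's a bare-bones Python sketch of that "stack of interrupted tasks" idea. It illustrates the concept only; it is not how any particular workflow engine implements it, and the task names are invented.

```python
# A bare-bones sketch of the "stack of interrupted tasks" idea: when a
# higher-priority task arrives, the current task is pushed rather than lost,
# and the system cues the team to resume it later. Task names are invented.
class TaskStack:
    def __init__(self):
        self._suspended = []

    def interrupt(self, current_task, urgent_task):
        """Suspend the current task and record it so it can be resumed."""
        self._suspended.append(current_task)
        print(f"Interrupting '{current_task}' for '{urgent_task}'")
        return urgent_task

    def resume(self):
        """Cue the most recently interrupted task; None if nothing is pending."""
        return self._suspended.pop() if self._suspended else None

stack = TaskStack()
active = "medication reconciliation for bed 12"
active = stack.interrupt(active, "respond to code blue")
# ...urgent task completed...
pending = stack.resume()
print(f"Reminder: resume '{pending}'")   # the interrupted task is not forgotten
```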

Stay tuned for my fifth, and final, guest blog post, in which I tackle Population Health Management with Business Process Management.


June 12, 2014 | Written By

Chuck Webster, MD, MSIE, MSIS has degrees in Accountancy, Industrial Engineering, Intelligent Systems, and Medicine (from the University of Chicago). He designed the first undergraduate program in medical informatics, was a software architect in a hospital MIS department, and also VP and CMIO for an EHR vendor for over a decade. Dr. Webster helped three healthcare organizations win the HIMSS Davies Award and is a judge for the annual Workflow Management Coalition Awards for Excellence in BPM and Workflow and Awards for Case Management. Chuck is a ceaseless evangelist for process-aware technologies in healthcare, including workflow management systems, Business Process Management, and dynamic and adaptive case management. Dr. Webster tweets from @wareFLO and maintains numerous websites, including EHR Workflow Management Systems (http://chuckwebster.com), Healthcare Business Process Management (http://HCBPM.com) and the People and Organizations improving Healthcare with Health Information Technology (http://EHRworkflow.com). Please join with Chuck to spread the message: Viva la workflow!

Big Brother Or Best Friend?

The premise of clinical decision support (CDS) is simple and powerful: humans can’t remember everything, so enter data into a computer and let the computer render judgement. So long as the data is accurate and the rules in the computer are valid, the computer will be correct the vast majority of the time.

CDS is commonly implemented in computerized provider order entry (CPOE) systems across most order types – labs, drugs, radiology, and more. A simple example: most pediatric drugs require weight-based dosing. When physicians order drugs for pediatric patients using CPOE, the computer should validate the dose of the drug against the patient’s weight to ensure the dose is in the acceptable range. Given that the computer has all of the information necessary to calculate acceptable dose ranges, and the fact that it’s easy to accidentally enter the wrong dose into the computer, CDS at the point of ordering delivers clear benefits.
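
A stripped-down version of that weight-based dosing check might look like the Python sketch below. The per-kilogram limits and patient values are purely illustrative, not clinical guidance, and real CPOE systems layer on far more context.

```python
# A stripped-down version of the weight-based dosing check described above.
# The dose limits and patient values are illustrative only -- not clinical guidance.
def check_weight_based_dose(dose_mg, weight_kg, min_mg_per_kg, max_mg_per_kg):
    """Return (ok, message) comparing an ordered dose to a per-kg range."""
    low = min_mg_per_kg * weight_kg
    high = max_mg_per_kg * weight_kg
    if dose_mg < low:
        return False, f"Dose {dose_mg} mg is below the {low:.0f} mg minimum for {weight_kg} kg"
    if dose_mg > high:
        return False, f"Dose {dose_mg} mg exceeds the {high:.0f} mg maximum for {weight_kg} kg"
    return True, "Dose is within the weight-based range"

# Example: a 20 kg child, a drug dosed at 10-15 mg/kg, and a mistyped 1250 mg order.
ok, message = check_weight_based_dose(dose_mg=1250, weight_kg=20,
                                      min_mg_per_kg=10, max_mg_per_kg=15)
if not ok:
    print("CDS alert:", message)   # fires before the order is signed
```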

The general notion of CDS – checking to make sure things are being done correctly – is the same fundamental principle behind checklists. In The Checklist Manifesto, Dr. Atul Gawande successfully argues that the challenge in medicine today is not in ignorance, but in execution. Checklists (whether paper or digital) and CDS are realizations of that reality.

CDS in CPOE works because physicians need to enter orders to do their job. But checklists aren’t as fundamentally necessary for any given procedure or action. The checklist can be skipped, and the provider can perform the procedure at hand. Thus, the fundamental problem with checklists is that they insert a layer of friction into workflows: running through the checklist. If checklists could be implemented seamlessly, without introducing additional workflow friction, they would be more widely adopted and adhered to. The basic problem is that people don’t want to go back to the same repetitive formula for tasks they feel comfortable performing. Given the tradeoff between patient safety and efficiency, checklists have only been seriously discussed in high-acuity, high-risk settings such as surgery and ICUs. It’s simply not practical to implement checklists for low-risk procedures. But even in high-acuity environments, many organizations continue to struggle to implement checklists.

So…. what if we could make checklists seamless? How could that even be done?

Looking at CPOE CDS as a foundation, there are two fundamental challenges: collecting data, and checking against rules.

Computers can already access EMRs to retrieve all sorts of information about the patient. But computers don’t yet have any ability to collect data about what providers are and aren’t physically doing at the point of care. Without knowing what’s physically happening, computers can’t present alerts based on skipped or incorrect steps of the checklist. The solution would likely be based on a Kinect-like system that can detect movements and actions. Once the computer knows what’s going on, it can cross-reference what’s happening against what’s supposed to happen given the context of care delivery and issue alerts accordingly.
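
In rough terms, the cross-referencing step could look something like this Python sketch, where both the checklist and the "observed" sensor feed are invented for illustration.

```python
# A rough sketch of the cross-referencing step described above: compare the
# actions a sensor system reports against the expected checklist for this
# context and flag anything skipped or out of order. The checklist and the
# observed events are invented for illustration.
EXPECTED = ["hand_hygiene", "verify_patient_id", "site_marking", "time_out"]

def audit_observed_actions(observed):
    """Return alerts for missing or out-of-order checklist steps."""
    alerts = []
    for step in EXPECTED:
        if step not in observed:
            alerts.append(f"Step skipped: {step}")
    performed = [s for s in observed if s in EXPECTED]
    expected_order = [s for s in EXPECTED if s in performed]
    if performed != expected_order:
        alerts.append("Checklist steps performed out of order")
    return alerts

# Example: the sensor feed never saw hand hygiene and saw the time-out too early.
observed_feed = ["verify_patient_id", "time_out", "site_marking"]
for alert in audit_observed_actions(observed_feed):
    print("ALERT:", alert)
```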

What’s described above is an extremely ambitious technical undertaking. It will take many years to get there. There are already a number of companies trying to address this in primitive forms: SwipeSense detects whether providers clean their hands before seeing patients, and the CHARM system uses Kinect to detect hand movements and ensure surgeries are performed correctly.

These early examples are a harbinger of what’s to come. If preventable mistakes are the biggest killer within hospitals, hospitals need to implement systems to identify and prevent errors before they happen.

Let’s assume that the tech evolves for an omniscient benevolent computer that detects errors and issues warnings. Although this is clearly desirable for patients, what does this mean for providers? Will they become slaves to the computer? Providers already face challenges with CPOE alert fatigue. Just imagine do-anything alert fatigue.

There is an art to telling people that they’re wrong. In order to successfully prevent errors, computers will need to learn that art. Additionally, there must be a cultural shift to support the fact that when the computer speaks up, providers should listen. Many hospitals still struggle today with implementing checklists because of cultural issues. There will need to be a similar cultural shift to enable passive omniscient computers to identify errors and warn providers.

I’m not aware of any omniscient computers that watch people all day and warn them that they’re about to make a mistake. There could be such software for workers in nuclear power plants or other critical jobs in which the cost of being wrong is devastating. If you know of any such software, please leave a comment.

April 9, 2014 | Written By

Kyle is Founder and CEO of Pristine, a company in Austin, TX that develops telehealth communication tools optimized for Google Glass in healthcare environments. Prior to founding Pristine, Kyle spent years developing, selling, and implementing electronic medical records (EMRs) into hospitals. He also writes for EMR and HIPAA, TechZulu, and Svbtle about the intersections of healthcare, technology, and business. All of his writing is reproduced at kylesamani.com

Healthcare Data Centers and Cloud Solutions

As a former system administrator who worked in a number of data centers, it’s been really interesting for me to watch the evolution of healthcare data centers and the concept of healthcare cloud solutions. I think we’re seeing a definite switch by many hospital CIOs towards the cloud and away from the hassle and expense of trying to run their own data centers. Plus, this is facilitated greatly by the increased reliability, speed, and quality of the bandwidth that’s available today. Sure, the largest institutions will still have their own data centers, but even those organizations are working with an outside data center as well.

I had a chance to sit down for a video interview with Jason Mendenhall, Executive Vice President, Cloud at Switch Supernap to discuss the changing healthcare data center and cloud environment. We cover a lot of ground in the interview including when someone should use cloud infrastructure and when they shouldn’t. We talk about the security and reliability of a locally hosted data center versus an outside data center. We also talk a little about why Las Vegas is a great place for them to have their data center.

If you’re a healthcare organization that needs a data center (Translation: All of you) or a healthcare IT company that needs to host your application (Translation: All of you), then you’ll learn a lot from this interview with Jason Mendenhall:

As a side note, the Switch Supernap’s Innevation Center is the location for the Health IT Marketing and PR Conference I’m organizing April 7-8, 2014 in Las Vegas. If you’re attending the conference, we can also set you up for a tour of the Switch Supernap while you’re in Vegas. The tour is a bit like visiting a tech person’s Disneyland. They’ve created something really amazing when it comes to data centers. I know since a secure text message company I advise, docBeat, is hosted there with one of their cloud partners Itrica.

March 4, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 14 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

iPad Lifecycle Versus Other Tablets

Every once in a while I like to put my old IT hat back on (I am @techguy on Twitter after all) and look at some of the more physical IT aspects of EMR and healthcare IT. I still get really excited about EMR Technology products and the evolution of these products.

I’ve long argued that most IT administrators would much rather have a set of Windows 8 tablets in their environment than a bunch of iPads or Android tablets. The biggest reasons for this were the security and management of these devices. Most hospital and healthcare IT administrators are comfortable securing a Windows-based device, and they aren’t as comfortable with newer tablets like the iPad or Android tablets. Plus, the tools for managing and imaging Windows-based tablets are much more developed than those for the iPad or Android (although I think both of these are catching up pretty quickly).

While I think both of these arguments are reasonable, I heard two new arguments for why an organization might want to stick with Windows 8 tablets instead of moving to iPads and Androids.

The first reason is that the lifecycle of a Windows 8 machine is much longer than that of an iPad or Android tablet. A Windows tablet that you bought 5 years ago could still easily be supported by an IT shop and will work with your various software systems. A 3 year old iPad could very well not work with your EHR software, and Apple has already stopped supporting OS upgrades on the original iPad, which poses HIPAA compliance issues similar to those of Windows XP.

The whole release cycle for iPad and Android tablets is built around replacing the previous versions. They don’t quite make them obsolete, but new versions come out every year with the intent that you buy a new one every year. This stands in stark contrast to the Windows tablet approach.

Another reason many IT admins will likely lean towards Windows 8 tablets over iPads and Androids is that they’re just generally more rugged. Sure, you can make iPads and Androids more rugged with certain cases, but then you lose the look and feel of having an iPad in your hand and nicely in your pocket.

This point is accentuated even more when you look at devices like the new Toughpad tablets from Panasonic. They’ve finally got the processing power in these machines to match that of a desktop, so they can run any software you want. Plus, they are crazy durable. I saw them at CES last month, where a journalist from India was slamming one on the ground and stepping on it, and the thing kept ticking without a problem. I don’t need to explain to any of you why durability matters in healthcare, where you’re always carrying around multiple items and drops are common.

Of course, the reality is that it’s “sexy” to carry around an iPad while you work. Software vendors are going to continue developing for the iPad and doctors are going to want to be carrying an iPad around with them. IT staff are likely going to have to support iPads and other tablets in their environment. However, when it’s left to the IT staff, you can be sure that the majority of them will be pushing for the more rugged, easier to secure, easier to manage, and longer lifecycle Windows 8 tablets. Unless of course, they’re ordering an iPad for their own “test” environment.

February 13, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 14 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

Barcodes, Integrating Bedside TVs, EHR, and Nurse Messaging Into Pain Management Workflow

As I look across the healthcare IT landscape, I believe we’re just at the beginning of real integrated solutions that leverage everything technology can offer. But I see it starting to happen. A good example of this is the case study I found on “Automated Workflow for Pain Management.”

The case study goes into the details of the time savings and other benefits of proper pain management in the hospital. However, I was really intrigued by their integration of bedside TVs together with barcodes, EHR software, and nurse messaging (sadly they used a pager, but that could easily have been replaced with secure messaging). What a beautiful integration and workflow across so many different technologies from different companies, and this is just the start.
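
To picture the workflow, here's a hypothetical Python sketch of that kind of event routing. Every system name, function, and data value below is an invented placeholder, not the actual products or data from the case study.

```python
# A hypothetical sketch of the kind of integration the case study describes:
# a barcode scan at the bedside TV generates an event, the EHR is consulted for
# the active pain-management order, and a message is routed to the assigned
# nurse. All names and data here are invented placeholders.
def handle_pain_reassessment_scan(event, ehr_orders, nurse_assignments, send_message):
    room = event["room"]
    patient = event["patient_id"]
    order = ehr_orders.get(patient)        # e.g. reassess pain after the last dose
    nurse = nurse_assignments.get(room)
    if order and nurse:
        send_message(nurse, f"Room {room}: pain reassessment due ({order})")

# Illustrative wiring: dictionaries stand in for the EHR and assignment system,
# and print() stands in for the secure-messaging (or pager) interface.
ehr_orders = {"MRN-001": "reassess pain score 60 min post-dose"}
nurse_assignments = {"412": "RN Garcia"}
handle_pain_reassessment_scan(
    {"room": "412", "patient_id": "MRN-001", "barcode": "PAIN-REASSESS"},
    ehr_orders, nurse_assignments,
    send_message=lambda nurse, text: print(f"To {nurse}: {text}"),
)
```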

One major challenge to these workflows is making these external applications work with the EHR software. Hopefully things like the blog post I wrote yesterday will help solve that problem. Case studies like the one above illustrate really well the value of outside software applications being able to integrate with EHR software.

What I also loved about the above solution is that it doesn’t cause any more work for the hospital staff. In fact, in many ways it can save them time. The nurse can have much higher quality data about who needs them and when.

This implementation is also a preview of what Kyle Samani talked about in his post “Unlocking the Power of Data Science in Healthcare.” While Kyle wrote about it from the perspective of patients and getting them the right information in the right context, the same applies to healthcare providers. The case study above is an example of this shift. No doubt there will be some resistance to these technologies in healthcare, but once they get refined we’ll wonder how we lived without them.

February 7, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 14 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

Why Will Medical Professionals Use Laptops?

Steve Jobs famously said that “laptops are like trucks. They’re going to be used by fewer and fewer people. This transition is going to make people uneasy.”

Are medical professionals truck drivers or bike riders?

We have witnessed truck drivers turn into bike riders in almost every computing context:

Big businesses used to buy mainframes. Then they replaced mainframes with mini computers. Then they replaced minicomputers with desktops and servers. Small businesses began adopting technology in meaningful ways once they could deploy a local server and clients at reasonable cost inside their businesses. As web technologies exploded and mobile devices became increasingly prevalent, large numbers of mobile professionals began traveling with laptops, tablets and smartphones. Over the past few years, many have even stopped traveling with laptops; now they travel with just a tablet and smartphone.

Consumers have been just as fickle, if not more so. They adopted build-it-yourself computers, then Apple IIs, then mid tower desktops, then laptops, then ultra-light laptops, and now smartphones and tablets.

Mobile is the most under-hyped trend in technology. Mobile devices – smartphones, tablets, and soon, wearables – are occupying an increasingly large percentage of total computing time. Although mobile devices tend to have smaller screens and less robust input methods than traditional PCs (see why the keyboard and mouse are the most efficient input methods), mobile devices are often preferred because users value ease of use, mobility, and access more than raw efficiency.

The EMR is still widely conceived of as a desktop-app with a mobile add-on. A few EMR companies, such as Dr Chrono, are mobile-first. But even in 2014, the vast majority of EMR companies are not mobile-first. The legacy holdouts cite battery, screen size, and lack of a keyboard as reasons why mobile won’t eat healthcare. Let’s consider each of the primary constraints and the innovations happening along each front:

Battery – Batteries are the only computing component that isn’t doubling in performance every 2-5 years. Battery density continues to improve at a measly 1-2% per year. The battery challenge will be overcome through a few means: huge breakthroughs in battery density, and increasing efficiency in the most battery-hungry components, screens and CPUs. We are on the verge of the transition to OLED screens, which will drive an enormous improvement in the energy efficiency of screens. Mobile CPUs are also about to undergo a shift as OEMs’ values change: mobile CPUs have become good enough that the majority of future CPU improvements will emphasize battery performance rather than increased compute performance.

Lack of a keyboard – Virtual keyboards will never offer the speed of physical keyboards. But the laggards miss the point: providers won’t have to type as much. NLP is finally allowing people to speak freely. The problem with keyboards isn’t the characteristics of the keyboard, but rather the existential presence of the keyboard itself. Through a combination of voice, natural language processing, and scribes, doctors will type less and yet document more than ever before. I’m friends with the CEOs of at least half a dozen companies attempting to solve this problem across a number of dimensions. Given how challenging and fragmented the technology problem is, I suspect we won’t see a single winner, but a variety of solutions, each with unique compromises.

Screen size – We are on the verge of foldable, bendable, and curved screens. These traits will help resolve the screen size problem on touch-based devices. As eyeware devices blossom, screen size will become increasingly trivial because eyeware devices have such an enormous canvas to work with. Devices such as the MetaPro and AtheerOne will face the opposite problem: data overload. These new user interfaces can present extremely large volumes of robust data across 3 dimensions. They will mandate a complete re-thinking of presentation and user interaction with information at the point of care.

I find it nearly impossible to believe that laptops have more than a decade of life left in clinical environments. They simply do not accommodate the ergonomics of care delivery. As mobile devices catch up to PCs in terms of efficiency and perceived screen size, medical professionals will abandon laptops in droves.

This begs the question: what is the right form factor for medical professionals at the point of care?

To tackle this question in 2014 – while we’re still in the nascent years of wearables and eyeware computing – I will address the question “what software experiences should the ideal form factor enable?”

The ideal hardware* form factor of the future is:

Transparent: The hardware should melt away and the seams between hardware and software should blur. Modern tablets are quite svelte and light. There isn’t much more value to be had by improving portability of modern tablets; users simply can’t perceive the difference between .7lb and .8lb tablets. However, there is enormous opportunity for improvements in portability and accessibility when devices go handsfree.

Omni-present, yet invisible: There is way too much friction separating medical professionals from the computers that they’re interacting with all day long: physical distance (even the pocket is too far) and passwords. The ideal device of the future is friction free. It’s always there and always authenticated. In order to always be there, it must appear as if it’s not there. It must be transparent. Although Glass isn’t there just yet, Google describes the desired paradox eloquently when describing Glass: “It’s there when you need it, and out of sight when you don’t.” Eyeware devices will trend this way.

Interactive: despite their efficiency, PC interfaces are remarkably un-interactive. Almost all interaction boils down to a click on a pixel location or a keyboard command. Interacting with healthcare information in the future will be diverse and rich: natural physical movements, subtle winks, voice, and vision will all play significant roles. Although these interactions will require some learning (and un-learning of bad behaviors) for existing staff, new staff will pick them up and never look back.

Robust: Mobile devices of the future must be able to keep up with medical professionals. The devices must have shift-long battery life and be able to display large volumes of complex information at a glance.

Secure: This is a given. But I’ll emphasize it, as physical security becomes increasingly important in light of the number of unencrypted hospital laptops being stolen or lost.

Support 3rd party communications: As medicine becomes increasingly complex, specialized, and team-based, medical professionals will share even more information with one another, patients, and their families. Medical professionals will need a device that supports sharing what they’re seeing and interacting with.

I’m fairly convinced (and to be fair, highly biased as CEO of a Glass-centric company) that eyeware devices will define the future of computer interaction at the point of care. Eyeware devices have the potential to exceed tablets, smartphones, watches, jewelry, and laptops across every dimension above, except perhaps 3rd party communication. Eyeware devices are intrinsically personal, and don’t accommodate others’ prying eyes. If this turns out to be a major detriment, I suspect the problem will be solved through software to share what you’re seeing.

What do you think? What is the ideal form factor at the point of care?

*Software tends to dominate most health IT discussions; however, this blog post is focused on ergonomics of hardware form factors. As such, this list avoids software-centric traits such as context, intelligence, intuition, etc.

February 4, 2014 | Written By

Kyle is Founder and CEO of Pristine, a company in Austin, TX that develops telehealth communication tools optimized for Google Glass in healthcare environments. Prior to founding Pristine, Kyle spent years developing, selling, and implementing electronic medical records (EMRs) into hospitals. He also writes for EMR and HIPAA, TechZulu, and Svbtle about the intersections of healthcare, technology, and business. All of his writing is reproduced at kylesamani.com

Eyes Wide Shut – January, 2014 Meaningful Use Stage 2 Readiness Reality Check

Happy New Year?

As I begin the 2014 Meaningful Use measures readiness assessment and vendor cat-herding exercises, I’m reflecting on this portion of the statement from CMS’s Director of E-Health Standards and Services, Robert Tagalicod, and the ONC’s Acting National Coordinator, Jacob Reider, regarding the Meaningful Use timeline modification: “The goal of this change is two-fold: first, to allow CMS and ONC to focus efforts on the successful implementation of the enhanced patient engagement, interoperability and health information exchange requirements in Stage 2.” (Previously published on EMRandHIPAA.com.)

I call BS.

If the “goal” is a “successful implementation”, then CMS failed miserably by not addressing the START of the quarterly attestation period for Stage 2, which is still required in 2014. CMS and the ONC need more time to successfully implement the measures, and they are bureaucratic agencies that don’t directly deal with patient medical care. Why wasn’t the additional time required to truly succeed at this monumental task extended to the healthcare provider organizations? Because the agencies want to save face, and avoid litigation from early adopters who may be already beginning their 2014 attestation period amidst heroic back-breaking efforts?

Here’s a reality check for what a large IDN might be going through in early January, in preparation for the start of the 2014 quarterly attestation period. Assume this particular IDN’s hospitals’ fiscal year runs October-September, so you MUST begin your attestation period on July 1. You have 6 months.

As of December 31, 2013, only 4 of the 8 EMRs in your environment completed their 2014 CEHRT certification.

Each of those 4 EMRs has a different schedule to implement the upgrade to the certified edition, with staggered delivery dates from March to July. The hospital EMR is not scheduled to receive its certified-edition upgrade until April. You pray that THIS implementation is the exception to your extensive experience with EMR vendor target timelines extending 6-8 weeks beyond initial dates.

The EMR upgrades do not include the Direct module configuration, and the vendor’s Direct module resources are not available until 6-9 weeks after the baseline upgrade implementation – if they have knowledgeable resources, at all. Your hospital EMR vendor can’t articulate the technical infrastructure required to implement and support its own Direct module. Several vendors indicate that the Direct module configuration will have to be negotiated with a third-party. Your clinicians don’t know what Direct is. Your IT staff doesn’t know how to register with a HISP. Your EMR vendor doesn’t support a central Direct address directory or a lookup function, so you contemplate typing classes for your HIM and clinical staff.

The number of active patient problems requiring manual SNOMED remediation exceeds 60,000 records in your hospital EMR. You form a clinical committee to address it, but they estimate it will take 6 months of review to complete. You’re contemplating de-activating all problems older than a certain date, which would whittle down the number and shorten the timeframe to complete – but would eliminate chronic conditions.

There are still nagging questions regarding CMS interpretation of the measures, so you ask for clarification, and you wait. And wait. And wait. The answers impact the business rules required for attestation reporting, and you know you need any help you can get in whittling down the denominator values. Do deceased patients count in the view/download/transmit denominator? If records access is prohibited by state/federal law, does that encounter count in the view/download/transmit denominator?
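
To show why those clarifications matter, here's a small Python sketch of how the view/download/transmit rate shifts depending on whether deceased patients or legally restricted records are excluded. The flags and numbers are invented; the exclusion rules themselves are exactly what we were waiting on CMS to define.

```python
# Illustrative sketch of the denominator question: how the view/download/transmit
# percentage moves depending on which encounters are excluded. Flags and data
# are invented for illustration.
def vdt_rate(encounters, exclude_deceased, exclude_restricted):
    denominator = [
        e for e in encounters
        if not (exclude_deceased and e["deceased"])
        and not (exclude_restricted and e["access_restricted"])
    ]
    if not denominator:
        return 0.0
    used_portal = sum(1 for e in denominator if e["viewed_downloaded_transmitted"])
    return 100.0 * used_portal / len(denominator)

encounters = [
    {"deceased": False, "access_restricted": False, "viewed_downloaded_transmitted": True},
    {"deceased": True,  "access_restricted": False, "viewed_downloaded_transmitted": False},
    {"deceased": False, "access_restricted": True,  "viewed_downloaded_transmitted": False},
    {"deceased": False, "access_restricted": False, "viewed_downloaded_transmitted": False},
]
print(vdt_rate(encounters, exclude_deceased=False, exclude_restricted=False))  # 25.0
print(vdt_rate(encounters, exclude_deceased=True,  exclude_restricted=True))   # 50.0
```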

Consultant costs skyrocket as you struggle to find qualified SME resources to blaze a trail for your internal staff. Their 60-to-90-day assessments inevitably end with recommendations for “proof of concept” and “pilot” approaches to each of the 2014 measures, which don’t take into account the reality of the EMR upgrade timelines and the looming attestation start date. Following their recommendations would delay your attestation start by 9-12 months. So, your internal staff trudges forward without expert leadership, and you throw the latest PowerPoint deck from “Health IT Professionals-R-Us” on the pile.

Who needs testing, when you can go live with unproven technology the day it’s available in order to meet an arbitrary deadline? Healthcare.gov did it – look what a success that turned out to be!

But wait, this is real clinical data, generated by real-world clinical workflows, being used to treat real patients, by real healthcare providers. By refusing to address the start of the 2014 attestation period, CMS and the ONC are effectively using these patients and providers as lab rats.

I did not give permission to be part of this experiment.

January 13, 2014 | Written By

Mandi Bishop is a healthcare IT consultant and a hardcore data geek with a Master's in English and a passion for big data analytics, who fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

Epic Builds Lab Installations At Oregon University

Epic Systems has agreed to build two lab installations of its EpicCare EMR at Oregon Health & Science University, one to be used for medical informatics education and the other giving the school access to its source code on the research side, reports Healthcare IT News.

Though the school’s OHSU Healthcare system already runs EpicCare for its hospitals and clinics, students and teachers have had to rely on a basic installation of the open-source VistA system for OHSU’s EMR laboratory course.

According to HIN, this is Epic’s first partnership with an academic informatics program, and potentially an important turning point for the company, which has conducted research and development almost exclusively on its Verona, Wis. campus. (It does release its source code to commercial customers.) And the agreement didn’t come easily; in fact, the school spent several years persuading Epic to participate before it agreed to commit to an academic partnership, Healthcare IT News said.

In a press statement, OHSU notes that the EpicCare research environment should allow students to delve into usability, data analytics, simulation, interoperability, patient safety and more. The school also expects to prepare prototypes of solutions to real-world healthcare problems.

Students in both OHSU’s on-campus and distance learning programs will pursue coursework based on the Epic EMR, with classes using the live Epic environment beginning March 2014. The work students will undertake includes learning to configure screens, implementing clinical decision support, and generating reports.

While this isn’t quite the same thing, this agreement brings to mind a blog item by John in which he describes how prospective programmer hires at Elation are required to shadow a physician as part of their hiring process. In both cases, the people who will be working with the software are actually getting an idea of how the product is used in the field before they’re out serving commercial clients. Sadly, that’s still rare.

I think this will ultimately be a win for both Epic and OHSU. Epic will get a fresh set of insights into its product, and students will be prepared for a real world in which Epic plays a major part.

November 27, 2013 | Written By

Katherine Rourke is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.