Eyes Wide Shut – Teaching to the Meaningful Use Stage 2 Test

Posted on September 30, 2013 | Written By

Mandi Bishop is a hardcore health data geek with a Master's in English and a passion for big data analytics, which she brings to her role as Dell Health’s Analytics Solutions Lead. She fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

According to Twitter analytics, one of my more engaging tweets recently stated that Meaningful Use is stifling innovation by requiring health IT vendors and healthcare providers to employ very specific tactics to capture and report clinical data and to demonstrate compliance with interoperability standards – ostensibly to engage and empower the patient and to improve coordination of care between providers. Of course, I said it much more succinctly than that. In effect, conforming to the Meaningful Use Stage 2 attestation measures is akin to “teaching to the test”:

Here’s a real-world example of what it means to “teach to the test” of Meaningful Use. In order to qualify for CMS incentive dollars, Meaningful Use Stage 2 Year 1 patient engagement measures must be met, with auditable data captured, in a 90-day contiguous period in 2014. An eligible provider (EP) must demonstrate that 50% of all patients with encounters during that time period have online access to their clinical summary within 4 days of the data becoming available to the provider. 5% of those patients must access the clinical information within the 90 days, and 5% of those patients must leverage secure messaging to communicate relevant health information with the provider. Finally, the MU-certified EMR must proffer patient-specific education materials for 10% of the patients seen during that time.

What I believe the ONC had in mind when they crafted these measures: engaged patients who will log in to their portal after each encounter, review the findings and lab results to assess their own progress and outcomes, read or listen to the condition-specific educational materials provided that resonate with them, and ask more meaningful questions of their providers as a result of this new-found, data-enabled empowerment. That is why they categorize these measures as “patient engagement”, right?

Wrong. This is what “patient engagement” looks like from the standpoint of EMR implementers, Meaningful Use consultants, and EP business processes.

First, establish the bare minimum thresholds for meeting the measures. If the EP saw 1000 patients during the same 3-month period the previous year, the denominator is 1000; calculate the numerator for each measure from that. So: 500 patients must have online access to their clinical data; 50 patients must access their information; 50 patients must communicate with their provider via secure messaging; and 100 patient encounters must prompt patient-specific educational opportunities.
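The bare-minimum arithmetic above can be sketched in a few lines. This is an illustrative calculation only – the 1000-patient denominator is the hypothetical from this example, and the percentages are the Stage 2 Year 1 thresholds described earlier:

```python
# Bare-minimum MU Stage 2 numerators for a hypothetical 1000-patient,
# 90-day reporting period, using the percentages from the measures above.
import math

encounters = 1000  # unique patients seen in the 90-day period (hypothetical)

# Measure name -> required share of the denominator
measures = {
    "online access to clinical summary": 0.50,
    "patients viewing their record":     0.05,
    "secure messaging with provider":    0.05,
    "patient-specific education":        0.10,
}

for name, pct in measures.items():
    # Round up: a fractional patient still requires one more real one.
    needed = math.ceil(encounters * pct)
    print(f"{name}: {needed} of {encounters} patients")
```

Run against the example denominator, this yields the 500/50/50/100 targets the rest of the "teach to the test" workflow is built around.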

To reach the 500 patients with online access to their clinical data, the patient portal software is preloaded with patient demographic accounts, based on the registration data already available in the EMR. An enrollment request is emailed to the patient or authorized representative (assuming an email address is available in their demographic information). The EMR captures the event of sending this email, which contains instructions for enrolling and accessing the patient’s medical records via the portal. The measure is met without the patient ever acknowledging the portal’s existence, and without any direct communication between provider and patient.

The medical records view and secure messaging measures can be met simultaneously, in a matter of days, by planning a few extra minutes into each encounter for 50 patients’ worth of appointments. The EMR has already triggered an email with portal enrollment information to each of the patients in the waiting room on a given day. As the medical assistant (MA) takes vital signs, she asks whether the patient has enrolled in the portal. It’s likely the patient has not; the MA hands the patient a tablet, has him log in to his email, and walks him through the portal enrollment and initial login process. Once logged in, the MA directs the patient to click the link to view his medical record. That click is recorded, and the “view” measure is met; whether a CCD or C-CDA document is actually displayed is irrelevant to the attestation data capture.

Having demonstrated how a patient can view his record, the MA then asks the patient to go into the portal’s message center, to send a test communication to the provider. The patient completes the required fields, and the MA prompts him with a generic health-related question to type into the body of the message. Once the patient hits “Send”, the event is recorded, and the “secure messaging” measure is met.

For all patients, portal users or not, a new process begins once the MA finishes: the provider enters the room and begins her evaluation of each of the 100 patients required to meet the education measure. As the patient talks, the provider clicks through EMR workflow screens, recording the encounter data. The EMR occasionally prompts with a dialogue box indicating that educational materials are available for patients with this diagnosis code or this lab result. Each dialogue box prompt is recorded by the EMR; the “patient-specific education” measure is met whether or not the provider acts on the prompt and discusses or distributes the educational information.

To put it simply: patients never have to log in to a portal to meet the 50% online availability requirement, they don’t have to actually view their records to meet the 5% view requirement, they don’t have to have a real message exchange with their provider to meet the 5% communication requirement, and they don’t have to receive any tailored materials to meet the 10% education requirement. Once those clicks have been recorded, the actions never have to be repeated; meaningful, ongoing patient engagement is not needed to meet the attestation requirements and receive the incentive dollars.

In a previous post, I introduced my interpretation of the difference between the spirit and letter of the Meaningful Use “law”. By teaching to the test, we’re addressing the letter of the law, only, in its narrowest interpretation. When will we incent vendors and providers to go above and beyond and find ways to truly engage patients in meaningful ways, empowering them with accurate, timely data access and tools to analyze it?