Moving from “Reporting on” to “Leading” Healthcare – A Conversation with Dr. Halee Fischer-Wright, President & CEO of MGMA

Posted on October 11, 2017 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

In Chapter 3 of Dr. Halee Fischer-Wright’s new book Back to Balance, she writes: “People are increasingly being treated as if they are the same. Science and data are being used to decrease variability in an attempt to get doctors to treat patients in predictable ways.” This statement is Fischer-Wright’s way of saying that the current focus on standardization of healthcare processes in the quest to reduce costs and increase quality may not be the brass ring we should be striving for. She believes that a balance is needed between healthcare standardization and the fact that each patient is a unique individual.

As president of the Medical Group Management Association (MGMA), a role Fischer-Wright has held since 2015, she is uniquely positioned to see first-hand the impact standardization (from both legislative and technological forces) has had on the medical profession. With over 40,000 members, MGMA represents many of America’s physician practices – a group particularly hard hit over the past few years by the technology compliance requirements of Meaningful Use and changes to reimbursements.

For many physician practices, Meaningful Use has turned out to be more of a compliance program than an incentive program. To meet the program’s requirements, physicians have had to alter their workflows and documentation approaches. Complying with the program and satisfying the reporting requirements became the focus, which Fischer-Wright believes is a terrible unintended consequence.

“We have been so focused on standardizing the way doctors work that we have taken our eyes off the real goal,” said Fischer-Wright in an interview with HealthcareScene. “As physicians, our focus needs to be on patient outcomes, not whether we documented the encounter in a certain way. In our drive to mass standardization, we are in danger of ingraining the false belief that populations of patients behave in the same way and can be treated through a single standardized treatment regimen. That’s simply not the case. Patients are unique.”

Achieving a balance in healthcare will not be easy – a sentiment that permeates Back to Balance – but Fischer-Wright is certain that healthcare technology will play a key role: “We need HealthIT companies to stop focusing just on what can be done and start working on enabling what needs to be done. Physicians want to leverage technology to deliver better care to patients at a lower cost, but not at the expense of the patient/physician relationship. Let’s stop building tools that force doctors to stare at the computer screen instead of making eye contact with their patients.”

To that end, Fischer-Wright issued a friendly challenge to the vendors in the MGMA17 exhibit hall: “Create products and services that physicians actually enjoy using. Help reduce barriers between physicians, patients and between healthcare organizations. Empower care, don’t detract from it.”

She went on to say that MGMA itself will be stepping up to help champion the cause of better HealthIT for patients AND physicians. In fact, Fischer-Wright was excited to talk about the new direction for MGMA as an organization. For most of its history, MGMA has reported on the healthcare industry from a physician practice perspective. Over the past year with the help of a supportive Board of Directors and active members, the MGMA leadership team has begun to shift the organization to a more prominent leadership role.

“We are going to take a much more active role in healthcare. We are going to focus on fixing healthcare from the ground up – from providers & patients upwards. In the next few years MGMA will be much bigger, much stronger and even more relevant to physician practices. We are forging partnerships with other key players in healthcare, federal/state/local governments and other associations/societies.”

Members should expect more conferences, more educational opportunities and more publications on a more frequent basis from MGMA going forward. Fischer-Wright also hinted at several new technology-related offerings but opted not to provide details. Looking at the latest news from MGMA on their revamped data-gathering/analytics, however, it would not be surprising if their new offerings were data related. MGMA is one of the few organizations that regularly collects information on and provides context on the state of physician practices in the US.

It will be exciting to watch MGMA evolve in the years ahead.

Exchange Value: A Review of Our Bodies, Our Data by Adam Tanner (Part 3 of 3)

Posted on January 27, 2017 | Written By

Andy Oram is an editor at O’Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space.

Andy also writes often for O’Reilly’s Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O’Reilly’s Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous part of this article raised the question of whether data brokering in health care is responsible for raising or lowering costs. My argument that it increases costs looks at three common targets for marketing:

  • Patients, who are targeted by clinicians for treatments they may not need or have thought of

  • Doctors, who are directed by pharma companies toward expensive drugs that might not pay off in effectiveness

  • Payers, who pay more for diagnoses and procedures because analytics help doctors maximize charges

Tanner flags the pharma industry for selling drugs that perform no better than cheaper alternatives (Chapter 13, page 146), and even drugs that are barely effective at all despite having undergone clinical trials. In any case, Tanner cites Hong Kong and Europe as places far more protective of personal data than the United States (Chapter 14, page 152), and they don’t suffer higher health care costs–quite the contrary.

Strangely, there is no real evidence so far that data sales have produced either harm to patients or treatment breakthroughs (Conclusion, 163). But the supermarket analogy does open up the possibility that patients could be induced to share anonymized data voluntarily by being reimbursed for it (Chapter 14, page 157). I have heard this idea aired many times, and it fits with the larger movement called Vendor Relationship Management. The problem with such ideas is the close horizon limiting our vision in a fast-moving technological world. People can probably understand and agree to share data for particular research projects, with or without financial reimbursement. But many researchers keep data for decades and recombine it with other data sets for unanticipated projects. If patients are to sign open-ended, long-term agreements, how can they judge the potential benefits and potential risks of releasing their data?

Data for sale, but not for treatment

In Chapter 11, Tanner takes up the perennial question of patient activists: why can drug companies get detailed reports on patient conditions and medications, but my specialist has to repeat a test on me because she can’t get my records from the doctor who referred me to her? Tanner mercifully shields us here from the technical arguments behind this question–sparing us, for instance, a detailed discussion of vagaries in HL7 specifications or workflow issues in the use of Health Information Exchanges–but strongly suggests that the problem lies with the motivations of health care providers, not with technical interoperability.

And this makes sense. Doctors do not have to engage in explicit “blocking” (a slippery term) to keep data away from fellow practitioners. For a long time they were used to just saying “no” to requests for data, even after that was made illegal by HIPAA. But their obstruction is facilitated by vendors equally uninterested in data exchange. Here Tanner discards his usual pugilistic journalism and gives Judy Faulkner an easy time of it (perhaps because she was a rare CEO polite enough to talk to him, and also because she expressed an ethical aversion to sharing patient data) and doesn’t air such facts as the incompatibilities between different Epic installations, Epic’s tendency to exchange records only with other Epic installations, and the difficulties it creates for companies that want to interconnect.

Tanner does not address a revolution in data storage that many patient advocates have called for, which would at one stroke address both the Chapter 11 problem of patient access to data and the book’s larger critique of data selling: storing the data at a site controlled by the patient. If the patient determined who got access to data, she would simply open it to each new specialist or team she encounters. She could also grant access to researchers and even, if she chooses, to marketers.
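
To make that access model concrete, here is a minimal sketch in Python of a patient-held record with per-party access grants. It is purely illustrative; the names and structure are my own assumptions, not a description of any existing system or API.

```python
from dataclasses import dataclass, field

@dataclass
class PatientHeldRecord:
    """A record that lives with the patient, who decides who may read it."""
    owner: str
    documents: list = field(default_factory=list)
    grants: set = field(default_factory=set)  # parties the patient has allowed to read

    def grant_access(self, party: str) -> None:
        self.grants.add(party)

    def revoke_access(self, party: str) -> None:
        self.grants.discard(party)

    def read(self, party: str) -> list:
        if party not in self.grants:
            raise PermissionError(f"{party} has no access grant from {self.owner}")
        return self.documents

record = PatientHeldRecord(owner="patient-001", documents=["2017-01 cardiology consult note"])
record.grant_access("new-specialist")     # the patient opens the record to a new clinician
print(record.read("new-specialist"))      # ['2017-01 cardiology consult note']
record.revoke_access("new-specialist")    # ...and can close it again later
```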

What we can learn from Chapter 9 (although Tanner does not tell us this) is that health care organizations are poorly prepared to protect data. In this woeful weakness they are just like TJX (owner of the T.J. Maxx stores), major financial institutions, and the Democratic National Committee. All of these leading institutions have suffered breaches enabled by weak computer security. Patients and doctors may feel reluctant to put data online in the current environment of vulnerability, but there is nothing special about the health care field that makes it more vulnerable than other institutions. Here again, storing the data with the individual patient may break it into smaller components and therefore make it harder for attackers to find.

Patient health records present new challenges, but the technology is in place and the industry can develop consent mechanisms to smooth out the processes for data exchange. Furthermore, some data will still remain with the labs and pharmacies that have to collect it for financial reasons, and the Supreme Court has given them the right to market that data.

So we are left with ambiguities throughout the area of health data collection. There are few clear paths forward and many trade-offs to make. In this I agree ultimately with Tanner. He said that his book was meant to open a discussion. Among many of us, the discussion has already started, and Tanner provides valuable input.

Exchange Value: A Review of Our Bodies, Our Data by Adam Tanner (Part 2 of 3)

Posted on January 26, 2017 | Written by Andy Oram

The previous part of this article summarized the evolution of data brokering in patient information and how it was justified ethically and legally, partly because most data is de-identified. Now we’ll take a look at just what that means.

The identified patient

Although doctors can be individually and precisely identified when they prescribe medicines, patient data is supposedly de-identified so that none of us can be stigmatized when trying to buy insurance, rent an apartment, or apply for a job. The effectiveness of anonymization or de-identification is one of the most hotly debated topics in health IT, and in the computer field more generally.

I have found a disturbing split between experts on this subject. Computer science experts don’t just criticize de-identification, but speak of it as something of a joke, assuming that it can easily be overcome by those with a will to do so. But those who know de-identification best (such as the authors of a book I edited, Anonymizing Health Data) point out that intelligently and professionally de-identified databases have been resistant to cracking, and that the highly publicized successes in re-identification have used databases that were de-identified unprofessionally and poorly. That said, many entities (including the South Korean institutions whose practices are described in Chapter 10, page 110 of Tanner’s book) don’t call on the relatively rare experts in de-identification to do things right, and therefore fall into the category of unprofessional and poor de-identification.

Tanner accurately pinpoints specific vulnerabilities in patient data, such as the inclusion of genetic information (Chapter 9, page 96). A couple of companies promise de-identified genetic data (Chapter 12, page 130, and Conclusion, page 162), which all the experts agree is impossible due to the wide availability of identified genomes out in the field for comparison (Conclusion, page 162).

Tanner has come down on the side of easy re-identification, having done research in many unconventional areas lacking professional de-identification. However, he occasionally misses a nuance, as when describing the re-identification of people in the Personal Genome Project (Chapter 8, page 92). The PGP is a uniquely idealistic initiative. People who join this project relinquish interest in anonymity (Chapter 9, page 96), declaring their willingness to risk identification in pursuit of the greater good of finding new cures.

In the US, no legal requirement for anonymization interferes with selling personal data collected on social media sites, from retailers, from fitness devices, or from genetic testing labs. For most brokers, no ethical barriers to selling data exist either, although Apple HealthKit bars it (Chapter 14, page 155). So more and more data about our health is circulating widely.

With all these data sets floating around–some supposedly anonymized, some tightly tied to your identity–is anonymization dead? Every anonymized data set already contains a few individuals who can theoretically be re-identified; determining that number is part of the technical process of de-identification. Will more and more of us fall into this category as time goes on, victims of advanced data mining and the “mosaic effect” (combining records from different data sets)? This is a distinct possibility for the future, but in the present, there are no examples of re-identifying data that was anonymized properly–the word properly being all-important here. (The authors of Anonymizing Health Data talk of defensible anonymization, meaning you can show you used research-vetted processes.) Even Latanya Sweeney, whom Tanner tries to portray in Chapter 9 as a relentless attacker who strips away the protections of supposedly de-identified data, believes that data can be shared safely and anonymously.
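
To give a concrete sense of how that re-identification risk can be measured, here is a minimal sketch of counting equivalence-class sizes over quasi-identifiers, one simplified piece of what formal de-identification work assesses. The records and field names are hypothetical.

```python
from collections import Counter

# Hypothetical de-identified rows holding only quasi-identifiers
# (field names and values are illustrative, not from any real data set).
records = [
    {"birth_year": 1954, "zip3": "021", "sex": "F"},
    {"birth_year": 1954, "zip3": "021", "sex": "F"},
    {"birth_year": 1987, "zip3": "945", "sex": "M"},
]

def equivalence_class_sizes(rows, quasi_identifiers):
    """Count how many rows share each combination of quasi-identifier values."""
    return Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)

def k_anonymity(rows, quasi_identifiers):
    """k is the size of the smallest equivalence class; rows in classes of
    size 1 are unique on these fields and are the easiest to re-identify."""
    return min(equivalence_class_sizes(rows, quasi_identifiers).values())

qi = ("birth_year", "zip3", "sex")
sizes = equivalence_class_sizes(records, qi)
print("k =", k_anonymity(records, qi))                            # k = 1
print("unique rows:", sum(1 for n in sizes.values() if n == 1))   # unique rows: 1
```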

To address people’s fretting over anonymization, I invoke the analogy of encryption. We know that our secret keys can be broken, given enough computing power. Over the decades, as Moore’s Law and the growth of large computing clusters have increased computing power, the recommended size of keys has also grown. But someday, someone will assemble the power (or find a new algorithm) that cracks our keys. We know this, yet we haven’t stopped using encryption. Why give up the benefits of sharing anonymized data, then? What hurts us are the illegal data breaches that happen on average more than once a day, not the hypothetical re-identification of patients.

To me, the more pressing question is what the data is being used for. No technology can be assessed outside of its economic and social context.

Almighty capitalism

One lesson I take from the creation of a patient data market, though Tanner doesn’t discuss it, is that the market exists as a side effect of high costs and large inefficiencies in health care generally. In countries that put more controls on doctors’ leeway to order drugs, tests, and other treatments, there is less wiggle room for the marketing of unnecessary or ineffective products.

Tanner does touch on the tendency of the data broker market toward monopoly or oligopoly. Once a company such as IMS Health builds up an enormous historical record, competing with it is hard. Although Tanner does not explore the effect of size on costs, it is reasonable to expect that low competition fosters padding in the prices of data.

Thus, I believe the inflated health care market leaves lots of room for marketing, and generally props up the companies selling data. The use of data for marketing may actually hinder its use for research, because marketers are willing to pay so much more than research facilities (Conclusion, pages 163-164).

Not everybody sells the data they collect. In Chapter 13, Tanner documents a complicated spectrum for anonymized data, ranging from unpublicized sales to requiring patient consent to forgoing all data sales (for instance, footnote 6 to Chapter 13 lists claims by Salesforce.com and Surescripts not to sell patient information). Tenuous as trust in reputation may seem, it does offer some protection to patients. Companies that want to be reputable make sure not to re-identify individual patients (Chapter 7, page 72, Chapter 9, pages 88-90, and Chapter 9, page 99). But data is so valuable that even companies reluctant to enter that market struggle with that decision.

The medical field has also pushed data collectors to make data into a market for all comers. The popular online EHR, Practice Fusion, began with a stable business model offering its service for a monthly fee (Chapter 13, page 140). But it couldn’t persuade doctors to use the service until it moved to an advertising and data-sharing model, giving away the service supposedly for free. The American Medical Association, characteristically, has also found a way to extract profit from sale of patient data, and therefore has colluded in marketing to doctors (Chapter 5, page 41, and Chapter 6, page 54).

Thus, a Medivo executive makes a good argument (Chapter 13, page 147) that the medical field benefits from research without paying for the dissemination of data that makes research possible. Until doctors pony up for this effort, another source of funds has to support the collection and research use of data. And if you believe that valuable research insights come from this data (Chapter 14, page 154, and Conclusion, page 166), you are likely to develop some appreciation for the market they have created. Another obvious option is government support for the collection and provision of data for research, as is done in Britain and some Nordic countries, and to a lesser extent in the US (Chapter 14, pages 158-159).

But another common claim, aired in this book by a Cerner executive (Chapter 13, page 143), is that giving health data to marketers reduces costs across the system, similarly to how supermarkets grant discounts to shoppers willing to have their purchases tracked. I am not convinced that costs are reduced in either case. In the case of supermarkets, their discounts may persuade shoppers to spend more money on expensive items than they would have otherwise. In health care, the data goes to very questionable practices. These become the topic of the last part of this article.

Exchange Value: A Review of Our Bodies, Our Data by Adam Tanner (Part 1 of 3)

Posted on January 25, 2017 | Written by Andy Oram

A lot of people are feeling that major institutions of our time have been compromised, hijacked, or perverted in some way: journalism, social media, even politics. Readers of Adam Tanner’s new book, Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records, might well add health care data to that list.

Companies collecting our data–when they are not ruthlessly trying to keep their practices secret–hammer us with claims that this data will improve care and lower costs. Anecdotal evidence suggests it does. But the way this data is used now, it serves the business agendas of drug companies and health care providers who want to sell us treatments we don’t need. When you add up the waste of unnecessary tests and treatments along with the money spent on marketing, as well as the data collection that facilitates that marketing, I’d bet it dwarfs any savings we currently get from data collection.

How we got to our current data collection practices

Tanner provides a bit of history of data brokering in health care, along with some intriguing personalities who pushed the industry forward. At first, there was no economic incentive to collect data–even though visionary clinicians realized it could help find new diagnoses and treatments. Tanner says that the beginnings of data collection came with the miracle drugs developed after World War II. Now that pharmaceutical companies had a compelling story to tell, ground-breaking companies such as IMS Health (still a major player in the industry) started to help them target physicians who had both the means of using their drugs–that is, patients with the target disease–and an openness to persuasion.

Lots of data collection initiatives started with good intentions, some of which paid off. Tanner mentions, as one example, a computer program in the early 1970s that collected pharmacy data in the pursuit of two laudable goals (Chapter 2, page 13): preventing patients from getting multiple prescriptions for the same drug, and preventing adverse interactions between drugs. But the collection of pharmacy data soon found its way to the current dominant use: a way to help drug companies market high-profit medicines to physicians.
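
As a concrete illustration of the two checks that early program performed, here is a toy sketch. The drug names, the interaction table, and the function are my own invention for the example, not the 1970s system and not clinical guidance.

```python
# Toy illustration of duplicate-therapy and drug-interaction checks;
# the data below is made up, not a real formulary or rule set.
ACTIVE_PRESCRIPTIONS = {
    "patient-001": ["warfarin", "metformin"],
}

FLAGGED_PAIRS = {
    frozenset(["warfarin", "aspirin"]),  # a combination this toy system warns about
}

def check_new_prescription(patient_id: str, new_drug: str) -> list:
    """Return warnings for duplicate therapy or flagged drug-drug interactions."""
    warnings = []
    current = ACTIVE_PRESCRIPTIONS.get(patient_id, [])
    if new_drug in current:
        warnings.append(f"duplicate prescription: {new_drug} is already active")
    for drug in current:
        if frozenset([drug, new_drug]) in FLAGGED_PAIRS:
            warnings.append(f"possible interaction: {new_drug} with {drug}")
    return warnings

print(check_new_prescription("patient-001", "aspirin"))
# ['possible interaction: aspirin with warfarin']
```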

The dual role of data collection–improving care but taking advantage of patients, doctors, and payers–persists over the decades. For instance, Tanner mentions a project by IMS Health (which he treats pretty harshly in Chapter 5) collecting personal data from AIDS patients in 1997 (Chapter 7, page 70). Tanner doesn’t follow through to say what IMS did with the AIDS data, but I am guessing that AIDS patients don’t offer juicy marketing opportunities, and that this initiative was aimed at improving the use and effectiveness of treatments for this very needy population. And Chapter 7 ends with a list of true contributions to patient health and safety created by collecting patient data.

Chapter 6 covers the important legal battles fought by several New England states (including the scrappy little outpost known for its worship of independent thinking, New Hampshire) to prevent pharmacies from selling data on what doctors are prescribing. These attempts were quashed by the well-known 2011 Supreme Court ruling on Vermont’s law. All questions of privacy and fairness were submerged by considering the sale of data to be a matter of free speech. As we have seen during several decisions related to campaign financing, the current Supreme Court has a particularly expansive notion of what the First Amendment covers. I just wonder what they will say when someone who breaks into the records of an insurer or hospital and steals several million patient records pleads free speech to override the Computer Fraud and Abuse Act.

Tanner has become intrigued, and even enamored, by the organization Patient Privacy Rights and its founder, Deborah Peel. I am closely associated with this organization and with Peel as well, working on some of their privacy summits and bringing other people into their circle. Because Tanner airs some criticisms of Peel, I’d like to proffer my own observation that she has made exaggerated and unfair criticisms of health IT in the past, but has moderated her views a great deal. Working with experts in health IT sympathetic to patient privacy, she has established Patient Privacy Rights during the 2010s as a responsible and respected factor in the health care field. So I counter Tanner’s repeated quotes calling Peel “crazy” (Chapter 8, page 83) by hailing her as a reputable and crucial force in modern health IT.

Coincidentally, Tanner refers (Chapter 8, page 79) to a debate that I moderated between IMS representative Kim Gray and Michelle De Mooy (available in a YouTube video). The discussion started off quite tame but turned up valuable insights during the question-and-answer period (starting at 38:33 in the video) about data sharing and the role of de-identification.

While the Supreme Court ruling stripped doctors of control over data about their practices–a bit of poetic irony, perhaps, if you consider their storage of patient data over the decades as an unjust taking–the question of patient rights was treated as irrelevant. The lawyer for the data miners said, “The patients have nothing to do with this” (Chapter 6, page 57) and apparently went unchallenged. How can patients’ interest in their own data be of no concern? For that question we need to look at data anonymization, also known as de-identification. This will begin the next section of our article.

The Pain of Recording Patient Risk Factors as Illuminated by Apixio (Part 2 of 2)

Posted on October 28, 2016 | Written by Andy Oram

The previous section of this article introduced Apixio’s analytics for payers in the Medicare Advantage program. Now we’ll step through how Apixio extracts relevant diagnostic data.

The technology of PDF scraping
Providers usually submit SOAP notes to the Apixio web site in the form of PDFs. This comes to me as a surprise, after hearing about the extravagant efforts that have gone into new CCDs and other formats such as the Blue Button project launched by the VA. Normally provided in an XML format, these documents claim to adhere to standards and offer a relatively gentle face to a computer program. In contrast, a PDF is one of the most challenging formats to parse: words and other characters are reduced to graphical symbols, while layout bears little relation to the human meaning of the data.

Structured documents such as CCDs contain only about 20% of what CMS requires, and often are formatted in idiosyncratic ways so that even the best CCDs would be no more informative than a Word document or PDF. But the main barrier to getting information, according to Schneider, is that Medicare Advantage works through the payers, and providers can be reluctant to give payers direct access to their EHR data. This reluctance springs from a variety of reasons, including worries about security, the feeling of being deluged by requests from payers, and a belief that the providers’ IT infrastructure cannot handle the burden of data extraction. Their stance has nothing to do with protecting patient privacy, because HIPAA explicitly allows providers to share patient data for treatment, payment, and operations, and that is what they are doing when they give sensitive data to Apixio in PDF form. Thus, Apixio had to master OCR and text processing to serve that market.

Processing a PDF requires several steps, integrated within Apixio’s platform:

  1. Optical character recognition to re-create the text from the page images in the PDF.

  2. Further structuring to recognize, for instance, when the PDF contains a table that needs to be broken up horizontally into columns, or constructs such as the field name “Diagnosis” followed by the desired data.

  3. Natural language processing to find the grammatical patterns in the text. This processing naturally must understand medical terminology, common abbreviations such as CHF, and coding systems.

  4. Analytics that pull out the data relevant to risk and present it in a usable format to a human coder.
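
To make steps 2 through 4 concrete, here is a minimal sketch of the flow in Python. It is my own toy illustration, not Apixio’s code: step 1 (OCR) is assumed to have already produced plain text, and the term-to-code mapping is a placeholder a real system would replace with full medical terminologies.

```python
import re

# Assume step 1 (OCR) has already turned a page of the PDF into plain text.
SAMPLE_NOTE_TEXT = """Assessment: CHF, stable on current regimen
Plan: continue diuretic, follow up in 3 months"""

def extract_fields(text):
    """Step 2 (simplified): pull out labeled fields such as 'Assessment:'."""
    fields = {}
    for match in re.finditer(r"(?m)^(Diagnosis|Assessment|Plan):\s*(.+)$", text):
        fields.setdefault(match.group(1).lower(), []).append(match.group(2).strip())
    return fields

# Steps 3 and 4 (greatly simplified): a placeholder mapping from recognized
# terms and abbreviations to candidate codes for a human coder to review.
TERM_TO_CODE = {"chf": "I50.9", "congestive heart failure": "I50.9"}

def find_candidate_codes(fields):
    candidates = []
    for values in fields.values():
        for value in values:
            for term, code in TERM_TO_CODE.items():
                if term in value.lower():
                    candidates.append({"evidence": value, "code": code})
    return candidates

print(find_candidate_codes(extract_fields(SAMPLE_NOTE_TEXT)))
# [{'evidence': 'CHF, stable on current regimen', 'code': 'I50.9'}]
```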

Apixio can accept dozens of notes covering the patient’s history. It often turns up diagnoses that “fell through the cracks,” as Schneider puts it. The diagnostic information Apixio returns can be used by medical professionals to generate reports for Medicare, but it has other uses as well. Apixio tells providers when they are treating a patient for an illness that does not appear in their master database. Providers can use that information to deduce when patients are left out of key care programs that can help them. In this way, the information can improve patient care. One coder they followed could triple her rate of reviewing patient charts with Apixio’s service.

Caught between past and future
If the Apixio approach to culling risk factors appears round-about and overwrought, like bringing in a bulldozer to plant a rosebush, think back to the role of historical factors in health care. Given the ways doctors have been taught to record medical conditions, and the tools available to them, Apixio plays a small part in promoting the progressive role of accountable care.

Hopefully, changes to the health care field will permit more direct ways to deliver accountable care in the future. Medical schools will convey the requirements of accountable care to their students and teach them how to record data that satisfies these requirements. Technologies will make it easier to record risk factors the first time around. Quality measures and the data needed by policy-makers will be clarified. And most of all, the advantages of collaboration will lead providers and payers to form business agreements or even merge, at which point the EHR data will be opened to the payer. The contortions providers currently need to go through, in trying to achieve 21st-century quality, remind us of where the field needs to go.

The Pain of Recording Patient Risk Factors as Illuminated by Apixio (Part 1 of 2)

Posted on October 27, 2016 | Written by Andy Oram

Many of us strain against the bonds of tradition in our workplace, harboring a secret dream that the industry could start afresh, streamlined and free of hampering traditions. But history weighs on nearly every field, including my own (publishing) and the one I cover in this blog (health care). Applying technology in such a field often involves the legerdemain of extracting new value from the imperfect records and processes with deep roots.

Along these lines, when Apixio aimed machine learning and data analytics at health care, they unveiled a business model based on measuring risk more accurately so that Medicare Advantage payments to health care payers and providers reflect their patient populations more appropriately. Apixio’s tools permit improvements to patient care, as we shall see. But the core of the platform they offer involves uploading SOAP notes, usually in PDF form, and extracting diagnostic codes that coders may have missed or that may not be supportable. Machine learning techniques extract the diagnostic codes for each patient over the entire history provided.

Many questions jostled in my mind as I talked to Apixio CTO John Schneider. Why are these particular notes so important to the Centers for Medicare & Medicaid Services (CMS)? Why don’t doctors keep track of relevant diagnoses as they go along in an easy-to-retrieve manner that could be pipelined straight to Medicare? Can’t modern EHRs, after seven years of Meaningful Use, provide better formats than PDFs? I asked him these things.

A mini-seminar ensued on the evolution of health care and its documentation. A combination of policy changes and persistent cultural habits has tangled up the various sources of information over many years. In the following sections, I’ll look at each aspect of the documentation bouillabaisse.

The financial role of diagnosis and risk
Accountable care, in varying degrees of sophistication, calculates the risk of patient populations in order to gradually replace fee-for-service with payments that reflect how adeptly the health care provider has treated the patient. Accountable care lay behind the Affordable Care Act and got an extra boost at the beginning of 2016 when CMS took on the "goal of tying 30 percent of traditional, or fee-for-service, Medicare payments to alternative payment models, such as ACOs, by the end of 2016 — and 50 percent by the end of 2018."

Although many accountable care contracts–like those of the much-maligned 1970s Managed Care era–ignore differences between patients, more thoughtful programs recognize that accurate and fair payments require measurement of how much risk the health care provider is taking on–that is, how sick their patients are. Thus, providers benefit from scrupulously complete documentation (having learned that upcoding and sloppiness will no longer be tolerated and will lead to significant fines, according to Schneider). And this would seem to provide an incentive for the provider to capture every nuance of a patient’s condition in a clearly coded, structured way.
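
As a rough illustration of why complete documentation changes the measured risk, consider a toy risk-score calculation. The condition weights below are hypothetical placeholders of my own, not actual CMS-HCC coefficients.

```python
# Toy risk-score aggregation: each documented condition adds a weight to a
# base score. The weights are hypothetical placeholders, not real CMS-HCC
# coefficients; the point is only that a missed diagnosis lowers the
# measured risk (and hence the payment) for the very same patient.
HYPOTHETICAL_WEIGHTS = {
    "diabetes_with_complications": 0.30,
    "congestive_heart_failure": 0.33,
    "copd": 0.34,
}

def patient_risk_score(documented_conditions, base=1.0):
    return base + sum(HYPOTHETICAL_WEIGHTS.get(c, 0.0) for c in documented_conditions)

fully_documented = ["diabetes_with_complications", "congestive_heart_failure"]
chf_missed = ["diabetes_with_complications"]

print(round(patient_risk_score(fully_documented), 2))  # 1.63
print(round(patient_risk_score(chf_missed), 2))        # 1.3
```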

But this is not how doctors operate, according to Schneider. They rebel when presented with dozens of boxes to check off, as crude EHRs tend to present things. They stick to the free-text SOAP note (fields for subjective observations, objective observations, assessment, and plan) that has been taught for decades. It’s often up to post-processing tools to code exactly what’s wrong with the patient. Sometimes the SOAP notes don’t even distinguish the four parts in electronic form, but exist as free-flowing Word documents.

A number of key diagnoses come from doctors who have privileges at the hospital but come in only sporadically to do consultations, and who therefore don’t understand the layout of the EHR or attempt to use what little structure it provides. Another reason codes get missed or don’t easily surface is that doctors are overwhelmed, so that accurately recording diagnostic information in a structured way is a significant extra burden, an essentially clerical function loaded onto these highly skilled healthcare professionals. Thus, extracting diagnostic information often involves “reading between the lines,” as Schneider puts it.

For Medicare Advantage payments, CMS wants a precise delineation of properly coded diagnoses in order to discern the risk presented by each patient. This is where Apixio comes in: by mining the free-text SOAP notes for information that can enhance such coding. We’ll see what they do in the next section of this article.

Has Electronic Health Record Replacement Failed?

Posted on June 23, 2016 | Written By

The following is a guest blog post by Justin Campbell, Vice President, Galen Healthcare.
A recent Black Book survey of hospital executives and IT employees who have replaced their Electronic Health Record system in the past three years paints a grim picture. Respondents report higher than expected costs, layoffs, declining revenues, disenfranchised clinicians and serious misgivings about the benefits of switching systems. Specifically:

  • 14% of all hospitals that replaced their original EHR since 2011 were losing inpatient revenue at a pace that wouldn’t support the total cost of their replacement EHR
  • 87% of hospitals facing financial challenges now regret the decision to change systems
  • 63% of executive level respondents admitted they feared losing their jobs as a result of the EHR replacement process
  • 66% of system users believe that interoperability and patient data exchange functionality have declined

Surely, this was not the outcome expected when hospitals rushed to replace paper records in response to Congressional incentives (and penalties) included in the 2009 American Recovery and Reinvestment Act.

But the disappointment reflected in this survey only sheds light on part of the story. The majority of hospitals depicted here were already in financial difficulty. It is understandable that they felt impelled to make a significant change and to do so as quickly as possible. But installing an electronic record system, or replacing one that is antiquated, requires much more than a decision to do so. We should not be surprised that a complex undertaking like this would be burdened by complicated and confusing challenges, chief among which turned out to be “usability” and acceptance.

Another Black Book report, this one from 2013, revealed:

  • 66% of doctors using EHR systems did not do so willingly
  • 87% of those unwilling to use the system claimed usability as their primary complaint
  • 84% of physician groups chose their EHR to reach meaningful use incentives
  • 92% of practices described their EHR as “clunky” and/or difficult to use

None of this should surprise us but we need to ask: was usability really the key driver for EHR replacement? Is usability alone accountable for lost revenue, employment anxiety and buyers’ remorse? Surely organizations would not have dumped millions into failed EHR implementations only to rip-and-replace them due to usability problems and provider dissatisfaction. Indeed, despite the persistence of functional obstacles such as outdated technology, hospitals continue to make new EMR purchases. Maybe the “reason for the rip-and-replace approach by some hospitals is to reach interoperability between inpatient and outpatient data,” wrote Dr. Donald Voltz, MD in EMR and EHR.

Interoperability is linked to another one of the main drivers of EHR replacement: the mission to support value-based care, that is, to improve the delivery of care by streamlining operations and facilitating the exchange of health information between a hospital’s own providers and the caregivers at other hospitals or health facilities. This can be almost impossible to achieve if hospitals have legacy systems that include multiple and non-communicative EHRs.

As explained by Chief Nurse Executive Gail Carlson, in an article for Modern Healthcare, “Interoperability between EHRs has become crucial for their successful integration of operations – and sometimes requires dumping legacy systems that can’t talk to each other.”

Many hospitals have numerous ancillary services, each with their own programs. The EHRs are often “best of breed.” That means they employ highly specialized software that provides excellent service in specific areas such as emergency departments, obstetrics or lab work. But communication between these departments is compromised because they display data differently.

In order to judge EHR replacement outcomes objectively, one needs not just to examine the near-term financials and sentiment (admittedly, replacement causes disruption and is not easy), but also to take a holistic view of the impact on the system’s portfolio by way of simplification and future positioning for value-based care. The majority of the negative sentiment and disappointing outcomes may actually stem from the migration and new system implementation process in and of itself. Many groups likely underestimated the scope of the undertaking and compromised new system adoption through a lackluster migration.

Not everyone plunged into the replacement frenzy. Some pursued a solution such as dBMotion to foster care for patients via intercommunication across all care venues. In fact, Allscripts acquired dBMotion to solve for interoperability between its inpatient solution (Eclipsys SCM) and its outpatient EMR offering (Touchworks). dBMotion provides a solution for those organizations with different inpatient and outpatient vendors, offering semantic interoperability, vocabulary management and an EMPI, and ultimately facilitating a true community-based record.

Yet others chose to optimize what they had, driven by financial constraints. There is a thin line separating EHR replacement from EHR optimization. This is especially true for those HCOs that are neither large enough nor sufficiently funded to be able to afford a replacement; they are instead forced to squeeze out the most value they can from their current investment.

The optimization path is much more pronounced with MEDITECH clients, where a large percentage of their base remains on the legacy MAGIC and C/S platforms.

Denni McColm, a hospital CIO, told healthsystemCIO why many MEDITECH clients are watching and waiting before they commit to a more advanced platform:

“We’re on MEDITECH’s Client/Server version, which is not their older version and not their newest version, and we have it implemented really everywhere that MEDITECH serves. So we have the hospital systems, home care, long-term care, emergency services, surgical center — all the way across the continuum. We plan to go to their latest version sometime in the next few years to get the ambulatory interface for the providers. It should be very efficient — reduced clicks, it’s mobile friendly, and our docs are anxious to move to it, but we’ll decide when the time is right,” she says.

What can we discern from these different approaches and studies? It’s too early to be sure of the final score. One thing is certain though: the migrations and archival underpinnings of system replacement are essential. They allow the replacement to deliver on the promise of improved usability and enhanced interoperability, and take us closer to the goal of value-based care.

About Justin Campbell
Justin is Vice President, Strategy, at Galen Healthcare Solutions. He is responsible for market intelligence, segmentation, business and market development and competitive strategy. Justin has been consulting in Health IT for over 10 years, guiding clients in the implementation, integration and optimization of clinical systems. He has been on the front lines of system replacement and data migration and is passionate about advancing interoperability in healthcare and harnessing analytical insights to realize improvements in patient care. Justin can be found on Twitter at @TJustinCampbell

Can Online Self-Scheduling Really Change the ER and Urgent Care Experience? – Communication Solutions Series

Posted on June 9, 2016 | Written By

The following is a guest blog post by Laura Alabed-Olsson, Marketing Manager of Stericycle Communication Solutions, as part of the Communication Solutions Series of blog posts. Follow and engage with them on Twitter:@StericycleComms
As a part of the team behind online self-scheduling solution InQuicker, I am asked this question a lot. When you’re dealing with the sickest of the sick, can online self-scheduling really make a difference? Yes, it can. Let’s begin by looking at things from a patient’s perspective.

Imagine you’re sick. Really sick. You haven’t showered in a day or so. You’re in your pajamas and buried under the covers in your bed. Even if your favorite ER or urgent care is the best of the best – think big-screen TV, a beverage bar and a tall stack of your favorite magazines – wouldn’t you rather wait at home than in this palace of a waiting room? Online self-scheduling makes this possible. You simply go to your preferred provider’s website. Select an estimated treatment time. Provide some basic information. And then you wait at home until it’s time to be seen. It’s that easy. (You still feel crummy, but at least a little bit happy that you won’t have to wait long when you get there, right?)

Now, let’s look at it from a provider’s perspective. With online self-scheduling, you have the benefit of knowing who’s coming in, why they’re seeking care, and when they’ll arrive – giving you plenty of time to prepare space and allocate resources. Online self-scheduling supports operational efficiency, big time.

Running behind and fearful that you can’t see a self-scheduled patient at their estimated treatment time? No problem. Just let them know when you’ll be ready, so that they can adjust their timing. Then, bask in the glow of knowing that when they do arrive, they’re certain to be happier than they would have been had they been sitting in the waiting room the entire time. Talk about getting the patient-provider relationship off on the right foot!

Today’s patients want – and increasingly expect – a patient-centric approach to healthcare. Online self-scheduling supports this (along with patient acquisition and retention, operational efficiency and care coordination). In fact, across the clients that use InQuicker for their online scheduling needs, we see patient satisfaction rates that average 90 percent.

Yes, online self-scheduling really can change (and improve) the ER/urgent care experience. Do you want happy patients and happy providers? Online self-scheduling could be the answer.

The Communication Solutions Series of blog posts is sponsored by Stericycle Communication Solutions, a leading provider of high quality telephone answering, appointment scheduling, and automated communication services. Stericycle Communication Solutions combines a human touch with innovative technology to deliver best-in-class communication services. Connect with Stericycle Communication Solutions on social media: @StericycleComms

Healthcare in an E-Commerce World – Communication Solutions Series

Posted on April 14, 2016 | Written By

The following is a guest blog post by Laura Alabed-Olsson, Marketing Manager of Stericycle Communication Solutions, as part of the Communication Solutions Series of blog posts. Follow and engage with them on Twitter:@StericycleComms
These days, it seems as though I can’t pick up an industry publication, or even a major daily newspaper, without finding at least one article on healthcare consumerism. Consumers want to shop for healthcare the way they shop for TVs and cars, they say. Consumers expect cost information, quality ratings and anytime access, too, they tell us.

All of this makes me wonder: For healthcare providers that have long operated in a traditional, not so consumer-centric world, where does one begin? Results from a handful of recent surveys offer some insights:

  1. More than 40% of consumers say that information found via social media affects the way they deal with their health.
  2. 77% of consumers use online reviews, often found on sites like Yelp and Healthgrades, as their first step in finding a new doctor.
  3. 56% of consumers have actively looked for healthcare cost information before getting care; 21% of these have compared prices across multiple providers.
  4. Consumers expect the same online service in healthcare that they see in other industries, and they will switch providers to get it.

So, let’s dig in.

Insight #1: 40% of consumers turn to social media for healthcare information. This statistic may not come as a surprise, especially when you consider the number of patients sitting in waiting rooms – or restaurants or coffee shops or wherever – with phone in hand, endlessly scrolling Facebook, Twitter or Instagram. What is surprising is how relatively few healthcare providers are pursuing this captive audience with educational content that accurately informs consumers about health-related issues (while simultaneously addressing demands for a “connected” experience). Is your organization leveraging social media to educate and engage with patients? Perhaps it should be.

Insight #2: 77% of consumers look to online reviews when choosing a provider. To further validate this point: Did you know that Healthgrades.com, the for-profit site that shares a variety of information about physicians, hospitals and other provider organizations, gets a million hits a day? Clearly, consumers have an appetite for information on patient satisfaction and clinical outcomes. Is this information readily available on your organization’s website? If you don’t provide it, others will and, in doing so, they are poised to steer prospective patients elsewhere.

Insight #3: 56% of consumers are paying attention to healthcare costs. While the idea of comparison shopping for healthcare is a relatively new one, it’s one that consumers and providers alike must embrace (consumers, because they’re increasingly accountable for a greater share of out-of-pocket costs, and providers, because cost transparency is the new norm – and if you want to effectively compete with traditional providers, retail clinics, telemedicine, docs-on-demand and whatever comes next, you’ve just got to get onboard). Is your organization empowering patients to make thoughtful decisions? A cost estimator on your website – or even a promise to have cost information available when patients request it – could make for a great start.

Insight #4: Healthcare consumers want an online experience that mirrors what’s being offered by retailers like Amazon, Southwest Airlines and OpenTable. When consumers want to book an airline ticket or reserve a table at their favorite restaurant, they don’t have to pick up the phone and call between 9 a.m. and 5 p.m. They hop online when it’s convenient for them and, in just a few clicks, they’ve gotten what they want. Why should healthcare be any different? By offering online self-scheduling on your website, you’re giving patients 24/7 access to care – and you’re doing it in a way that is familiar and convenient for them. Does your organization offer a way for consumers to access care when and how they want to? Research suggests it should.

Healthcare consumerism requires a significant shift in how providers serve patients, for sure. But in just a few, small steps – like those mentioned here – you can be on your way.

The Communication Solutions Series of blog posts is sponsored by Stericycle Communication Solutions, a leading provider of high quality telephone answering, appointment scheduling, and automated communication services. Stericycle Communication Solutions combines a human touch with innovative technology to deliver best-in-class communication services. Connect with Stericycle Communication Solutions on social media: @StericycleComms

Small Practice Marketing Strategies Twitter Chat (#KareoChat)

Posted on April 12, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Health IT Marketing and PR Awards 2016

Last week we held the Healthcare IT Marketing and PR conference which is organized by Healthcare Scene. By all accounts, the conference ran well and the feedback I’ve gotten is that people really enjoyed the event and the healthcare marketing and PR community we’ve built. During the event, we held the HITMC Awards and Kareo won the award for Best Social Media Program. This is a well-deserved honor since they put a lot of work into hosting the weekly #KareoChat.

Coming out of the conference, Kareo asked me if there were some topics from the conference that would work well for the #KareoChat audience of small practice physicians. After reviewing the sessions at the conference, I realized that there were a lot of lessons from the conference that could be applied to small practice marketing. In fact, many of the topics could be a #KareoChat of their own. With that said, they asked if I’d host this week’s #KareoChat based on topics from the conference. So, I decided to pull together a potpourri of topics that applied well to small practices.

Here’s a look at the topics for this week’s #KareoChat:

  1. When and why should a physician practice go through a rebranding? #KareoChat @HealthITMKTG
  2. How can you use your and your competitors’ online reviews (good and bad) to your benefit? #KareoChat @mdeiner
  3. Could small practices benefit from their own podcast? Is it worth it?  #KareoChat @GetSocialHealth @Resultant @jaredpiano
  4. How and when should small practices use visual content in their office? #KareoChat @csvishal2222
  5. How can the 4 communication preferences (Facts, Futures, Form, Feelings) help small physician practice marketing? #KareoChat @ChartCapture
  6. Where and how can we use the power of storytelling in small physician practice marketing? #KareoChat @ctrappe @stacygoebel

If you’d like to join us to discuss these topics, just follow the #KareoChat hashtag on Thursday, April 14th at Noon ET (9 AM PT). I expect it will be a really diverse and interesting chat across a wide variety of topics related to small practice marketing.

Full Disclosure: Kareo is an advertiser on one of the Healthcare Scene websites.