
Validic Survey Raises Hopes of Merging Big Data Into Clinical Trials

Posted on September 30, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Validic has been integrating medical device data with electronic health records, patient portals, remote patient monitoring platforms, wellness challenges, and other health databases for years. On Monday, they highlighted a particularly crucial and interesting segment of their clientele by releasing a short report based on a survey of clinical researchers. And this report, although it doesn’t go into depth about how pharmaceutical companies and other researchers are using devices, reveals great promise in their use. It also opens up discussions of whether researchers could achieve even more by sharing this data.

The survey broadly shows two trends toward the productive use of device data:

  • Devices can report changes in a subject’s condition more quickly and accurately than conventional subject reports (which involve marking observations down by hand or coming into the researcher’s office). Of course, this practice raises questions about the device’s own accuracy. Researchers will probably splurge for professional or “clinical-grade” devices that are more reliable than consumer health wearables.

  • Devices can keep the subject connected to the research for months or even years after the end of the clinical trial. This connection can turn up long-range side effects or other impacts from the treatment.

Together these advances address two of the most vexing problems of clinical trials: their cost (and length) and their tendency to miss subtle effects. The cost and length of trials form the backbone of the current publicity campaign by pharma companies to justify price hikes that have recently brought them public embarrassment and opprobrium. Regardless of the relationship between the cost of trials and the cost of the resulting drugs, everyone would benefit if trials could demonstrate results more quickly. Meanwhile, longitudinal research with massive amounts of data can reveal the kinds of problems that led to the Vioxx scandal–but also new off-label uses for established medications.

So I’m excited to hear that two-thirds of the respondents are using “digital health technologies” (which covers mobile apps, clinical-grade devices, and wearables) in their trials, and that nearly all respondents plan to do so over the next five years. Big data benefits are not the only ones they envision. Some of the benefits have more to do with communication and convenience–and these are certainly commendable as well. For instance, if a subject can transmit data from her home instead of having to come to the office for a test, the subject will be much more likely to participate and provide accurate data.

Another trend hinted at by the survey was a closer connection between researchers and patient communities. Validic announced the report in a press release that is quite informative in its own right.

So over the next few years we may enter the age that health IT reformers have envisioned for some time: a merger of big data and clinical trials that reaps the benefits of both. Now we must ask the researchers to multiply the value of the data along a whole new dimension by sharing it. This can be done in two ways: de-identifying results and uploading them to public or industry-maintained databases, or providing identified data to organizations the subject has approved. Although researchers are legally permitted to share de-identified information without subjects’ consent (depending on the agreements they signed when they began the trials), I would urge patient consent for all releases.
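To make the first path concrete, here is a minimal sketch (in Python, with hypothetical field names) of what de-identifying trial records before upload might look like: direct identifiers are dropped, and the subject ID is replaced with a salted one-way hash so a subject's records remain linkable across uploads without being identifiable. A real pipeline would follow the full HIPAA Safe Harbor or expert-determination rules, which cover far more than this toy does.

```python
import hashlib

# Direct identifiers to strip before sharing (an illustrative subset, not
# the full HIPAA Safe Harbor list, which names 18 identifier categories).
DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "birth_date"}

def de_identify(record, salt):
    """Return a shareable copy of a trial record: direct identifiers are
    dropped and the subject ID is replaced with a salted one-way hash, so
    uploads from the same subject can still be linked together."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    digest = hashlib.sha256((salt + str(record["subject_id"])).encode())
    clean["subject_id"] = digest.hexdigest()[:16]
    return clean

record = {"subject_id": "S-1042", "name": "Jane Doe",
          "birth_date": "1970-03-01", "glucose_mg_dl": 112,
          "device": "clinical-grade CGM"}
shared = de_identify(record, salt="trial-7-secret")
```

Because the hash is salted and deterministic, the same subject hashes to the same pseudonym within a trial, but outsiders cannot reverse it to recover the original ID.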

Pharma companies are already under intense pressure for hiding the results of trials–but even the new regulations cover only results, not the data that led to those results. Organizations such as Sage Bionetworks, which I have covered many times, are working closely with pharmaceutical companies and researchers to promote both the software tools and the organizational agreements that foster data sharing. Such efforts allow people in different research facilities and even on different continents to work on different aspects of a target and quickly share results. Even better, someone launching a new project can compare her data to a project run five years before by another company. Researchers will have millions of data points to work with instead of hundreds.

One disappointment in the Validic survey was that only a minority of respondents saw a return on investment from their use of devices. With responsible data sharing, the next Validic survey may raise that number considerably.

Please, No More HIE “Coming Of Age” Stories

Posted on September 29, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Today I read a Modern Healthcare story suggesting that health information exchanges are “coming of age,” and after reading it, I swear my eyes almost rolled back into my head. (An ordinary eye roll wouldn’t do.)

The story leads with the assertion that a new health data sharing deal, in which Texas Health Resources agreed to share data via a third-party HIE, suggests that such HIEs are becoming sustainable.

Author Joseph Conn writes that the 14-hospital system is coming together with 32 other providers sending data to Healthcare Access San Antonio, an entity which supports roughly 2,400 HIE users and handles almost 2.2 million patient records. He notes that the San Antonio exchange is one of about 150 nationwide, hardly a massive number for a country the size of the U.S.

In partial proof of his assertion that HIEs are finding their footing, he notes that from 2010 to 2015, the number of HIEs in the U.S. fluctuated but saw a net gain of 41%, according to federal stats. And he attributes this growth to pressure on providers to improve care, lower costs and strengthen medical research, or risk getting Medicare or Medicaid pay cuts.

I don’t dispute that there is increased pressure on providers to meet some tough goals. Nor am I arguing that many healthcare organizations believe that healthcare data sharing via an HIE can help them meet these goals.

But I would argue that even given the admittedly growing pressure from federal regulators to achieve certain results, history suggests that an HIE probably isn’t the way to get this done, as we don’t seem to have found a business model for them that works over the long term.

As Conn himself notes, seven recipients of federal, state-wide HIE grants issued by the ONC — awarded in Connecticut, Illinois, Montana, Nevada, New Hampshire, Puerto Rico and Wyoming — went out of business after the federal grants dried up. So we’re not talking only about HIEs’ ignoble history of sputtering out; we’re talking about fairly recent failures.

He also notes that a commercially-funded model, MetroChicago HIE, which connected more than 30 northeastern Illinois hospitals, went under earlier this year. This HIE failed because its most critical technology vendor suddenly went out of business with 2 million patient records in its hands.

As for HASA, the San Antonio exchange discussed above, it’s not just a traditional HIE. Conn’s piece notes that most of the hospitals in the Dallas-Fort Worth area have already implemented or plan to use an Epic EMR and share clinical messages using its information exchange capabilities. Depending on how robust the Epic data-sharing functions actually are, this might offer something of a solution.

But what seems apparent to me, after more than a decade of watching HIEs flounder, is that a data-sharing model relying on a third-party platform probably isn’t financially or competitively sustainable.

The truth is, a veteran editor like Mr. Conn (who apparently has 35 years of experience under his belt) must know that his reporting doesn’t sustain the assertion that HIEs are coming into some sort of golden era. A single deal undertaken by even a large player like Texas Health Resources doesn’t prove that HIEs are seeing a turnaround. It seems that some people think the broken clock that is the HIE model will be right at least once.

P.S. All of this being said, I admit that I’m intrigued by the notion of a “public utility” HIE. Are any of you associated with such a project?

Is Your EHR Contributing to Physician Burnout?

Posted on September 28, 2016 | Written By

The following is a guest blog post by Sara Plampin, Senior Instructional Writer from The Breakaway Group (A Xerox Company). Check out all of the blog posts in the Breakaway Thinking series.
It’s finally come, the day you’ve been working toward for years – Go Live. Thousands (or even millions) of dollars, hundreds of hours planning and calculating and going back to the drawing board, and it’s about to pay off. You sit back and take a breath, proudly watching as your organization takes its first steps into the future.

And then the complaints start to trickle in. The Electronic Health Record (EHR) feels clunky, it doesn’t match current workflows, documentation takes too long, and the physicians refuse to use it.

Frustrations over EHR functionality and increased documentation time are a leading cause of burnout among medical workers. Physician practices, in particular, are showing a decrease in EHR use over time. Physicians say hefty documentation requirements take away valuable face-to-face time with patients, making them feel more like scribes than doctors.

The issue has led physician groups to revive the ‘Quadruple Aim’ movement, which places greater emphasis on physician wellness.


While many are quick to attribute this dissatisfaction to the EHR itself, it is more likely the result of a poor implementation plan that focused more on technological requirements and less on long-term adoption needs. There are three ways to ensure the needs of physicians and clinical staff are met and you have a successful EHR adoption.

Involve Clinical Staff from the Get-Go
One of the biggest mistakes you can make is failing to include clinical staff in the initial decision-making process. Before choosing an EHR vendor, assemble a team of representatives from all areas of your organization – not just physicians and nurses. Ancillary departments such as therapy, radiology, and pharmacy are often overlooked when it comes to EHR design and training. Each representative will be aware of the specific needs and workflows for their department; they can compile requests from their colleagues and help research different vendor options to determine which EHR is the ideal match for your organization.

Once the EHR is selected, clinical staff members become an integral part of the design team. Although vendor representatives can help identify best practice workflows, ultimately your employees are the experts on how the EHR will be used in their department. Physicians surveyed by HIMSS cited five factors that contribute to EHR usability issues: navigation, data entry, structured documentation, interoperability, and clinical decision support. Involving clinicians in the design and testing phases allows them to identify solutions to some of these common issues, making the EHR more intuitive for future users.

Including members from all areas of the organization not only ensures better EHR selection and design – it also improves morale. When staff feel like their voices are heard, the project becomes a joint initiative rather than a regulation from upper management. Representatives from the design team act as a go-between, communicating their peers’ requests to executives, while in turn reinforcing the importance of the transition and garnering excitement for go live and beyond.

Realistic, Time-Effective Training
Once the EHR design is solid, the next step is to make sure all staff are properly trained and comfortable using the application. While this may seem obvious, training is another area where many organizations fall short. It is not just the amount of training that matters, but also the type and timing of training. Full-day classroom training sessions can be ineffective for adult learners. Additionally, planning training days around complicated shift schedules is difficult, as is finding replacement staff. This is particularly an issue at small physician practices, where physicians may have to sacrifice patient time in order to complete training.

A more modern, time-effective approach to training is online simulation. Learning is chunked into modules based on small tasks users may complete throughout their day. Thus, learning can be spread over days or weeks, whenever the physician has a free moment. Simulations allow learners to practice using the EHR, giving them the chance to fail without repercussions and develop muscle memory for daily tasks. By go live, using the EHR should feel like second nature.

A lot of the frustrations users feel about navigation and documentation requirements result from their unfamiliarity with the application. When they receive the right training, they will feel confident using the EHR, thus reducing documentation time and increasing face-to-face time with patients.

Constant Feedback/Reevaluation
As with all large-scale projects, even the best laid plans are bound to hit a snag or two. If you’ve established a solid communication channel with all department representatives, you will be prepared to handle any complaints that come your way after go live. It is important that all staff have a clear path to communicate problems and suggestions, and that they are comfortable doing so. The best way to avoid dissatisfaction among your employees is to hear their complaints and proactively fix these issues.

If you’ve already implemented an EHR and are now dealing with the types of complaints outlined above, this is the place for you to start. Create testing and measurement procedures to determine how users are currently using the EHR, where they are getting stuck, and where their actions deviate from prescribed workflows. Then, work with each department to determine where EHR functionality can be tweaked, workflows redesigned, or both. Effective adoption requires a constant cycle of communication, design, training, evaluation, and redesign.
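The measurement step can start surprisingly small. Here is a rough sketch (Python, with hypothetical event names; real EHRs expose usage data through audit logs in vendor-specific formats) of comparing logged click paths against a prescribed workflow to see where users first deviate or abandon a task:

```python
from collections import Counter

# Prescribed click path for one task, e.g. documenting a medication order.
PRESCRIBED = ["open_chart", "review_allergies", "enter_order", "sign_order"]

def first_deviation(observed, prescribed=PRESCRIBED):
    """Return the step index at which a user's click path first departs
    from the prescribed workflow, or None if it matches end to end."""
    for step, (seen, expected) in enumerate(zip(observed, prescribed)):
        if seen != expected:
            return step
    if len(observed) < len(prescribed):
        return len(observed)  # task abandoned before completion
    return None

def deviation_report(sessions):
    """Tally where users get stuck across many logged sessions, so each
    department can see which step needs a design or training fix."""
    counts = Counter(first_deviation(s) for s in sessions)
    counts.pop(None, None)  # drop fully compliant sessions
    return dict(counts)

sessions = [
    ["open_chart", "review_allergies", "enter_order", "sign_order"],
    ["open_chart", "enter_order"],       # skipped the allergy review
    ["open_chart", "review_allergies"],  # abandoned before ordering
]
report = deviation_report(sessions)
```

In this toy batch, the report shows one user skipping allergy review at step 1 and another abandoning the task at step 2: exactly the kind of evidence to bring to a department when deciding between a functionality tweak and a workflow redesign.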

If you want to make sure your employees are happy with the EHR and physicians avoid burnout, go live is just the beginning.

Xerox is a sponsor of the Breakaway Thinking series of blog posts.

The Required Shift in How Patients View Wearables

Posted on September 27, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This post is sponsored by Samsung Business. All thoughts and opinions are my own.

We’ve all seen the explosive growth that’s occurred in the wearables market. The most extraordinary part of the wearables explosion is that the majority of wearables growth has been in the healthcare space. The problem we now see in healthcare is that most people don’t look at wearables as a disease management tool as much as they see them as lifestyle tools. This was described really well by Megan Williams on the Samsung Insights blog:

Perhaps the most challenging part of meeting that desire [Physician Access to Patients’ Lives and Health] is the fact that patients mostly view wearables as an aid in lifestyle improvement instead of disease management. The task of helping patients understand that wearables are about much more than weight loss will fall squarely on the shoulders of providers.

Patients have traditionally shown a preference for lifestyle apps including fitness, nutrition and heart rate aids, and have been much slower to adopt disease management tools, even as chronic disease remains a burden on healthcare as a whole. Encouraging the use of a broader range of wearables, digital tools and apps will be a challenge for any provider.

Changing habits and perceptions is always a challenge. However, it’s also a great opportunity.

Few would argue that today’s wearables are much more than novelty items that may have some impact on your lifestyle (fitness, nutrition, etc.). That’s largely because the initial wearables were designed around those retail areas of the market. It’s much easier to create a retail wearable device than to create a disease management focused healthcare device.

As the healthcare wearables market matures, so will patients’ expectations about the benefits they can receive from those wearables. I think there are two main keys to the development of wearables as true healthcare devices: Depth of Tracking and Connection to Providers.

Depth of Tracking
I’ve argued for a while now that the various fitness trackers are not clinically relevant. I still believe that today, but I also believe that wearables like these will start tracking us in ways that are clinically relevant. That just takes a lot longer to develop.

Whether it’s new trackers that screen for sleep apnea or ECGs that monitor our heart, we’re seeing more and more wearable devices monitoring data that’s more clinically relevant than the number of steps you’ve taken. This trend will continue. As wearables more deeply track various parts of the human body, the opportunities to understand your health and improve your health will follow along with it. This will provide doctors the impetus to request access to your wearable data.

The deep data these wearables will provide will challenge the tried and true beliefs healthcare holds so dearly today. That can be scary for some, but is also very exciting.

Connection to Providers
While wearables will provide the data, we’ll still want to consult a healthcare provider to understand the data and to create a plan of action based on that data. At least in the foreseeable future, our health will depend on collaboration with healthcare providers as opposed to a replacement of healthcare providers. This will be particularly true as the type of data our wearables collect gets more complicated. Understanding your step chart is quite different than understanding your ECG.

In order to facilitate this collaboration, our wearables will have to be connected to our care providers. Note that I said care providers and not doctors. In some cases it might be our doctor, but in other cases it could be a nurse, care manager, social worker, or some other care provider. I’m hopeful that we eventually reach the point of a true care team that collaborates on our health. That’s a far cry from where most of our healthcare is today, but that is the hope.

If we can solve these two wearable challenges, deeper data and connected providers, then we’ll be well on our way to changing how patients view wearables. This shift won’t happen overnight, but I believe it will happen a lot quicker than most people imagine.

For more content like this, follow Samsung on Insights, Twitter, LinkedIn, YouTube and SlideShare.

As Patient Engagement Advances, It Raises Questions About Usefulness

Posted on September 26, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, specializing in open source, software engineering, and health IT.

Reading ONC’s recent summary of patient engagement capabilities at US hospitals left me feeling both hopeful and wistful. The ONC, as usual, is trying to show off how much progress the field of health IT has made since Meaningful Use started, and the statistics in this dashboard meet those goals. On the other hand, I look at the statistics and wonder when real patient empowerment will emerge from these isolated gains.

The ONC dashboard includes information both on raw data exchange–what Meaningful Use called view, download, and transmit (VDT)–and the uses of that data, which ultimately mean much more than exchange.

I considered at first how important I would find it to download hospital information. I certainly would like my doctors to get the results of tests performed there, and other information related to my status upon discharge, but these supposedly are sent to the primary care physician in a Continuity of Care Document (CCD). If I or a close relative of mine had a difficult or chronic condition, I would certainly benefit from VDT because I would have to be an active advocate and would need the documentation. My point here is that our real goal in health reform is coordinated care, rather than data transfer, and while VDT is an important first step, we must always ask who is using that information.

The ONC did not ask the hospitals how much of their data patients can download. God is in the details, and I am not confident that an affirmative answer to the question of downloading data means patients can get everything in their records. For instance, my primary care physician has a patient portal running on eClinicalWorks (not his choice, but the choice of the hospital with which he is affiliated). From this portal I can get only a few pieces of information, such as medications (which I happen to know already, since I am taking them) and lab results. Furthermore, on a lark, I downloaded the CCD and ran it through a checker the ONC provides online, and found that it earned a D grade for format accuracy. This dismal rating suggests that I couldn’t successfully upload the CCD to another doctor’s EHR.
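For readers curious what such a check looks like underneath, here is a toy illustration (Python, with a hand-picked subset of section codes) of scanning a CCD for required sections by their LOINC codes. The ONC's actual scorecard validates far more, including template IDs, vocabulary bindings, and structural conformance, so treat this only as a sketch of the idea:

```python
import xml.etree.ElementTree as ET

# LOINC codes for a few sections a usable CCD should carry
# (an illustrative subset, not the full certification rule set).
REQUIRED_SECTIONS = {
    "10160-0": "Medications",
    "30954-2": "Results",
    "48765-2": "Allergies",
}

def missing_sections(ccd_xml):
    """Return the names of required sections whose LOINC codes never
    appear as a <code> element in the document."""
    root = ET.fromstring(ccd_xml)
    # Tags carry the HL7 namespace, e.g. "{urn:hl7-org:v3}code".
    found = {el.get("code") for el in root.iter() if el.tag.endswith("code")}
    return sorted(name for code, name in REQUIRED_SECTIONS.items()
                  if code not in found)

# A skeletal document carrying only a medications section:
doc = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody><component><section>
    <code code="10160-0" codeSystem="2.16.840.1.113883.6.1"/>
  </section></component></structuredBody></component>
</ClinicalDocument>"""
gaps = missing_sections(doc)
```

Run on the skeleton above, the check reports the allergies and results sections as missing, which is the kind of gap that would explain a low grade from a real validator.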

Still, I don’t want to dismiss the successes in the report. VDT is officially enabled in 7 out of 10 hospitals, a 7-fold growth between 2013 and 2015. Although the dashboard laments that “Critical Access, medium, and small hospitals lag,” the lag is not all that bad. And the dashboard also shows advances in the crucial uses of that data, such as submitting amendments to the data.

A critical question in evaluating patient engagement is how Congress and the ONC define it. A summary of the new MACRA law lists several aspects of patient engagement measured under the new system:

  • Viewing, downloading, and transmitting, as defined above. As with the later Meaningful Use requirements, MACRA requires EHRs to offer an API, so that downloading can be done automatically.

  • Secure messaging. Many advances in treating chronic conditions depend on regular communications with patients, and messaging is currently the simplest means toward that goal. Some examples of these advances can be found in my article about a health app challenge. Conventional text messaging is all in plain text, and health care messaging must be secure to meet HIPAA requirements.

  • Educational materials. I discount the impact of static educational materials offered to patients with chronic conditions, whether in the form of print brochures or online. But educational materials are part of a coordinated care plan.

  • Incorporating patient-generated data. The MACRA requirements “ask providers to incorporate data contributed by the patient from at least one unique patient.” Lucky little bugger. How will he or she leverage this unprecedented advantage?

That last question is really the nub of the patient engagement issue. In Meaningful Use and MACRA, regulators often require a single instance of some important capability, because they know that once the health care provider has gone through the trouble of setting up that capability, extending it to all patients is less difficult. And it’s heartening to see that 37 percent of hospitals allowed patients to submit patient-generated data in 2015.

Before you accept data from a patient, you need extra infrastructure to make the data useful. For instance:

  • You can check for warning signals that call for intervention, such as an elevated glucose level. This capability suggests a background program running through all the data that comes in and flagging such warning signals.

  • You can evaluate device data to see progress or backsliding in the patient’s treatment program. This requires analytics that understand the meaning of the data (and that can handle noise) so as to produce useful reports.

  • You can create a population health program that incorporates the patient-generated data into activities such as monitoring epidemics. This is also a big analytical capability.
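The first bullet is the easiest to sketch. A background job scanning incoming patient-generated readings against threshold rules might look like the following (illustrative thresholds and field names, not clinical guidance; real programs would use clinician-set, per-patient limits):

```python
# Threshold rules for incoming patient-generated readings. The cutoffs
# here are placeholders for illustration only.
ALERT_RULES = {
    "glucose_mg_dl": lambda v: v >= 180,  # elevated glucose reading
    "systolic_bp":   lambda v: v >= 160,
    "resting_hr":    lambda v: v >= 110,
}

def flag_readings(readings):
    """Scan a batch of patient-generated readings and return those that
    warrant clinician review, as a background job might on each upload."""
    alerts = []
    for r in readings:
        rule = ALERT_RULES.get(r["measure"])
        if rule and rule(r["value"]):
            alerts.append(r)
    return alerts

batch = [
    {"patient": "p1", "measure": "glucose_mg_dl", "value": 145},
    {"patient": "p1", "measure": "glucose_mg_dl", "value": 212},
    {"patient": "p2", "measure": "systolic_bp",   "value": 171},
]
alerts = flag_readings(batch)
```

Even this trivial filter implies the surrounding infrastructure the bullets describe: somewhere the flagged readings must be routed to a clinician, logged, and reconciled with the rest of the record.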

Yes, I’m happy we’ve made progress in using data for patient engagement. A lot of other infrastructure also needs to be created so we can benefit from the big investment these advances required.

The Burden of Structured Data: What Health Care Can Learn From the Web Experience (Part 2 of 2)

Posted on September 23, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, specializing in open source, software engineering, and health IT.

The first part of this article summarized what Web developers have done to structure data, and started to look at the barriers presented by health care. This part presents more recommendations for making structured data work.

The Grand Scheme of Things
Once you start classifying things, it’s easy to become ensnared by grandiose pipe dreams and enter a free fall trying to design the perfect classification system. A good system is distinguished by knowing its limitations. That’s why microdata on the Web succeeded. In other areas, the field of ontology is littered with the carcasses of projects that reached too far. And health care ontologies always teeter on the edge of that danger.

Let’s take an everyday classification system as an example of the limitations of ontology. We all use genealogies. Imagine being able to sift information about a family quickly, navigating from father to son and along the trail of siblings. But even historical families, such as royal ones, introduce difficulties right away. For instance, children born out of wedlock should be shown differently from legitimate heirs. Modern families present even bigger headaches. How do you represent blended families where many parents take responsibilities of different types for the children, or people who provided sperm or eggs for artificial insemination?

The human condition is a complicated one not subject to easy classification, and that naturally extends to health, which is one of the most complex human conditions. I’m sure, for instance, that the science of mosquito-borne diseases moves much faster than the ICD standard for disease. ICD itself should be replaced with something that embodies semantic meaning. But constant flexibility must be the hallmark of any ontology.

Transgender people present another enormous challenge to ontologies and EHRs. They’re a test case for every kind of variation in humanity. Their needs and status vary from person to person, with no classification suiting everybody. These needs can change over time as people make transitions. And they may simultaneously need services defined for male and female, with the mix differing from one patient to the next.

Getting to the Point
As the very term “microdata” indicates, those who wish to expose semantic data on the Web can choose just a few items of information for that favored treatment. A movie theater may have text on its site extolling its concession stand, its seating, or its accommodations for the disabled, but these are not part of the microdata given to search engines.

A big problem in electronic health records is their insistence that certain things be filled out for every patient. Any item that is of interest for any class of patient must appear in the interface, a problem known in the data industry as a Cartesian explosion. Many observers counsel a “less is more” philosophy in response. It’s interesting that a recent article that complained of “bloated records” and suggested a “less is more” approach goes on to recommend the inclusion of scads of new data in the record, to cover behavioral and environmental information. Without mentioning the contradiction explicitly, the authors address it through the hope that better interfaces for entering and displaying information will ease the burden on the clinician.

The various problems with ontologies that I have explained throw doubt on whether EHRs can attain such simplicity. Patients are not restaurants. To really understand what’s important about a patient–whether to guide the clinician in efficient data entry or to display salient facts to her–we’ll need systems embodying artificial intelligence. Such systems always feature false positives and negatives. They also depend on continuous learning, which means they’re never perfect. I would not like to be the patient whose data gets lost or misclassified during the process of tuning the algorithms.

I do believe that some improvements in EHRs can promote the use of structured data. Doctors should be allowed to enter the data in the order and the manner they find intuitive, because that order and that manner reflect their holistic understanding of the patient. But suggestions can prompt them to save some of the data in structured format, without forcing them to break their trains of thought. Relevant data will be collected and irrelevant fields will not be shown or preserved at all.
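As a toy sketch of that kind of prompting (the note text, the regular expression, and the suggested fields are all invented here, not any particular EHR's feature), a system could scan free text for dose patterns and propose structured entries without breaking the doctor's train of thought:

```python
import re

# Hypothetical sketch: leave the doctor's free-text note intact, but suggest
# structured medication fields by spotting "<drug> <amount> mg" patterns in it.
note = "Started lisinopril 10 mg daily; patient tolerating well."

suggestions = [
    {"drug": m.group(1).lower(), "dose_mg": int(m.group(2))}
    for m in re.finditer(r"([A-Za-z]+)\s+(\d+)\s*mg\b", note)
]

# The doctor can accept or ignore each suggestion; the narrative is untouched.
print(suggestions)  # [{'drug': 'lisinopril', 'dose_mg': 10}]
```

The point of the sketch is the workflow, not the pattern matching: structure is offered as a suggestion after the fact, never demanded up front.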

The resulting data will be less messy than what we have in unstructured text currently, but still messy. So what? That is the nature of data. Analysts will make the best use of it they can. But structure should never get in the way of the information.

The Burden of Structured Data: What Health Care Can Learn From the Web Experience (Part 1 of 2)

Posted on September 22, 2016 | Written By


Most innovations in electronic health records, notably those tied to the Precision Medicine initiative that has recently raised so many expectations, operate by moving clinical information into structure of one type or another. This might be a classification system such as ICD, or a specific record such as “medications” or “lab results” with fixed units and lists of names to choose from. There’s no arguing against the benefits of structured data. But its costs are high as well. So we should avoid repeating old mistakes. Experiences drawn from the Web may have something to teach the health care field in respect to structured data.

What Works on the Web
The Web grew out of a structured data initiative. The dream of organizing information goes back decades, and was embodied in Standard Generalized Markup Language (SGML) years before Tim Berners-Lee stole its general syntax to create HTML and present information on the Web. SGML let a firm mark in its documents that FR927 was a part number whereas SG1 was a building. Any tags that met the author's fancy could be defined. This put semantics into documents. In other words, the meaning of text could be abstracted from the text and presented explicitly. Those semantics got stripped out of HTML. Although the semantic goals of SGML were re-introduced in XML, the successor to HTML, that language found only niche uses on the Web. JSON, another popular format for structured data, was reserved for data storage and exchange, not text markup.

Since the Web got popular, people have been trying to reintroduce semantics into it. There was Dublin Core, then RDF, then microdata in places like schema.org–just to list a few. Two terms denoting structured data on the Web, the Semantic Web and Linked Data, have been enthusiastically taken up by the World Wide Web Consortium and Tim Berners-Lee himself.

But none of these structured data initiatives are widely known among the Web-browsing public, probably because they all take a lot of work to implement. Furthermore, they run into the bootstrapping problem faced by nearly all standards: if your web site uses semantics that aren’t recognized by the browser, they’re just dropped on the ground (or even worse, the browser mangles your web pages).

Even so, recent years have seen an important form of structured data take off. When you look up a movie or restaurant on a major search engine such as Google, Yahoo!, or Bing, you'll see a summary of the information most people want to see: local showtimes for the movie, phone number and ratings for a restaurant, etc. This is highly useful (particularly on mobile devices) and can save you the trouble of visiting the web site from which the data comes. Google calls these summaries Rich Cards and Rich Snippets.
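For what it's worth, one common way a site hands this structured summary to search engines today is schema.org vocabulary embedded as JSON-LD (it can also be expressed as microdata attributes). A minimal sketch, with every value invented:

```python
import json

# Hypothetical restaurant listing in schema.org vocabulary, serialized as
# JSON-LD; a real site would embed this in a <script type="application/ld+json"> tag.
listing = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Corner Bistro",
    "telephone": "+1-555-0100",
    "openingHours": "Mo-Su 11:00-22:00",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.5,
        "reviewCount": 212,
    },
}

print(json.dumps(listing, indent=2))
```

Note that, as with the movie theater example earlier, everything else on the page, the prose about the concession stand or the seating, simply stays out of the markup.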

If my memory serves me right, the basis for these snippets didn't come from standards committees involving years of negotiation among stakeholders. Google just decided what would be valuable to its users and laid out the standard. It got adopted because it was a win-win: the movie theaters and restaurants got their information right into the viewer's face, and the search engine became instantly more valuable and more likely to be used again. The visitors doing the search obviously benefitted too. Everyone found it worth their time to implement the standard.

Interestingly, as structure moves into metadata, HTML itself is getting less semantic. The most recent standard, HTML5, did add a few modest tags such as header and footer. But many sites are replacing meaningful HTML markup, such as p for paragraph, with two ultra-generic tags: div for a division that is set off from other parts of the page, and span for a piece of text embedded within another. Formatting is expressed through CSS, a separate language.

Having reviewed a bit of Web history, let’s see what we can learn from it and apply to health care.

Make the Customer Happy
Win-win is the key to getting a standard adopted. If your clinician doesn’t see any benefit from the use of structured data, she will carp and bristle at any attempt to get her to enter it. One of the big reasons electronic health records are so notoriously hard to use is, “All those fields to fill out.” And while lists of medications or other structured data can help the doctor choose the right one, they can also help her enter serious errors–perhaps because she chose the one next to the one she meant to choose, or because the one she really wanted isn’t offered on the list.

Doctors’ resentment gets directed against every institution implicated in the structured data explosion: the ONC and CMS, who demand quality data and other fields of information for their own inscrutable purposes; the vendor who designs the clunky system; and the hospital or clinic that forces doctors to use it. But the Web experience suggests that doctors would fill out fields that help them do their jobs. The use of structured data should be negotiated, not dictated, just like other innovations such as hand-washing protocols or checklists. Is it such a radical notion to put technology at the service of the people using it?

I know it’s frustrating to offer that perspective, because many great things come from collecting data that is used in analytics and can turn up unexpected insights. If we fill out all those fields, maybe we’ll find a new cure! But the promised benefit is too far off and too speculative to justify the hourly drag upon the doctor’s time.

We can fall back on the other hope for EHR improvement: an interface that makes data entry so easy that doctors don’t mind using structured fields. I have some caveats to offer about that dream, which will appear in the second part of this article.

Security and Privacy Are Pushing Archiving of Legacy EHR Systems

Posted on September 21, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

A recent McAfee Labs Threats Report stated that “On average, a company detects 17 data loss incidents per day.” That stat is almost too hard to comprehend. No doubt it makes HIPAA compliance officers’ heads spin.

What’s even more disturbing from a healthcare perspective is that the report identifies hospitals as the easy targets for ransomware and that the attacks are relatively unsophisticated. Plus, one of the biggest healthcare security vulnerabilities is legacy systems. This is no surprise to me since I know so many healthcare organizations that set aside, forget about, or de-prioritize security when it comes to legacy systems. Legacy system security is the ticking time bomb of HIPAA compliance for most healthcare organizations.

In a recent EHR archiving infographic and archival whitepaper, Galen Healthcare Solutions highlighted that “50% of health systems are projected to be on second-generation technology by 2020.” From a technology perspective, we’re all saying that it’s about time we shift to next generation technology in healthcare. However, from a security and privacy perspective, this move is really scary. This means that 50% of health systems are going to have to secure legacy healthcare technology. If you take into account smaller IT systems, 100% of health systems have to manage (and secure) legacy technology.

Unlike other industries, where you can simply decommission legacy systems, healthcare is bound by Federal and State laws that require retention of health data for lengthy periods of time. Galen Healthcare Solutions’ infographic offered this great chart to illustrate legacy healthcare system retention requirements across the country:
[Infographic: healthcare legacy system retention requirements]

Every healthcare CIO had better have a solid strategy for how they’re going to deal with legacy EHR and other health IT systems. This includes ensuring easy access to legacy data and ensuring that the legacy systems stay secure.

While many health systems used to leave their legacy systems running off in a corner of their data center or on a random desk in the hospital, I’m seeing more and more healthcare organizations consolidate their EHR and health IT systems into some sort of healthcare data archive. Galen Healthcare Solutions has put together a really impressive whitepaper that dives into all the details associated with healthcare data archives.

A healthcare data archive has a lot of advantages. It retains the data to meet record-retention laws, provides easy access to the data for end users, and simplifies security, since you then have to secure only one health data archive instead of multiple legacy systems. While some think EHR data archiving is expensive, the ROI turns out to be much better than you’d expect once you factor in the maintenance costs of legacy systems, the security risks of these outdated platforms, and the other compliance and access issues that come with them.
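As a back-of-the-envelope illustration of that ROI argument (every figure below is hypothetical, not taken from the whitepaper), the comparison is simple arithmetic: ongoing maintenance across several legacy systems versus a one-time archive migration plus much lower annual upkeep.

```python
# Hypothetical figures for illustration only.
legacy_systems = 4
annual_maintenance_per_system = 120_000  # licensing, hardware, support
archive_one_time_cost = 250_000          # migration and validation
archive_annual_cost = 40_000             # hosting and administration
years = 3

legacy_total = legacy_systems * annual_maintenance_per_system * years
archive_total = archive_one_time_cost + archive_annual_cost * years

print(f"Legacy: ${legacy_total:,}; Archive: ${archive_total:,}; "
      f"Savings: ${legacy_total - archive_total:,}")
```

And this leaves out the harder-to-quantify items mentioned above, breach risk and compliance exposure, which only strengthen the archive's case.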

I have no doubt that as EHR vendors and health IT systems continue consolidating, we’re going to have an explosion of legacy EHR systems that need to be managed and dealt with by every healthcare organization. Those organizations that treat this lightly will likely pay the price when their legacy systems are breached and their organization is stuck in the news for all the wrong reasons.

Galen Healthcare Solutions is a sponsor of the Tackling EHR & EMR Transition Series of blog posts on Hospital EMR and EHR.

Can Machine Learning Tame Healthcare’s Big Data?

Posted on September 20, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Big data is both a blessing and a curse. The blessing is that if we use it well, it will tell us important things we don’t know about patient care processes, clinical improvement, outcomes and more. The curse is that if we don’t use it, we’ve got a very expensive and labor-hungry boondoggle on our hands.

But there may be hope for progress. One article I read today suggests that another technology may hold the key to unlocking these blessings: that machine learning may be the tool which lets us harvest the big data fields. The piece, whose writer, oddly enough, was cited only as “Mauricio,” lead cloud expert at Cloudwards.net, argues that machine learning is “the most effective way to excavate buried patterns in the chunks of unstructured data.” While I am an HIT observer rather than a techie, what limited tech knowledge I possess suggests that machine learning is going to play an important role in the future of taming big data in healthcare.

In the piece, Mauricio notes that big data is characterized by high volume (both structured and unstructured data), high velocity (data flowing into databases every working second), variety (everything from texts and email to audio and financial transactions), complexity (data coming from multiple incompatible sources), and variability (fluctuating data flow rates).

Though his is a general analysis, I’m sure we can agree that healthcare big data specifically matches his description. I don’t know if you who are reading this include wild cards like social media content or video in your big data repositories, but even if you don’t, you may well in the future.

Anyway, for the purposes of this discussion, let’s summarize by saying that in this context, big data isn’t just made of giant repositories of relatively normalized data, it’s a whirlwind of structured and unstructured data in a huge number of formats, flooding into databases in spurts, trickles and floods around the clock.

To Mauricio, an obvious choice for extracting value from this chaos is machine learning, which he defines as a data analysis method that automates extrapolated model-building algorithms. In machine learning models, systems adapt with little or no human intervention, automatically applying customized algorithms and mathematical calculations to big data. “Machine learning offers a deeper insight into collected data and allows the computers to find hidden patterns which human analysts are bound to miss,” he writes.

According to the author, there are already machine learning models in place which help predict the appearance of genetically-influenced diseases such as diabetes and heart disease. Other possibilities for machine learning in healthcare – which he doesn’t mention but are referenced elsewhere – include getting a handle on population health. After all, an iterative learning technology could be a great choice for making predictions about population trends. You can probably think of several other possibilities.
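To make the pattern-finding idea concrete, here is a minimal sketch of one of the simplest learning methods, a nearest-centroid classifier in plain Python. The readings and labels are invented for illustration; a real disease-risk model would be trained on far richer clinical data with far more sophisticated algorithms.

```python
from statistics import mean

# Toy training data: (glucose, BMI) readings labeled by outcome (all invented).
train = {
    "low_risk":  [(85, 22.0), (90, 24.5), (95, 23.1)],
    "high_risk": [(150, 31.0), (160, 29.5), (170, 33.2)],
}

# "Learning" here is just computing one centroid per class from the examples.
centroids = {
    label: tuple(mean(dim) for dim in zip(*points))
    for label, points in train.items()
}

def predict(reading):
    """Assign the label whose centroid is nearest in squared distance."""
    return min(
        centroids,
        key=lambda label: sum((a - b) ** 2 for a, b in zip(reading, centroids[label])),
    )

print(predict((155, 30.0)))  # lands near the high-risk cluster: high_risk
```

The appeal for population health is that the same loop, fed new examples over time, keeps refining its centroids, which is exactly the iterative learning the article describes.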

Now, like many other industries, healthcare suffers from a data silo problem, and we’ll have to address that issue before we create the kind of multi-source, multi-format data pool that Mauricio envisions. Leveraging big data effectively will also require people to cooperate across departmental and even organizational boundaries, as John Lynn noted in a post from last year.

Even so, it’s good to identify tools and models that can help get the technical work done, and machine learning seems promising. Have any of you experimented with it?

Will a Duo of AI and Machine Learning Catch Data Thieves Lurking in Hospital EHR Corridors?

Posted on September 19, 2016 | Written By

The following is a guest blog post by Santosh Varughese, President of Cognetyx, an organization devoted to using artificial intelligence and machine learning innovation to bring an end to the theft of patient medical data.
As Halloween approaches, the usual spate of horror movies will intrigue audiences across the US, replete with slashers named Jason or Freddy running amok in the corridors of all-too-easily-accessible hospitals. They grab a hospital gown and fit right in with the zombies. While a movie can be turned off, the real horror of patient data theft can follow you.

(I know how terrible this type of crime can be. I myself have been the victim of a data theft by hackers who stole my deceased father’s medical files, running up more than $300,000 in false charges. I am still disputing on-going bills that have been accruing for the last 15 years).

Unfortunately, this horror movie scenario is similar to how data thefts often occur at medical facilities. In 2015, healthcare was among the three hardest-hit industries for serious data breaches and major attacks, along with government and manufacturing. Medical records are packed with a wealth of exploitable information, such as credit card data, email addresses, Social Security numbers, employment information and medical histories, much of which remains valid for years, if not decades, and fetches a high price on the black market.

Who Are The Hackers?
It is commonly believed that attacks come from outside intruders looking to steal valuable patient data, and about 45 percent of hacks are indeed external. However, the “phantom” hackers are often your colleagues, employees and business associates who are unwittingly careless with passwords or lured by phishing schemes that open the door for data thieves. Not only is data stolen; the privacy violations are insidious.

The problem is not only high-tech, but also low-tech, requiring that providers across the continuum simply become smarter about data protection and privacy issues. Medical facilities are finding they must teach doctors and nurses not to click on suspicious links.

For healthcare consultants, here is a great opportunity not only to help end this industry-wide problem, but to build up your client base by implementing new technologies that help medical facilities bring an end to data theft. With EHRs more vulnerable than ever before, CIOs and CISOs are looking for new solutions. These range from thwarting accidental and purposeful hackers through physical security procedures to securing network hardware and storage media with measures like maintaining a visitor log, installing security cameras, limiting physical access to server rooms, and restricting the ability to remove devices from secure areas.

Of course, enterprise solutions that cover the entire hospital system with new innovations are the best way to cast a digital safety net over all IT operations, leaving administrators and patients with a sense of security and safety.

Growing Nightmare
Medical data theft is a growing national nightmare.  IDC’s Health Insights group predicts that 1 in 3 healthcare recipients will be the victim of a medical data breach in 2016.  Other surveys found that in the last two years, 89% of healthcare organizations reported at least one data breach, with 79% reporting two or more breaches. The most commonly compromised data are medical records, followed by billing and insurance records. The average cost of a healthcare data breach is about $2.2 million.

At health insurer Anthem, Inc., foreign hackers stole up to 80 million records, using social engineering to dig their way into the company’s network with the credentials of five tech workers. The hackers stole names, Social Security numbers and other sensitive information, but were thwarted when an Anthem computer system administrator discovered outsiders were using his own security credentials to log into the company system and hack databases.

Investigators believe the hackers compromised the tech workers’ security through a phishing scheme that tricked an employee into unknowingly revealing a password or downloading malicious software. With this login information, the hackers were able to access the company’s database and steal files.

Healthcare Hacks Spread Hospital Mayhem in Diabolical Ways
Not only is current patient data security an issue; thieves can also drain the electronic economic blood from hospitals’ jugular vein: their IT systems. Hospitals increasingly rely on cloud delivery of big enterprise data from start-ups like iCare that can predict epidemics, cure disease, and avoid preventable deaths. They also plug Personal Health Record apps from fitness trackers like Fitbit and Jawbone into those systems.

Banner Health, operating 29 hospitals in Arizona, had to notify millions of individuals that their data was exposed. The breach began when hackers gained access to payment card processing systems at some of its food and beverage outlets. That apparently also opened the door to the attackers accessing a variety of healthcare-related information.

Because Banner Health says its breach began with an attack on payment systems, it differs from other recent hacker breaches. While payment system attacks have plagued the retail sector, they are almost unheard of among healthcare entities.

What also makes this breach more concerning is the question of how hackers reached healthcare systems after breaching payment systems at food and beverage facilities, when these networks should be completely separated from one another. Healthcare system networks are very complex, and they become more complicated as other business functions are added to the infrastructure, even functions that have nothing to do with systems handling protected health information.

Who hasn’t heard of “ransomware”? One of the first widely reported attacks hit Hollywood Presbyterian Medical Center, which had its EHR and clinical information systems shut down for more than a week. The systems were restored after the hospital paid $17,000 in Bitcoin.

Will Data Thieves Also Rob Us of Advances in Healthcare Technology?
Is the data theft at MedStar Health, a major healthcare system in the DC region, a foreboding sign that an industry racing to digitize and interoperate EHRs faces a new kind of security threat it is ill-equipped to handle? Hospitals are focused on keeping patient data from falling into the wrong hands, but the attacks at MedStar and other hospitals highlight an even more frightening downside of security breaches as hospitals strive for IT interoperability. Is that goal now at risk?

As hospitals increasingly depend on EHRs and other IT systems to coordinate care, communicate critical health data and avoid medication errors, they could also be risking patients’ well-being when hackers strike. While chasing the latest medical innovations, healthcare facilities are rapidly learning that caring for patients also means protecting their medical records and technology systems against theft and privacy violations.

“We continue the struggle to integrate EHR systems,” says anesthesiologist Dr. Donald M. Voltz, Medical Director of the Main Operating Room at Aultman Hospital in Canton, OH, and an advocate and expert on EHR interoperability. “We can’t allow patient data theft and privacy violations to become an insurmountable problem and curtail the critical technology initiative of resolving health system interoperability. Billions have been pumped into this initiative and it can’t be risked.”

Taking Healthcare Security Seriously
Healthcare is an easy target. Its security systems tend to be less mature than those of other industries, such as finance and tech. Its doctors and nurses depend on data to perform time-sensitive and life-saving work.

Where a financial-services firm might spend a third of its budget on information technology, hospitals spend only about 2% to 3%. Healthcare providers are averaging less than 6% of their information technology budget expenditures on security, according to a recent HIMSS survey. In contrast, the federal government spends 16% of its IT budget on security, while financial and banking institutions spend 12% to 15%.

Meanwhile, the number of healthcare attacks over the last five years has increased 125%, as the industry has become an easy target. Personal health information is 50 times more valuable on the black market than financial information. Stolen patient health records can fetch as much as $363 per record.

“If you’re a hacker… would you go to Fidelity or an underfunded hospital?” says John Halamka, the chief information officer of Beth Israel Deaconess Medical Center in Boston. “You’re going to go where the money is and the safe is the easiest to open.”

Many healthcare executives believe that the healthcare industry is at greater risk of breaches than other industries. Despite these concerns, many organizations have either decreased their cyber security budgets or kept them the same. While the healthcare industry has traditionally spent a small fraction of its budget on cyber defense, it has also not shored up its technical systems against hackers.

Disrupting the Healthcare Security Industry with Behavior Analysis   
Common defenses for keeping patient data safe have included firewalls and keeping the organization’s operating systems, software, anti-virus packages and other protective solutions up-to-date. The task of constantly updating and patching security gaps is ongoing and will invariably be less than 100% effective at any given time. And with only about 10% of healthcare organizations having escaped a data breach, sophisticated hackers are clearly penetrating these perimeter defenses and winning the healthcare data security war. So it’s time for a disruption.

Many organizations employ network surveillance tactics to prevent the misuse of login credentials. These involve behavior analysis, a technique the financial industry uses to detect credit card fraud. With some leading-edge additions, behavior analysis can give C-suite healthcare executives a cutting-edge, game-changing tool.

The technology relies on the proven power of cloud computing to combine artificial intelligence with machine learning algorithms, creating and deploying “digital fingerprints” that use ambient cognitive cyber surveillance to cast a net over EHRs and other hospital data sanctuaries. It exposes deviations in user behavior around EHRs and other applications containing PHI that humans would miss, and it can not only augment current defenses against outside hackers and malicious insiders, but also flag problem employees who continually violate cyber security policy.

“Hospitals have been hit hard by data theft,” said Doug Brown, CEO, Black Book Research. “It is time for them to consider new IT security initiatives. Harnessing machine learning artificial intelligence is a smart way to sort through large amounts of data. When you unleash that technology collaboration, combined with existing cloud resources, the security parameters you build for detecting user pattern anomalies will be difficult to defeat.”

While the technology is advanced, the concept is simple. A pattern of user behavior is established, and any action that deviates from it, such as logging in from a new location or accessing a part of the system the user normally doesn’t touch, is flagged. Depending on the deviation, the user may be required to provide further authentication to continue, or may be blocked until a system administrator can investigate the issue.
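A stripped-down sketch of that flagging logic (user names, locations, and record types are all invented, and a production system would model behavior statistically rather than with fixed sets):

```python
# Each user's "digital fingerprint": the locations and record types
# observed in their normal working pattern (hypothetical data).
baseline = {
    "dr_smith": {
        "locations": {"ward-3", "clinic-a"},
        "record_types": {"labs", "meds"},
    },
}

def flag_access(user, location, record_type):
    """Return True when an access deviates from the user's established pattern."""
    profile = baseline.get(user)
    if profile is None:
        return True  # no baseline yet: treat as a deviation
    return (location not in profile["locations"]
            or record_type not in profile["record_types"])

print(flag_access("dr_smith", "ward-3", "labs"))          # False: normal pattern
print(flag_access("dr_smith", "billing-office", "labs"))  # True: new location
```

In the flagged case the system would then demand further authentication or hold the session for an administrator, as described above.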

The cost of this technology will be positively impacted by the continuing decline in the cost of storage and processing power from cloud computing giants such as Amazon Web Services, Microsoft and Alphabet.

The healthcare data security war can be won, but it will require action and commitment from the industry. In addition to allocating adequate human and monetary resources to information security and training employees on best practices, the industry would do well to implement network surveillance that includes behavior analysis. It is the single best technological defense against the misuse of medical facility systems and the most powerful weapon the healthcare industry has in its war against cyber criminals.