Has Amazon Brought Something New To Healthcare Data Analytics?

Posted on November 29, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Amazon’s announcement that it was getting into healthcare data analytics didn’t come as a major surprise. It was just a matter of time.

After all, the retail giant has been making noises about its health IT ambitions for a while now, and its super-sneaky 1492 team’s healthcare feints have become common knowledge.

Now, news has broken that its massive hosting division, Amazon Web Services, is offering its Comprehend Medical platform to the healthcare world. And at the risk of being a bit too flip, my reaction is “so?” I think we should all take a breath before we look at this in apocalyptic terms.

First, what does Amazon say we’re looking at here?

Like similar products targeting niches like travel booking and supply-chain management, the company reports, Comprehend Medical uses natural language processing and machine learning to pull together relevant information from unstructured text.

Amazon says Comprehend Medical can pull needed information from physician notes, patient health records and clinical trial reports, tapping into data on patient conditions and medication dosage, strength and frequency.

The e-retailer says that users can access the platform through a straightforward API call, accessing Amazon’s machine learning expertise without having to do their own development or train models of their own. Use cases it suggests include medical cohort analysis, clinical decision support and improving medical coding to tighten up revenue cycle management.

Comprehend Medical customers are charged monthly based on the amount of text they process: $0.01 per 100-character unit for the NERe API, which extracts entities, entity relationships, entity traits and PHI, or $0.0014 per unit for the PHId API, which only identifies PHI for data protection purposes.
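
For developers wondering what this looks like in practice, here is a minimal sketch of calling the service through the AWS SDK for Python (boto3). Treat the method and field names as a rough guide drawn from AWS's public documentation rather than gospel, and note that the sample clinical sentence is invented:

```python
# A minimal sketch of calling Comprehend Medical from Python via boto3.
# Confirm client, method, and response field names against the current
# boto3 documentation before relying on them.
import boto3

client = boto3.client("comprehendmedical", region_name="us-east-1")

note = "Patient reports taking metformin 500 mg twice daily for type 2 diabetes."

# Entity extraction (the "NERe" side): conditions, medications, dosage, PHI, etc.
entities = client.detect_entities(Text=note)
for entity in entities["Entities"]:
    print(entity["Category"], entity["Type"], entity["Text"], entity["Score"])

# PHI-only detection (the "PHId" side), useful for de-identification workflows.
phi = client.detect_phi(Text=note)
for item in phi["Entities"]:
    print(item["Type"], item["Text"])
```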

All good. All fine. Making machine learning capabilities available in a one-off hosting deal — with a vendor many providers already use — can’t be wrong.

Now, let’s look coldly at what Amazon can realistically deliver.

Make no mistake, I understand why people are excited about this announcement. As with Microsoft, Google, Apple and other top tech influencers, Amazon is potentially in a position to change the way things work in the health IT sector. It has all-star brainpower, experience diving into new industries and enough capital to buy a second planet for its headquarters. In other words, it could in theory change the healthcare world.

On the other hand, there’s a reason why even IBM’s Watson Health stumbled when it attempted to solve the data analytics puzzle for oncologists. Remember, we’re talking IBM here, the last bastion of corporate power. Also, bear in mind that other insanely well-capitalized, globally recognized Silicon Valley firms are still biding their time when it comes to this stuff.

Finally, consider that many researchers think NLP is only just beginning to find its place in healthcare, and an uncertain one at that, and that machine learning models are still in their early stages, and you see where I’m headed.

Bottom line, if Google or Microsoft or Epic or Salesforce or Cerner haven’t been able to pull this off yet, I’m skeptical that Amazon has somehow pole-vaulted to the front of the line when it comes to NLP-based mining of medical text. My guess is that this product launch announcement is genuine, but was really issued more as a stake in the ground. Definitely something I would do if I worked there.

Providers Tell KLAS That Existing EMRs Can’t Handle Genomic Medicine

Posted on November 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Providers are still in the early stages of applying genomics to patient care. However, at least among providers that can afford the investment, clinical genomics programs are beginning to become far more common, and as a result, we’re beginning to get a sense of what’s involved.

Apparently, one of those things might be creating a new IT infrastructure which bypasses the provider’s existing EMR to support genomics data management.

KLAS recently spoke with a number of providers about the vendors and technologies they were using to implement precision medicine. Along the way, its researchers gathered information on best practices that other providers can use to roll out their own programs.

In its report, “Precision Medicine Provider Validations 2018,” KLAS researchers assert that while precision medicine tools have become increasingly common in oncology, they can be useful in many other settings as well.

Which vendors they should consider depends on what their organization’s precision medicine objectives are, according to one VP interviewed by the research firm. “Organizations need to consider whether they want to target a specific area or expand the solutions holistically,” the VP said. “They [also] need to consider whether they will have transactional relationships with vendors or strategic partnerships.”

Another provider executive suggests that investing in specialty technology might be a good idea. “Precision medicine should really exist outside of EMRs,” one provider president/CEO told KLAS. “We should just use software that comes organically with precision medicine and then integrated with an EMR later.”

At the same time, however, don’t expect any vendor to offer you everything you need for precision medicine, a CMO advised. “We can’t build a one-size-fits-all solution because it becomes reduced to meaninglessness,” the CMO told KLAS. “A hospital CEO thinks about different things than an oncologist.”

Be prepared for a complicated data sharing and standardization process. “We are trying to standardize the genomics data on many different people in our organization so that we can speak a common language and archive data in a common system,” another CMO noted.

At the same time, though, make sure you gather plenty of clinical data with an eye to the future, suggests one clinical researcher. “There are always new drugs and new targets, and if we can’t test patients for them now, we won’t catch things later,” the researcher said.

Finally, and this will be a big surprise, brace yourself for massive data storage demands. “Every year, I have to go back to our IT group and tell them that I need another 400 terabytes,” one LIS manager told the research firm. “When we are starting to deal with 400 terabytes here and 400 terabytes there, we’re looking at potentially petabytes of storage after a very short period of time.”

If you’re like me, the suggestion that providers need to build a separate infrastructure outside the EMR to create a precision medicine program is pretty surprising, but that seems to be the consensus. Almost three-quarters of providers interviewed by KLAS said they don’t believe their EMR will have a primary role in the future of precision medicine, with many suggesting that the EMR vendor won’t be viable going forward as a result.

I doubt that this will be an issue in the near term, as the barriers to creating a genomics program are high, especially the capital requirements. However, if I were Epic or Cerner, I’d take this warning seriously. While I doubt that every provider will manage their own genomics program directly, precision medicine will be part of all care at some point and is already having an influence on how a growing number of conditions are treated.

Healthcare Interoperability is a Joke

Posted on November 20, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Did you see the big news last month about healthcare interoperability? That’s right, Carequality announced support for FHIR. Next thing you know, we’re going to get an announcement that CommonWell is going to support faxing.

Seriously, healthcare interoperability is a joke.

The reality is that no EHR vendor wants to do interoperability. And it’s not saying anything groundbreaking to say that Carequality and CommonWell are both driven by the EHR vendors. Unfortunately, I see these organizations as almost a smokescreen that allows EHR vendors to not be interoperable while allowing them to say that they’re working on interoperability.

I’d describe current interoperability efforts as a “just enough” approach to interoperability. EHR vendors want to do just enough to appease the call for interoperability by the government and other patient organizations. It’s not a real effort to be interoperable. That’s most EHR vendors. A few of them are even using interoperability as a weapon to keep vendors out and some are looking at interoperability as a new business model.

Just to be clear, I’m not necessarily blaming the EHR vendors. They’re doing what their customers are asking them to do which is their highest priority. Until their customers ask for interoperability, it’s not going to happen. And in many respects, their customers don’t want interoperability. That’s been the real problem with interoperability since the start and it’s why grand visions of interoperability are unlikely to happen. Micro interoperability, which is how I’d describe what’s happening today, will happen and is happening.

If EHR vendors really cared about being interoperable, they’d spend the time to see where interoperability would lower costs, improve care, and provide a better patient experience. That turns out to be a lot of places. Then, they’d figure out how to make that possible and still secure and safe. Instead, they don’t really do this. The EHR vendors just follow whatever industry standard is out there so they can say they’re working on interoperability. Ironically, many experts say that the industry standards aren’t standard and won’t really make a big impact on interoperability.

There are no leaders in healthcare interoperability. There are just followers of the “just enough” crowd.

Let’s just be honest about what’s really possible when it comes to EHR vendors and healthcare interoperability. There are some point-to-point use cases that are really valuable and happening (this feels like what FHIR is doing to me). In a large health system, we’re seeing some progress on interoperability within the organization. We’re starting to see inklings of EHR vendors opening up to third-party providers, but that still has a long way to go. Otherwise, we’re exchanging CCDs, faxes, and lab results.
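
For readers who haven't worked with it, the point-to-point exchange FHIR enables is essentially plain REST over HTTPS. Here is an illustrative sketch in Python; the server URL, patient ID, and token are placeholders rather than a real endpoint, and real deployments layer OAuth scopes, paging, and error handling on top:

```python
# Illustrative only: a point-to-point FHIR read against a hypothetical server.
import requests

BASE_URL = "https://fhir.example-hospital.org/r4"   # hypothetical endpoint
TOKEN = "example-token"                              # obtained out of band

headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/fhir+json"}

# Read a single Patient resource.
patient = requests.get(f"{BASE_URL}/Patient/12345", headers=headers).json()
print(patient.get("name"))

# Search for that patient's lab results as Observation resources.
obs = requests.get(
    f"{BASE_URL}/Observation",
    params={"patient": "12345", "category": "laboratory"},
    headers=headers,
).json()
print(obs.get("total"))
```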

Will we see anything more beyond this from EHR vendors? I’m skeptical. Let me know what you think in the comments or on Twitter with @HealthcareScene.

Transition to Value-Based Payments Top Concern for Long-Term and Post-Acute Care

Posted on November 12, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

On 1 October 2019, CMS will flip from a volume-based reimbursement model for Long-Term & Post-Acute Care Organizations (LTPAC) to a value-based one. This looming transformation was the top concern for most of the 2,000 attendees at PointClickCare’s annual user conference – #PCCSummit18.

PointClickCare makes a cloud-based EHR platform for LTPAC and HomeCare Organizations. According to the company, 60% of all Senior Living and Skilled Nursing Facilities (SNFs) in North America use their platform. Each year, PointClickCare hosts a user conference, called PCCSummit, where customers gather to get a preview of new features and to discuss the industry’s most pressing challenges.

Sweeping LTPAC changes by CMS

The top challenge on the minds of #PCCSummit18 attendees was, by far, the sweeping reimbursement changes being implemented by the Centers for Medicare & Medicaid Services (CMS) on 1 October 2019.  Referred to as the Patient Driven Payment Model (PDPM), it contains three significant changes for SNFs:

  1. A new value-based payment model
  2. Adopting ICD-10
  3. New reporting requirements

“The move to PDPM is going to be a challenge for everyone in the industry,” said Dave Wessinger, COO and Co-Founder of PointClickCare. “I think everyone will agree that moving from a volume-based reimbursement model to a value-based one is ultimately better for healthcare and for patients, but getting there is going to take some work. We are investing millions of dollars in product R&D, implementation resources and training to help make this transition as smooth as possible for our customers – who are all worried about PDPM.”

There were several sessions at #PCCSummit18 dedicated to PDPM. Each session was standing-room only.

A new value-based payment model

PDPM is the first step taken by CMS to shift LTPAC from a volume-based reimbursement model to one that is more value-based.

Currently, SNFs are reimbursed based on the minutes of therapy that patients/residents receive. The daily rate is determined by the type of therapy and SNFs are paid for as long as that therapy is administered. SNFs are required to conduct an assessment at pre-determined intervals to determine if further therapy is needed.

Under PDPM, CMS will base payments to SNFs on patient characteristics (diagnosis and comorbidities) rather than the type and duration of therapy being provided. According to CMS, there are several key advantages of this approach:

  • Removes therapy minutes as the basis for therapy payment (which may have encouraged some SNFs to provide unnecessary therapies to patients)
  • Enhances payment accuracy for nursing services by making nursing payment dependent on a wide range of clinical characteristics
  • Introduces payment adjustments that better reflect changes in resource use over a stay

Under PDPM, each patient/resident will be assigned a case-mix classification that drives the daily reimbursement rate for that individual. This classification is based on the diagnosis, acuity and characteristics of the patient/resident. Unlike the current payment model, the daily reimbursement rate is not uniform. It declines over time. This was done because evidence suggests that most therapies have diminishing returns the longer they are administered – unless the condition of the patient/resident changes.

The following slide illustrates the difference. It was presented by Genice Hornberger, RN, Senior Product Advisor at PointClickCare. The column on the left shows the uniform reimbursement for a 30-day SNF stay under the current RUG-IV payment model. The column in the middle shows what the daily payments would look like under PDPM with accurate documentation.

Notice how the daily rate under PDPM starts off much higher at almost $915/day vs $631/day. This is in recognition of the work required of SNFs when new patients/residents are admitted.

The right-most column is very interesting. It shows the daily reimbursement rate for the case where patient documentation is inaccurate (i.e., missing meds or missing patient conditions). This would result in $1,800 less under PDPM vs the current reimbursement model.
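
To make the mechanics concrete, here is a small back-of-the-envelope sketch of how a 30-day stay total changes when the per-diem steps down over time instead of staying flat. The flat rate comes from the slide described above; the step-down schedule and its rates are made up for illustration and are not CMS's actual PDPM adjustment factors:

```python
# A purely illustrative sketch of how a variable per-diem stay total is
# computed. The step-down schedule below is hypothetical; it does not
# reproduce CMS's actual PDPM adjustment factors or the slide's figures.

def stay_total(rate_schedule, length_of_stay):
    """rate_schedule: list of (last_day_covered, daily_rate) tuples, in order."""
    total = 0.0
    for day in range(1, length_of_stay + 1):
        for last_day, rate in rate_schedule:
            if day <= last_day:
                total += rate
                break
    return total

flat = stay_total([(999, 631.0)], 30)  # uniform RUG-IV-style daily rate
declining = stay_total(
    [(3, 915.0), (20, 660.0), (27, 600.0), (999, 560.0)],  # hypothetical step-down
    30,
)
print(f"Flat 30-day total:      ${flat:,.0f}")
print(f"Declining 30-day total: ${declining:,.0f}")
```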

Adopting ICD-10

One of the goals CMS had for PDPM was to “promote consistency with other Medicare and post-acute payment settings by basing resident classification on objective clinical information while minimizing the role of service provision in determination of payment”.

To achieve this goal, CMS is mandating the adoption of the ICD-10 standard in LTPAC. This will align long-term and post-acute care with their acute-care counterparts and make it easier for CMS to track Medicare patients moving between different parts of the healthcare system.

During the research and development stage of PDPM, CMS found that almost half of SNF claims used generic ICD-9-CM codes as the principal diagnosis for residents in their care, which had limited usefulness in classifying residents. It also made it difficult for CMS to perform detailed analysis of LTPAC data.

Under PDPM, ICD-10 codes will be used to map residents to the clinical categories that represent the primary reason for SNF care; those codes are also used for the resident classification that determines the reimbursement rate.

New Reporting Requirements

Under the current reimbursement mechanism, SNFs are required to file patient/resident assessments with CMS 5 days, 14 days, 30 days, 60 days and 90 days into the stay. For longer stays, only quarterly assessments need to be filed.

Conducting, documenting and electronically transmitting these assessments requires a lot of time and effort by staff. In consultation with industry leaders, CMS is reducing the reporting requirements under PDPM.

Instead of regularly scheduled assessments, SNFs will now only be required to file a report when a patient/resident is admitted, discharged or has a change in condition. CMS expects to save itself $2B over the next 10 years from this reduction in paperwork and calculates that SNFs will save, on average, 183 hours per facility per year.

CMS tools to help transition

To help make the transition to PDPM, CMS has made several guides and online tools freely available to SNFs. One very useful tool is a customized analysis of each SNF’s current reimbursement vs future reimbursement under PDPM.

CMS used historic claims data from each SNF and corresponding acute-care data for patients transferred to SNFs (because only the hospital data had the requisite ICD-10 coding to determine the new patient/resident classification under PDPM) to come up with an estimate of that SNF’s reimbursement under PDPM.

The analysis reveals that most SNFs would be at or above current reimbursement levels. A few therapy-heavy SNFs with non-complex patients will see lower reimbursements.

PointClickCare helping with PDPM transition

Given the importance of PDPM and the worries expressed by its customers, PointClickCare has created additional tools to help in the transition.

The team at PointClickCare smartly realized that the key to PDPM is having accurate documentation for each patient/resident. Any diagnosis, condition change or medication that is not documented will have a negative impact on reimbursements. Sandy Herbert, Senior Director of Product Management at PointClickCare, explains:

“There is a hidden gap that could significantly impact reimbursements that we want to make our customers aware of. Through the CMS online tool, they can see an estimation of their reimbursement under PDPM, but baked into that estimation is an assumption of perfect documentation. Everything about the patient/resident needs to be captured and documented properly in the system – if anything is missed it means less money. However, with the change in classification method and the new reporting requirements, SNFs will have to be much more diligent in enforcing good documentation habits in order to maintain their level of reimbursement.”

At #PCCSummit18, the company unveiled an online PDPM assessment tool that calculates what a customer’s PDPM reimbursement would be based on the actual documentation in the system. In most cases, this amount is below the CMS estimate.

In her presentation, Hornberger showed an example of how significant this gap can be (see slide above). For a typical 30-day stay, PointClickCare found that certain aspects of the record were not coded properly, which would result in a smaller claim being submitted to CMS. Its analysis showed that, on average, a SNF would receive only $17,100 for that 30-day stay versus $18,900 under the current system, and well below the $19,600 that would be possible under PDPM.

PointClickCare has made their assessment tool – PDPM Risk Assessment – freely available to its customers. Their team of consultants are also working with customers to address the gaps that are identified by the free assessment.

“PointClickCare has a history of working well with clients, especially when it comes to data,” said Timothy Carey, Director of Data and Performance Analytics at BaneCare. “Having the right data available to our leadership is critical. It’s what we need to help improve our processes and workflows. As far as I’m concerned, data from the PointClickCare system is like gold. It shows us where things are going wrong and where we can improve.”

Judging from the smiles on the faces of attendees who got a preview of their customized PDPM Risk Assessment at #PCCSummit18, the data is clearly reducing the anxiety around the transition.

Decommissioning Legacy EHRs

Posted on November 5, 2018 | Written By

The following is a guest blog post by Sudhakar Mohanraj, Founder and CEO, Triyam.

Every product has a lifecycle. The lifecycle of Electronic Health Record (EHR) software begins when it is implemented at your facility and ends when it’s no longer in use. When a facility decides to move to a new EHR, it’s natural to focus planning around the new software system.  However, not considering the legacy EHR can leave you wondering what should happen to all of the historical patient financial and medical data. You have many choices. This article will discuss some of the challenges and options that will influence your cost, legal compliance, and stakeholder satisfaction.

Three common mistakes to avoid when moving to a new EHR

  1. Hanging on to the legacy EHR

Some say: “we will worry about shutting down the old system later after the new EHR is up and going.” Taking that path is risky and expensive.

Consider the cost. Until you get all your historical data off the legacy system, you need to pay vendors licensing and support fees. You may infrequently be using the old system, which makes these fees particularly unwarranted.  In addition, you continue to pay your employees to operate and maintain the old system.

To learn more about retiring legacy EHRs, register for this free live webinar. Industry experts will share key lessons and best practices on data management strategies for EHR system replacements. You can also get answers to your questions about any specific requirements.

Some say, “I will stop paying my old vendor. I don’t need any more updates or support.” However, sooner or later, hardware and software will break or become incompatible with newer technology. Older systems are an easy target for hackers and thieves.

Over time, your employees will forget passwords and how to navigate the old system, or they will leave for other jobs. Then, when you, a patient, or your boss needs some report from the old system, you are caught short. Over time, data retained on an old, unsupported, infrequently used system is at increasing risk of being lost, stolen, corrupted, or inaccessible to newer technology.

Bottom line: keeping an old, infrequently used system will needlessly eat up your time and money.

  2. Migrating all historical data from the legacy system to the new EHR

Some facilities are surprised to learn that the new EHR vendor will not convert all the historical data to the new computer system.

The new system is organized differently than the legacy system with different data elements and structures. There is never a one-to-one match on data mapping between the old and new systems.

It is difficult to validate the accuracy and completeness of data you want to import from the old system. The new EHR vendor doesn’t want to risk starting with an inaccurate database.

This is a golden opportunity to start with a clean slate. For example, you can take this time to reorganize, re-categorize, re-word codes, and tables. Now is the time to set up master files properly, and to make the system more efficient.

The new EHR vendor will lobby for you to start with a clean slate and populate the new database with only current patients, current balances, and current information.

  3. Ignoring Legal Compliance Requirements

Federal and state laws require healthcare facilities to retain medical and financial reports for 5 to 15 years and make these reports available to patients and others upon request. Keeping these records will help to avoid penalties, fines, and loss of certifications. Consult your compliance office, accountant, and HIPAA director to know Federal, IRS, and state-specific requirements.

Use this Data retention tool to find the retention requirements for your state.

Why data archival is an excellent choice

What are the best practices to deal with historical data? Data from the old system needs to be organized in a safe, secure place so that the information can be found and made readily available to those who need it in a timely fashion. In other words, it needs to be archived.

An archive is a separate system from your new EHR. It contains all your historical data and reports. When users sign into the archive program, depending on their user rights, they may see all or some of the historical reports. The most common functions of the archive system include:

  • Search and query clinical and financial data for “Continuity of Care.”
  • Download, view, print, and share reports for “Release of Information.”

Archival is a new concept. KLAS Research is creating a new product category for it. Listen to this on-demand webinar from the head of EHR Archive studies at KLAS Research.

In the archive, you can see all patients and their previous charts, medications, treatments, billings, insurance claims, payments, and more.  You will also see the historical vendor, employee, and accounting records.

What type of data goes to the archive? All sorts. You can retain discrete or non-discrete data, structured data (like SQL, XML, or CCDA), or unstructured data that is logically grouped and presented in a human-readable form, such as PDF reports, Excel spreadsheets, CCDs, JPEGs, or MP3 files.
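
To give a concrete, if simplified, picture of what "readily available" means in practice, here is a sketch of the kind of index an archive might keep over those files, using Python's built-in sqlite3. The table and column names are hypothetical; a real archive layers access controls, audit logging, and retention rules on top:

```python
# A minimal sketch of an archive index that keeps historical reports
# searchable after the legacy EHR is gone. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect("archive_index.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS archived_documents (
        doc_id        INTEGER PRIMARY KEY,
        patient_mrn   TEXT NOT NULL,   -- medical record number from the legacy system
        doc_type      TEXT NOT NULL,   -- e.g. 'CCDA', 'PDF report', 'claim', 'lab result'
        source_system TEXT NOT NULL,   -- which legacy EHR the record came from
        service_date  TEXT NOT NULL,   -- ISO-8601 date of service
        storage_path  TEXT NOT NULL    -- where the actual file or extract lives
    )
""")
conn.execute(
    "INSERT INTO archived_documents (patient_mrn, doc_type, source_system, service_date, storage_path) "
    "VALUES (?, ?, ?, ?, ?)",
    ("MRN-000123", "PDF report", "LegacyEHR-A", "2016-04-02",
     "/archive/legacy-a/MRN-000123/visit-2016-04-02.pdf"),
)
conn.commit()

# Release-of-information style lookup: everything archived for one patient.
for row in conn.execute(
    "SELECT doc_type, service_date, storage_path FROM archived_documents WHERE patient_mrn = ?",
    ("MRN-000123",),
):
    print(row)
```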

Mergers and data consolidation

Archival is essential even when there isn’t a transition to a new EHR. During a merger, the new entity frequently wants to consolidate patient financial and clinical data from multiple legacy systems into a common platform. Data archiving may be the best solution for dealing with multiple EMR/EHRs. Archival is less expensive than complex conversion and transformation efforts. Besides lowering costs, it allows users to do research on consolidated data using business intelligence and analytics tools running on one common unified database.

Outsourcing and vendor selection

Outsourcing has become an increasingly popular option for archival solutions for three reasons – cost, experience, and convenience. IT managers are already stretched to the limits of their time, resources, and budget. Outside vendors can save the day by offering these services at a lower cost.

When searching for an archival vendor, consider the following:

  • Experience in extracting data from your legacy systems which are no longer supported
  • Complete turnkey solutions – planning, pilot testing, data conversion, user acceptance, and decommissioning
  • Archival product features and ease of use
  • Great customer references
  • Cost of archiving should be only a fraction of the cost of retaining the legacy system

The number one failure when implementing a new EHR is procrastinating on archiving legacy data. Hopefully, you can use a few of these ideas to maximize the benefits of your historical data, minimize costs, and best serve your user constituents.

About Triyam
Triyam delivers expert solutions in EMR / EHR Data Conversion and Archival.

Triyam’s data conversion services help hospitals and clinics to freely migrate from one EHR vendor to another without losing any historical patient data. Triyam’s EHR archival product, Fovea is a vendor neutral, innovative and intuitive platform to store all your legacy data. Fovea includes a powerful search engine and extensive reporting for Business Intelligence and Analytics. Triyam is a proud sponsor of Healthcare Scene.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which collects data on more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important part in the initial wave of AI application rollouts. The company is a leader in producing high-performance chipsets popular with those who play processor-intensive games, technology it has recently applied to other processor-intensive projects like blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough excess IT talent available to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.

Software Marks Advances at the Connected Health Conference (Part 1 of 2)

Posted on October 29, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The precepts of connected health were laid out years ago, and merely get updated with nuances and technological advances at each year’s Connected Health conference. The ideal of connected health combines matching the insights of analytics with the real-life concerns of patients; monitoring people in everyday settings through devices that communicate back to clinicians and other caregivers; and using automation to free up doctors to better carry out human contact. Pilots and deployments are being carried out successfully in scattered places, while in others connected health languishes while waiting for the slow adoption of value-based payments.

Because I have written at length about the Connected Health conference in 2015, 2016, and 2017, I will focus this article on recent trends I ran into at this year’s conference. Key themes include precertification at the FDA, the state of interoperability (which is poor), and patient engagement.

Exhibition floor at Connected Health conference

Precertification: the status of streamlining approval for medical software

One of the ongoing challenges in the progress of patient involvement and connected health is the approval of software for diagnosis and treatment. Traditionally, the FDA regulated software and hardware together in all devices used in medicine, requiring rigorous demonstrations of safety and efficacy in a manner similar to drugs. This was reasonable until recently, because anything that the doctor gives to the patient needs to be carefully checked. Otherwise, insurers can waste a lot of money on treatments that don’t work, and patients can even be harmed.

But more and more software is offered on generic computers or mobile devices, not specialized medical equipment. And the techniques used to develop the software inherit the “move fast and break things” mentality notoriously popular in Silicon Valley. (The phrase was supposedly a Facebook company motto.) Software can be updated several times a day. Although A/B testing (an interesting parallel to randomized controlled trials) might be employed to see what is popular with users, quality control is done in completely different ways. Modern software tends to rely for safety and quality on unit tests (which make sure individual features work as expected), regression tests (which look for things that no longer work the way they should), continuous integration (which forces testing to run each time a change is submitted to the central repository), and a battery of other techniques that bear such names as static testing, dynamic testing, and fuzz testing. Security testing is yet another source of reliability, using techniques such as penetration testing that may be automated or manual. (Medical devices, which are notoriously insecure, might benefit from an updated development model.)
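
To make that concrete, here is a toy example of the kind of automated check described above, written as a Python unit test around a hypothetical dosing helper. The point is not the function itself but that every future change gets run against tests like these by the continuous integration system rather than being discovered by a user:

```python
# A toy unit-test example. weight_based_dose is a hypothetical helper,
# invented here purely to illustrate the technique.
import unittest

def weight_based_dose(weight_kg: float, mg_per_kg: float, max_mg: float) -> float:
    """Return a weight-based dose in mg, capped at a maximum."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return min(weight_kg * mg_per_kg, max_mg)

class WeightBasedDoseTest(unittest.TestCase):
    def test_typical_weight(self):
        self.assertEqual(weight_based_dose(20, 10, 500), 200)

    def test_dose_is_capped(self):
        self.assertEqual(weight_based_dose(80, 10, 500), 500)

    def test_rejects_nonpositive_weight(self):
        with self.assertRaises(ValueError):
            weight_based_dose(0, 10, 500)

if __name__ == "__main__":
    unittest.main()
```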

The FDA has realized that reliable software can be developed within the Silicon Valley model, so long as rigor and integrity are respected. Thus, it has started a Pre-Cert Pilot Program that works with nine brave vendors to find guidelines the FDA can apply in the future to other software developers.

Representatives of four vendors reported at the Connected Health conference that the pilot is going quite well, with none of the contentious and adversarial atmosphere that characterizes interactions between the FDA and most device manufacturers. Every step of the software process is available for discussion and checking, and the inquiries go quite deep. All participants are acutely aware of the risk, cited by critics of the program, that it will end up giving vendors too much leeway and leaving the public open to harm. The participants are committed to closing loopholes and making sure everyone can trust the resulting guidelines.

The critical importance of open source software became clear in the report of the single open source vendor participating in the pilot: Tidepool. Because it is open source, according to CEO Howard Look, Tidepool was willing to show its code as well as its software development practices to independent experts using multiple assessment methods, including a “peer appraisal” by fellow precert participants Verily and Pear Therapeutics. One other test appraisal (CMMI, using external auditors) was done by both Tidepool and Johnson & Johnson; no other participants did a test appraisal. Thus, if the FDA comes out with new guidelines that stimulate a tremendous development of new software for medical use, we can thank open source.

Making devices first-class players in health care

Several exhibitors at the conference were consulting firms who provide specific services to start-ups and other vendors trying to bring products to market. I asked a couple of these consultants what they saw as the major problems their clients face. Marcus Fontaine, president of Impresiv Health, said their biggest problem is the availability of data, particularly because of a lack of interoperable data exchange. I wanted to exclaim, “Still?”

Joseph Kvedar, MD, who chairs the Connected Health conference, spoke of a new mobile app developed by his organization, Partners Connected Health, to bring device data into their EHR. This greatly improves the collection of data and guarantees accuracy, because patients no longer have to manually enter vital signs or other information. In addition to serving Partners in improving patient care, the data can be used for research and public health. In developing this app, Partners depended heavily for interoperable data exchange on work by Validic, the most prominent company in the device interoperability space, and one that I have profiled and whose evolution I have followed.

Ideally, each device could communicate directly with the EHR. Why would Partners Connected Health invest heavily in creating a special app as an intermediary? Kvedar cited several reasons. First, each device currently offers its own app as a user interface, and users with multiple devices get confused and annoyed by the proliferation of apps. Second, many devices are not designed to communicate cleanly with EHRs. Finally, the way networks are set up, communicating would require a separate cellular connection and SIM card for each device, raising costs.

A similar effort is pursued by Indie Health, trying to solve the problem of data access by making it easy to create Bluetooth connections between devices and mobile phones using a variety of Bluetooth, IEEE, Continua, and other standards.

The CEO of Validic, Drew Schiller, spoke on another panel about maximizing the value of patient-generated data. He pointed out that Validic, as an intermediary for a huge number of devices and health care providers, possesses a correspondingly huge data set on how patients are using the devices, and in particular when they stop using the devices. I assume that Validic does not preserve the data generated by the devices, such as blood pressure or steps taken–at least, Schiller did not say they have that data, and it would be intrusive to collect it. However, the metadata they do collect can be very useful in designing interactions with patients. He also talked about the value of what he dubs “invisible health care,” where behavior change and other constructive uses of data can flow easily from the data.

Barry Reinhold, president and CTO of Lamprey Networks, was manning the Continua booth when I came by. Continua defines standards for devices used in the home, in nursing facilities, and in other places outside the hospital. This effort should be open source, supported by fees from all affected stakeholders (hospitals, device manufacturers, etc.). But open source is spurned by the health care field, so Continua does the work as a private company. Reinhold told me that device manufacturers rarely contract with Continua, which I treat as a sign that device manufacturers value data silos as a business model. Instead, Continua contracts come from the institutions that desperately need access to the data, such as nursing facilities. Continua does the best it can to exploit existing standards, including the “continuing data” profile from FHIR.

Other speakers at the conference, including Andrew Hayek, CEO of OptumHealth, confirmed Reinhold’s observation that interoperability still lags among devices and EHRs. And Schiller of Validic admitted that in order to get data from some devices into a health system, the patient has to take a photo of the device’s screen. Validic not only developed an app to process the photo, but patented it–a somewhat odd indication that they consider it a major contribution to health care.

Tasha van Es and Claire Huber of Redox, a company focused on healthcare interoperability and data integration, said that they are eager to work with FHIR, and that it’s a major part of their platform, but they think it has to develop more before being ready for widespread use. This made me worry about recent calls by health IT specialists for the ONC, CMS, and FDA to make FHIR a requirement.

It was a pleasure to reconnect at the conference with goinvo, which creates open source health care software on a contract basis, but offers much of it under a free license.

A non-profit named Xcertia also works on standards in health care. Backed by the American Medical Association, American Heart Association, DHX Group, and HIMSS, they focus on security, privacy, and usability. Although they don’t take on certification, they design their written standards so that other organizations can offer certification, and a law considered in California would mandate the use of their standards. The guidelines have just been released for public comment.

The second section of this article covers patient engagement and other topics of interest that turned up at the conference.

Will UnitedHealth’s New Personal Health Record Make An Impact?

Posted on October 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Though the idea of a personal health record was a hot thing for a while, it didn’t become the fixture of the healthcare market that pundits had predicted. In fact, as many readers will recall, even deep pockets like Google and Microsoft couldn’t get their users to sign on to their PHRs en masse.

One of the main reasons the PHR model didn’t take is that people simply didn’t want to use them. In fact, at least at the time, the PHR was almost entirely a solution in search of a problem. After all, if a health data power user and patient advocate like myself didn’t want one, what hope did PHR backers have of interesting your average Joe Blow in aggregating their health data online?

Over time, however, the personal health data landscape has changed, with patient records becoming a bit more portable. While consumers still aren’t beating down the doors to get their own PHR, those who are interested in pulling together their medical records electronically have better access to their history.

Not only that, wearables makers like Apple and Fitbit are sweetening the pot, primarily by helping people pull self-generated data into their health record. Arguably, patient-generated data may not be as valuable as traditional records just yet, but consumers are likely to find it more interesting than the jargon-laden text found in provider records.

Given recent developments like these, I wasn’t entirely surprised to learn that UnitedHealth Group is picking up the PHR torch. According to an article in MedCity News, the giant payer plans to launch what sounds like an updated PHR platform next year for its 50 million benefit plan members.

Apparently, on an earnings call last week UnitedHealth CEO Dave Wichmann said that the company will launch a “fully integrated and fully portable individual health record” in 2019. Notably, this is not just a data repository, but rather an interactive tool that “delivers personalized next-best health actions to people and their caregivers.”

The new health record will be based on UnitedHealth’s Rally health and wellness platform, which the insurer picked up when it acquired Audax Health in 2014. The platform, which has 20 million registered users, works to influence members to perform healthy behaviors in exchange for incentive dollars.

Over time, Wichmann said, UHG intends to build Rally into a platform which collects and distributes “deeply personalized” health information to individual members, MedCity reported. The idea behind this effort is to highlight gaps in care and help patients assess the care that they get.  Wichmann told earnings call listeners that the platform data will be packaged and presented to clinicians in a form similar to that used by existing EHRs.

UHG’s plans here are certainly worth keeping an eye on over the next year or two. I have no doubt that the nation’s largest commercial payer has some idea of how to format data and make it digestible by systems like Cerner and Epic.

But while patients have become a bit more familiar with the benefits of having their health data on hand, we’re not exactly seeing consumers stampede the providers demanding their own health record either, and I’m far from convinced that this effort will win new converts.

My skepticism comes partly from first-hand experience. As a recent UnitedHealth beneficiary, I’ve used the Rally application, and I didn’t find it all that motivating. Honestly, I doubt any online platform will make much of an impact on patient health on its own, as the reasons for many health issues are multifactorial and can’t be resolved by handing one of us a few Rally bucks.

Personal gripes aside, though, the bigger question remains whether consumers think they’ll get something valuable out of using the new UHG tool. As always, you can never count on them coming just because you built it.

AMA Releases Great Guide To Digital Health Implementation

Posted on October 25, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

In the past, I’ve been pretty hard on the AMA when it comes to digital health. Last year I gave the organization a particularly hard time when it rolled out its Physician Innovation Network platform, which is designed to help physicians network directly with health tech firms, as it seemed to be breaking little to no ground.

However, to be fair, the AMA has been a relatively quiet but solid presence in health IT for quite some time. Its health IT efforts include cofounding Health2047, which brings together doctors with established health IT companies to help those companies launch services and products; serving as one of four organizations behind mHealth app standards venture Xcertia; and managing a student-run biotechnology incubator in collaboration with Sling Health.

But what it hadn’t done, at least to date, was offer physicians any hands-on guidance on using emerging health IT. Now, at long last, the AMA has taken the plunge, releasing a guide focused on helping physicians roll out digital health technology in their practices. At least this time around, I have to give the organization a high five.

The new guide takes a lifecycle perspective, helping practices work through the digital health implementation process from preparations to rollout to gathering data on the impact of the new technology. In other words, it lays out the process as a feedback loop rather than a discrete event in time, which is smart. And its approach to explaining each step is concise and clean.

One section identifies six straightforward steps for choosing a digital health technology, including identifying a need, defining success early on in the process, making the case for political and financial buy-in, forming the team, evaluating the vendor and executing the vendor contract.

Along the way, it makes the important but often-neglected point that the search should begin by looking at the practice’s challenges, including inefficiencies, staff pain points or patient health and satisfaction problems. “The focus on need will help you avoid the temptation to experiment with new technologies that ultimately will [not] result in tangible improvements,” the guide notes.

Another offers advice on tackling more immediate implementation issues, including steps like designing workflows, preparing the care team and partnering with the patient. This section of the report differs from many of its peers by offering great advice on building workflow around remote patient monitoring-specific requirements, including handling device management, overseeing patient enrollment and interactions, and assuring that coding and billing for remote patient management activities is correct and properly documented.

The guide also walks practices through the stages of final implementation, including the nature of the rollout itself, evaluating the success of the project and scaling up as appropriate. I was particularly impressed by its section on scaling up, given that most of the advice one sees on this subject is generally aimed at giant enterprises rather than typically smaller medical practices. In other words, it’s not that the section said anything astonishing, but rather that it existed at all.

All told, it’s great to see the AMA flexing some of the knowledge it’s always had, particularly given that the report is available at no cost to anyone. Let’s hope to see more of this in the future.

Rolling Over Mountains – An Interview with Niko Skievaski, President of Redox

Posted on October 16, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

Over the past year I have been following the success of Redox and I have read many articles about the entrepreneurial journey of their President and Co-Founder, Niko Skievaski. I recently had the chance to sit down with him at the MGMA18 conference in Boston.

Rather than revisit the same questions that have been covered in dozens of other articles, I wanted to go in a different direction. I wanted to learn more about Skievaski-the-person rather than Skievaski-the-entrepreneur, and I wanted to hear Skievaski’s opinion on the state of healthcare as an ecosystem.

The latter is something that we have been investigating here at Healthcare Scene. For more details, see John Lynn’s recent post about MEDITECH’s app development environment (Greenfield) and my article exploring whether EHR companies are difficult to work with.

Skievaski and I had a wide-ranging conversation. I hope you enjoy it.

You and I met briefly at the Redox party at HIMSS18 earlier this year. I just want to thank you for your hospitality.

You’re welcome. We love our taco parties at Redox. I’m glad you enjoyed the fiesta.

I understand that you recently moved from Madison, WI to Boulder, Colorado. Why the move?

I lived in Madison for 10 years. I was working for Epic during that time so it made sense to be there. But I recently decided that I needed a few more mountains in my life so I moved to Boulder.

All through college I raced mountain bikes and I wanted to get back to that. Madison does have a few rolling hills which are fun to ride down, but there’s no comparison to biking down a mountain. So I moved to Boulder for the mountain biking.

You’re from Canada right? [Yes] I was up in British Columbia for two months in the summer last year just mountain biking the trails up there. That was my first real experience being in Canada for an extended period of time. It was fun. You guys are really chill up there in Vancouver.

There are many players in the data integration space. Some have been in the business for decades. Why has Redox succeeded in capturing the buzz while others haven’t?

We do things fundamentally differently than existing vendors in the integration space.

In the status quo, you implement an EHR and you need upwards of 400 interfaces to connect it to various other systems in your hospital. So you go out and hire 5-20 interface analysts to sit around all day and code the interfaces you need. You do that a few times, like we did at Epic, and you realize that you are building the same interface over and over again for different health systems. It literally is the same interface.

Redox is based on the premise that you only should have to build the interface once for all healthcare systems. Once it’s built, others can leverage that work too. For example, we connect Brigham and Women’s ADT feed to Redox. We mapped it. We know where all the fields are. And we’ve done the same with hundreds of other health systems. So if there is any reason that Brigham wants to share their info with any of those other health systems we can facilitate it very easily.
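
To picture what that "map it once" approach produces, here is an invented example of a normalized admit event, written as a Python dictionary. The field names are mine, not Redox's actual data model; the point is simply that every connected application sees the same shape regardless of which EHR or HL7 dialect emitted the original ADT message:

```python
# An illustrative, hypothetical normalized ADT "admit" message.
# Field names are invented for the example and are not Redox's schema.
normalized_admit_event = {
    "meta": {
        "event_type": "patient.admitted",
        "source": "example-health-system-adt-feed",  # mapped once per health system
        "received_at": "2018-10-16T14:32:00Z",
    },
    "patient": {
        "identifiers": [{"type": "MRN", "value": "000123"}],
        "name": {"given": "Jane", "family": "Doe"},
        "birth_date": "1955-07-09",
    },
    "visit": {
        "visit_number": "V-98765",
        "location": {"facility": "Main Campus", "unit": "4 East", "bed": "12A"},
        "attending_provider": {"npi": "1234567890", "name": "Dr. Example"},
    },
}

# Any subscribed application consumes the same structure, no HL7 parsing required.
print(normalized_admit_event["patient"]["name"], normalized_admit_event["visit"]["location"])
```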

Legacy players didn’t grow up in the cloud so they don’t think like we do. They come from a world of on-premise integration and at a time when healthcare organizations wanted to do all the interface work themselves. It’s a different world now.

I guess you can say that we’re getting the attention because we are solving the problem so differently than everyone else.

One of the interesting things about Redox is that you don’t sell to healthcare organizations. Instead you focus exclusively on HealthIT vendors. Why is that?

We started by working with HealthIT startups that knew how to build in the cloud but didn’t know anything about HL7 and didn’t want to. Yet these companies needed to connect to their customers’ EHR systems.

Without that integration, healthcare organizations wouldn’t buy these amazing cloud apps because of the lack of easy connectivity to their existing systems. In that equation, the incentive lies with the HealthIT company. They are the ones that want to solve the issue of connectivity more than the healthcare organization does. So we target companies that need this help and we go to their customers, get connected to the data and make it easy for the new company to focus on what they do best – which isn’t data integration.

The first project we do with a health system is very much like a standard integration project. The second project is where things get exciting because we use the exact same interface we built the first time. There’s really no work to be done by the organization. That’s how we scale.

Is there an ideal type of HealthIT company that Redox likes to work with?

With certain vendors who have the right multi-tenant architecture, like PointClickCare, we can just connect with them once and they can then provision to their customers with a flip of a switch. Any PointClickCare location that wants integration, they can just click and make it happen. Together we make it very easy for a PointClickCare customer to connect with HIEs and the healthcare organizations that they work with.

Basically any HealthIT vendor that is truly cloud-based and that has embraced the concept of having a single platform for everyone is an ideal fit for Redox. Of course, we’re willing to talk to anyone to try and find a solution, but if you are a cloud-based HealthIT vendor we should really be talking.

Can you give me an example of an advantage Redox enjoys because you are cloud-based?

By being in the cloud we essentially become the cloud interface for health systems to connect to cloud apps. Vendors come to us because we make it easy for them to get the data they need. Healthcare organizations push cloud vendors they want to work with to us because they won’t have to do any work to connect that new app if that vendor signs on with Redox.

Where things get really interesting, and exciting for Redox, is when we can use our cloud platform to facilitate conversations between vendors and their common customers without the need to go all the way back to that customer’s EHR as the focal point of integration.

For example, say there is a cloud-based scheduling app that allows patients to see and book appointments online. Let’s say they are a Redox customer. Now let’s say there is a telemedicine app that allows healthcare organizations to offer telehealth visits and it reads/writes appointment data directly into the organization’s EHR. Say this telemedicine company is a Redox customer too. So if the healthcare org wants to offer Telemedicine appointments through that scheduling app, the two companies can just integrate through Redox rather than use the EHR as the point of integration because we have all the necessary information running through our platform. This would speed up the transaction and make the patient experience more seamless.

This level of integration is just not possible without being in the cloud.

One of the topics we have explored recently at Healthcare Scene is how difficult it is (or isn’t) to work with EHR companies like Epic, Cerner and Allscripts. What are your thoughts on this? Are EHR companies hard to work with?

I would say, in general, EHR companies get a bad rap. I worked at Epic and I have to say that being inside Epic you don’t realize that people outside think you are difficult to work with. We worked hard to give our customers good service. Epic supports their customers, which are health systems. If a system wants to integrate with an application, then Epic people are more than happy to make it happen. They will put together a project team to support that initiative.

I think that as long as the health system is driving the conversation, EHR companies can be easy to work with.

The challenging part is when there is no customer in between. Say you are a HealthIT vendor and you want to go strike up a deal with an EHR company, like Epic. You have to realize that it’s nearly impossible for that EHR company to assess you as HealthIT vendor. They can’t tell if you are a good vendor or a bad one. If you are an established player or someone with an idea on the back of a napkin. The only way they can tell is if they go ask their customers – the health systems. Because of this, their traditional response has been: “Yes, happy to work with you, but we need to have one of our customers on board to prove this will work.” This can be perceived as being difficult to work with.

When we started Redox we didn’t go immediately knocking on Epic’s door and asking our friends to partner with us. Instead we went out and found a mutual customer to work with so that we would have a proof point when we did approach them.

I actually think it is easier to work with large EHR companies versus smaller ones. The larger companies have more invested in each of their customers and are more apt to work on projects that their customers want to do. Smaller EHR companies are constrained by resources and often don’t have the infrastructure to support integration projects in a timely manner. The good news is that things are changing. We’re seeing a lot more of the small EHR companies come out with developer programs, APIs and partner exchanges. I think they understand the need for their systems to be open.

Is the lack of interoperability a technological issue or is it simply an unwillingness to collaborate?

Neither. It’s a business model problem.

There is no business model that drives healthcare organizations to share their data. No one bats an eye about the lack of interoperability in the consumer world. Walmart doesn’t share its customer data with Target even though many people buy from both retailers. If they did share data, they would just be stealing each other’s customers. Healthcare organizations are in competition with each other, so they aren’t really incentivized to share data with each other, but give them a useful app in between and all of a sudden they will open up their data.

Interoperability is the right thing to do, but it’s a hard thing to do.

What do you wish you could do with an EHR company that you cannot do today?

The user interfaces (UIs) of EHRs are locked down. I wish EHR companies were more open to changing workflows or adding buttons to their UIs to make things more seamless.

I totally understand why they don’t allow it. The workflow in an EHR has an impact on patient safety as well as on outcomes, so you wouldn’t want just any vendor to be able to make UI changes on a whim. But it would be great if there was a way to do something with the UI to make it easier for the end user.

For example, if you are doing something in the workflow, it would be fantastic if you could add a button to the UI that launched a 3rd party app from within the EHR. Say a clinician is doing a chart review and they want to be able to see the latest data from a remote patient monitoring tool. Imagine if that clinician could click a button and launch the actual monitoring app rather than that app having to ship its data to the EHR and have it stored/rendered in a poor format – like a table of numbers or a rudimentary chart. Why not let the native app show the data in all its glory using an interface designed specifically for it?

What’s next for Redox?

We want to push the healthcare industry to a point where we don’t even think about integration anymore. We want to see an end to integration projects. Think about all the time and resources that would be saved if you don’t have to use a custom interface each time. If we can do that we can drive down the cost of healthcare for everyone. To do that we just have to keep growing the nodes on our network and be a good partner to everyone.

 

This may sound like a tall order, but maybe not for someone who rolls over mountains on a bike for fun.

[Update: Niko Skievaski’s title was incorrectly reported as CEO. Skievaski is Redox’s President and Co-Founder.]