
Has Amazon Brought Something New To Healthcare Data Analytics?

Posted on November 29, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Amazon’s announcement that it was getting into healthcare data analytics didn’t come as a major surprise. It was just a matter of time.

After all, the retail giant has been making noises about its health IT ambitions for a while now, and its super-sneaky 1492 team’s healthcare feints have become common knowledge.

Now, news has broken that its massive hosting division, Amazon Web Services, is offering its Comprehend Medical platform to the healthcare world. And at the risk of being a bit too flip, my reaction is “so?” I think we should all take a breath before we look at this in apocalyptic terms.

First, what does Amazon say we’re looking at here?

Like similar products targeting niches like travel booking and supply-chain management, the company reports, Comprehend Medical uses natural language processing and machine learning to pull together relevant information from unstructured text.

Amazon says Comprehend Medical can pull needed information from physician notes, patient health records and clinical trial reports, tapping into data on patient conditions and medication dosage, strength and frequency.

The e-retailer says that users can access the platform through a straightforward API call, accessing Amazon’s machine learning expertise without having to do their own development or train models of their own. Use cases it suggests include medical cohort analysis, clinical decision support and improving medical coding to tighten up revenue cycle management.

Comprehend Medical customers will be charged monthly based on the amount of text they process: $0.01 per 100-character unit for the NERe API, which extracts entities, entity relationships, entity traits and PHI, or $0.0014 per unit for the PHId API, which only identifies PHI for data protection.
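As a rough sanity check on that pricing, here's a minimal sketch of how a provider might estimate its monthly bill. The rates are the ones quoted above; the rounding behavior (rounding the monthly total up to whole 100-character units) is a simplifying assumption rather than AWS's exact billing rule, which may round per request or impose minimums.

```python
import math

# Per-unit rates (USD per 100-character unit) as quoted in this article;
# actual AWS pricing can change, so treat these as illustrative assumptions.
RATES = {"NERe": 0.01, "PHId": 0.0014}

def estimate_monthly_cost(total_chars, api="NERe"):
    """Estimate a monthly Comprehend Medical charge for a volume of text.

    Billing is per 100-character unit; here we approximate by rounding
    the monthly character total up to a whole number of units.
    """
    units = math.ceil(total_chars / 100)
    return units * RATES[api]

# Example: 5 million characters of physician notes per month
print(estimate_monthly_cost(5_000_000, "NERe"))  # full entity extraction
print(estimate_monthly_cost(5_000_000, "PHId"))  # PHI detection only
```

At these rates the PHI-only API works out to roughly a seventh of the cost of full entity extraction, which is presumably why Amazon splits them.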

All good. All fine. Making machine learning capabilities available in a one-off hosting deal — with a vendor many providers already use — can’t be wrong.

Now, let’s look coldly at what Amazon can realistically deliver.

Make no mistake, I understand why people are excited about this announcement. As with Microsoft, Google, Apple and other top tech influencers, Amazon is potentially in a position to change the way things work in the health IT sector. It has all-star brainpower, deep experience diving into new industries and enough capital to buy a second planet for its headquarters. In other words, it could in theory change the healthcare world.

On the other hand, there’s a reason why even IBM’s Watson Health stumbled when it attempted to solve the data analytics puzzle for oncologists. Remember, we’re talking IBM here, the last bastion of corporate power. Also, bear in mind that other insanely well-capitalized, globally-recognized Silicon Valley firms are still biding their time when it comes to this stuff.

Finally, consider that many researchers think NLP is only just beginning to find its place in healthcare, and an uncertain one at that, and that machine learning models are still in their early stages, and you see where I’m headed.

Bottom line, if Google or Microsoft or Epic or Salesforce or Cerner haven’t been able to pull this off yet, I’m skeptical that Amazon has somehow pole-vaulted to the front of the line when it comes to NLP-based mining of medical text. My guess is that this product launch announcement is genuine, but was really issued more as a stake in the ground. Definitely something I would do if I worked there.

Providers Tell KLAS That Existing EMRs Can’t Handle Genomic Medicine

Posted on November 26, 2018 | Written By


Providers are still in the early stages of applying genomics to patient care. However, at least among providers that can afford the investment, clinical genomics programs are beginning to become far more common, and as a result, we’re beginning to get a sense of what’s involved.

Apparently, one of those things might be creating a new IT infrastructure which bypasses the provider’s existing EMR to support genomics data management.

KLAS recently spoke with a number of providers about the vendors and technologies they were using to implement precision medicine. Along the way, it gathered information on best practices that other providers can use to roll out their own programs.

In its report, “Precision Medicine Provider Validations 2018,” KLAS researchers assert that while precision medicine tools have become increasingly common in oncology settings, they can be useful in many other settings.

Which vendors they should consider depends on what their organization’s precision medicine objectives are, according to one VP interviewed by the research firm. “Organizations need to consider whether they want to target a specific area or expand the solutions holistically,” the VP said. “They [also] need to consider whether they will have transactional relationships with vendors or strategic partnerships.”

Another provider executive suggests that investing in specialty technology might be a good idea. “Precision medicine should really exist outside of EMRs,” one provider president/CEO told KLAS. “We should just use software that comes organically with precision medicine and then integrated with an EMR later.”

At the same time, however, don’t expect any vendor to offer you everything you need for precision medicine, a CMO advised. “We can’t build a one-size-fits-all solution because it becomes reduced to meaninglessness,” the CMO told KLAS. “A hospital CEO thinks about different things than an oncologist.”

Be prepared for a complicated data sharing and standardization process. “We are trying to standardize the genomics data on many different people in our organization so that we can speak a common language and archive data in a common system,” another CMO noted.

At the same time, though, make sure you gather plenty of clinical data with an eye to the future, suggests one clinical researcher. “There are always new drugs and new targets, and if we can’t test patients for them now, we won’t catch things later,” the researcher said.

Finally, and this will be a big surprise, brace yourself for massive data storage demands. “Every year, I have to go back to our IT group and tell them that I need another 400 terabytes,” one LIS manager told the research firm. “When we are starting to deal with 400 terabytes here and 400 terabytes there, we’re looking at potentially petabytes of storage after a very short period of time.”

If you’re like me, the suggestion that providers need to build a separate infrastructure outside the EMR to create a precision medicine program is pretty surprising, but the consensus seems to be that this is the case. Almost three-quarters of providers interviewed by KLAS said they don’t believe their EMR will have a primary role in the future of precision medicine, with many suggesting that their EMR vendor won’t be viable going forward as a result.

I doubt that this will be an issue in the near term, as the barriers to creating a genomics program are high, especially the capital requirements. However, if I were Epic or Cerner, I’d take this warning seriously. While I doubt that every provider will manage their own genomics program directly, precision medicine will be part of all care at some point and is already having an influence on how a growing number of conditions are treated.

Healthcare Interoperability is a Joke

Posted on November 20, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Did you see the big news last month about healthcare interoperability? That’s right, Carequality announced support for FHIR. Next thing you know, we’re going to get an announcement that CommonWell is going to support faxing.

Seriously, healthcare interoperability is a joke.

The reality is that no EHR vendor wants to do interoperability. And it’s not saying anything groundbreaking to say that Carequality and CommonWell are both driven by the EHR vendors. Unfortunately, I see these organizations as almost a smokescreen that allows EHR vendors to not be interoperable while allowing them to say that they’re working on interoperability.

I’d describe current interoperability efforts as a “just enough” approach to interoperability. EHR vendors want to do just enough to appease the call for interoperability by the government and other patient organizations. It’s not a real effort to be interoperable. That’s most EHR vendors. A few of them are even using interoperability as a weapon to keep vendors out and some are looking at interoperability as a new business model.

Just to be clear, I’m not necessarily blaming the EHR vendors. They’re doing what their customers are asking them to do which is their highest priority. Until their customers ask for interoperability, it’s not going to happen. And in many respects, their customers don’t want interoperability. That’s been the real problem with interoperability since the start and it’s why grand visions of interoperability are unlikely to happen. Micro interoperability, which is how I’d describe what’s happening today, will happen and is happening.

If EHR vendors really cared about being interoperable, they’d spend the time to see where interoperability would lower costs, improve care, and provide a better patient experience. That turns out to be a lot of places. Then, they’d figure out how to make that possible and still secure and safe. Instead, they don’t really do this. The EHR vendors just follow whatever industry standard is out there so they can say they’re working on interoperability. Ironically, many experts say that the industry standards aren’t standard and won’t really make a big impact on interoperability.

There are no leaders in healthcare interoperability. There are just followers of the “just enough” crowd.

Let’s just be honest about what’s really possible when it comes to EHR vendors and healthcare interoperability. There are some point-to-point use cases that are really valuable and happening (this feels like what FHIR is doing to me). In large health systems, we’re seeing some progress on interoperability within the organization. We’re starting to see inklings of EHR vendors opening up to third-party providers, but that still has a long way to go. Otherwise, we’re exchanging CCDs, faxes, and lab results.

Will we see anything more beyond this from EHR vendors? I’m skeptical. Let me know what you think in the comments or on Twitter with @HealthcareScene.

Decommissioning Legacy EHRs

Posted on November 5, 2018 | Written By

The following is a guest blog post by Sudhakar Mohanraj, Founder and CEO, Triyam.

Every product has a lifecycle. The lifecycle of Electronic Health Record (EHR) software begins when it is implemented at your facility and ends when it’s no longer in use. When a facility decides to move to a new EHR, it’s natural to focus planning around the new software system.  However, not considering the legacy EHR can leave you wondering what should happen to all of the historical patient financial and medical data. You have many choices. This article will discuss some of the challenges and options that will influence your cost, legal compliance, and stakeholder satisfaction.

Three common mistakes to avoid when moving to a new EHR

  1. Hanging on to the legacy EHR

Some say: “we will worry about shutting down the old system later after the new EHR is up and going.” Taking that path is risky and expensive.

Consider the cost. Until you get all your historical data off the legacy system, you need to pay vendor licensing and support fees. You may be using the old system only infrequently, which makes these fees particularly hard to justify. In addition, you continue to pay your employees to operate and maintain the old system.

To learn more about retiring legacy EHRs, register for this free live webinar. Industry experts will share key lessons and best practices on data management strategies for EHR system replacements. You can also get answers to your questions about any specific requirements.

Some say, “I will stop paying my old vendor. I don’t need any more updates or support.” However, sooner or later, hardware and software will break or become incompatible with newer technology. Older systems are an easy target for hackers and thieves.

Over time, your employees will forget passwords and how to navigate the old system, or they will leave for other jobs. Then, when you, a patient, or your boss needs a report from the old system, you are caught short. Data retained on an old, unsupported, infrequently used system faces a growing risk of being lost, stolen, corrupted, or inaccessible to newer technology.

Bottom line: keeping an old, infrequently used system will needlessly eat up your time and money.

  2. Migrating all historical data from the legacy system to the new EHR

Some facilities are surprised to learn that the new EHR vendor will not convert all the historical data to the new computer system.

The new system is organized differently than the legacy system with different data elements and structures. There is never a one-to-one match on data mapping between the old and new systems.

It is difficult to validate the accuracy and completeness of data you want to import from the old system. The new EHR vendor doesn’t want to risk starting with an inaccurate database.

This is a golden opportunity to start with a clean slate. For example, you can take this time to reorganize, re-categorize, re-word codes, and tables. Now is the time to set up master files properly, and to make the system more efficient.

The new EHR vendor will lobby for you to start with a clean slate and populate the new database with only current patients, current balances, and current information.

  3. Ignoring Legal Compliance Requirements

Federal and state laws require healthcare facilities to retain medical and financial reports for 5 to 15 years and make these reports available to patients and others upon request. Keeping these records will help to avoid penalties, fines, and loss of certifications. Consult your compliance office, accountant, and HIPAA director to know Federal, IRS, and state-specific requirements.

Use this data retention tool to find the retention requirements for your state.

Why data archival is an excellent choice

What are the best practices to deal with historical data? Data from the old system needs to be organized in a safe, secure place so that the information can be found and made readily available to those who need it in a timely fashion. In other words, it needs to be archived.

An archive is a separate system from your new EHR. It contains all your historical data and reports. When users sign into the archive program, depending on their user rights, they may see all or some of the historical reports. The most common functions of the archive system include:

  • Search and query clinical and financial data for “Continuity of Care.”
  • Download, view, print, and share reports for “Release of Information.”

Archival is a new concept; KLAS Research is creating a new product category for it. Listen to this on-demand webinar from the head of EHR archive studies at KLAS Research.

In the archive, you can see all patients and their previous charts, medications, treatments, billings, insurance claims, payments, and more.  You will also see the historical vendor, employee, and accounting records.

What type of data goes to the archive? All sorts. You can retain discrete data or non-discrete data, structured data (like SQL, XML, CCDA), or unstructured data that is logically grouped and presented in a human-readable form like PDF reports, Excel spreadsheets, CCD, JPEG, or MP3 files.

Mergers and data consolidation

Archival is essential even when there isn’t a transition to a new EHR. During a merger, the new entity frequently wants to consolidate patient financial and clinical data from multiple legacy systems onto a common platform. Data archiving may be the best solution for dealing with multiple EMRs/EHRs. Archival is less expensive than complex conversion and transformation efforts, and beyond the lower cost, it lets users run business intelligence and analytics tools against one common, unified database.

Outsourcing and vendor selection

Outsourcing has become an increasingly popular option for archival solutions for three reasons: cost, experience, and convenience. IT managers are already stretched to the limits of their time, resources, and budget. Outside vendors can save the day by offering these services at lower cost.

When searching for an archival vendor, consider the following:

  • Experience in extracting data from your legacy systems which are no longer supported
  • Complete turnkey solutions – planning, pilot testing, data conversion, user acceptance, and decommissioning
  • Archival product features and ease of use
  • Great customer references
  • Cost of archiving that is only a fraction of the cost of retaining the legacy system

The number one failure when implementing a new EHR is procrastinating on archiving legacy data. Hopefully, you can use a few of these ideas to maximize the benefits of your historical data, minimize costs, and best serve your user constituents.

About Triyam
Triyam delivers expert solutions in EMR / EHR Data Conversion and Archival.

Triyam’s data conversion services help hospitals and clinics freely migrate from one EHR vendor to another without losing any historical patient data. Triyam’s EHR archival product, Fovea, is a vendor-neutral, innovative and intuitive platform for storing all your legacy data. Fovea includes a powerful search engine and extensive reporting for business intelligence and analytics. Triyam is a proud sponsor of Healthcare Scene.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 | Written By


The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which collects data on more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important part in the initial wave of AI application rollouts. The company is a leader in high-performance chipsets popular with players of processor-intensive, high-end games, technology it has recently applied to other processor-intensive work like blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough excess IT talent available to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.

Will UnitedHealth’s New Personal Health Record Make An Impact?

Posted on October 26, 2018 | Written By


Though the idea of a personal health record was a hot thing for a while, it didn’t become the fixture of the healthcare market that pundits had predicted. In fact, as many readers will recall, even deep pockets like Google and Microsoft couldn’t get their users to sign on to their PHRs en masse.

One of the main reasons the PHR model didn’t take is that people simply didn’t want to use them. In fact, at least at the time, the PHR was almost entirely a solution in search of a problem. After all, if a health data power user and patient advocate like myself didn’t want one, what hope did PHR backers have of interesting your average Joe Blow in aggregating their health data online?

Over time, however, the personal health data landscape has changed, with patient records becoming a bit more portable. While consumers still aren’t beating down the doors to get their own PHR, those who are interested in pulling together their medical records electronically have better access to their history.

Not only that, wearables makers like Apple and Fitbit are sweetening the pot, primarily by helping people pull self-generated data into their health record. Arguably, patient-generated data may not be as valuable as traditional records just yet, but consumers are likely to find it more interesting than the jargon-laden text found in provider records.

Given recent developments like these, I wasn’t entirely surprised to learn that UnitedHealth Group is picking up the PHR torch. According to an article in MedCity News, the giant payer plans to launch what sounds like an updated PHR platform next year for its 50 million benefit plan members.

Apparently, on an earnings call last week UnitedHealth CEO Dave Wichmann said that the company will launch a “fully integrated and fully portable individual health record” in 2019. Notably, this is not just a data repository, but rather an interactive tool that “delivers personalized next-best health actions to people and their caregivers.”

The new health record will be based on UnitedHealth’s Rally health and wellness platform, which the insurer picked up when it acquired Audax Health in 2014. The platform, which has 20 million registered users, works to influence members to perform healthy behaviors in exchange for incentive dollars.

Over time, Wichmann said, UHG intends to build Rally into a platform which collects and distributes “deeply personalized” health information to individual members, MedCity reported. The idea behind this effort is to highlight gaps in care and help patients assess the care that they get.  Wichmann told earnings call listeners that the platform data will be packaged and presented to clinicians in a form similar to that used by existing EHRs.

UHG’s plans here are certainly worth keeping an eye on over the next year or two. I have no doubt that the nation’s largest commercial payer has some idea of how to format data and make it digestible by systems like Cerner and Epic.

But while patients have become a bit more familiar with the benefits of having their health data on hand, we’re not exactly seeing consumers stampede the providers demanding their own health record either, and I’m far from convinced that this effort will win new converts.

My skepticism comes partly from first-hand experience. As a recent UnitedHealth beneficiary, I’ve used the Rally application, and I didn’t find it all that motivating. Honestly, I doubt any online platform will make much of an impact on patient health on its own, as the reasons for many health issues are multifactorial and can’t be resolved by handing one of us a few Rally bucks.

Personal gripes aside, though, the bigger question remains whether consumers think they’ll get something valuable out of using the new UHG tool. As always, you can never count on them coming just because you built it.

AMA Releases Great Guide To Digital Health Implementation

Posted on October 25, 2018 | Written By


In the past, I’ve been pretty hard on the AMA when it comes to digital health. Last year I gave the organization a particularly hard time when it rolled out its Physician Innovation Network platform, which is designed to help physicians network directly with health tech firms, as it seemed to be breaking little to no ground.

However, to be fair, the AMA has been a relatively quiet but solid presence in health IT for quite some time. Its health IT efforts include cofounding Health2047, which pairs doctors with established health IT companies to help those companies launch services and products; serving as one of four organizations behind the mHealth app standards venture Xcertia; and managing a student-run biotechnology incubator in collaboration with Sling Health.

But what it hadn’t done, at least to date, was offer physicians any hands-on guidance on using emerging health IT. Now, at long last, the AMA has taken the plunge, releasing a guide focused on helping physicians roll out digital health technology in their practices. This time around, I have to give the organization a high five.

The new guide takes a lifecycle perspective, helping practices work through the digital health implementation process from preparations to rollout to gathering data on the impact of the new technology. In other words, it lays out the process as a feedback loop rather than a discrete event in time, which is smart. And its approach to explaining each step is concise and clean.

One section identifies six straightforward steps for choosing a digital health technology, including identifying a need, defining success early on in the process, making the case for political and financial buy-in, forming the team, evaluating the vendor and executing the vendor contract.

Along the way, it makes the important but often-neglected point that the search should begin by looking at the practice’s challenges, including inefficiencies, staff pain points or patient health and satisfaction problems. “The focus on need will help you avoid the temptation to experiment with new technologies that ultimately will not result in tangible improvements,” the guide notes.

Another section offers advice on tackling more immediate implementation issues, including steps like designing workflows, preparing the care team and partnering with the patient. This section of the report differs from many of its peers by offering great advice on building workflow around remote patient monitoring-specific requirements, including handling device management, overseeing patient enrollment and interactions, and assuring that coding and billing for remote patient management activities is correct and properly documented.

The guide also walks practices through the stages of final implementation, including the nature of the rollout itself, evaluating the success of the project and scaling up as appropriate. I was particularly impressed by its section on scaling up, given that most of the advice one sees on this subject is generally aimed at giant enterprises rather than typically smaller medical practices. In other words, it’s not that the section said anything astonishing, but rather that it existed at all.

All told, it’s great to see the AMA flexing some of the knowledge it’s always had, particularly given that the report is available at no cost to anyone. Let’s hope to see more of this in the future.

Rolling Over Mountains – An Interview with Niko Skievaski, President of Redox

Posted on October 16, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

Over the past year I have been following the success of Redox and I have read many articles about the entrepreneurial journey of their President and Co-Founder, Niko Skievaski. I recently had the chance to sit down with him at the MGMA18 conference in Boston.

Rather than revisit the same questions that have been covered in dozens of other articles, I wanted to go in a different direction. I wanted to learn more about Skievaski-the-person rather than Skievaski-the-entrepreneur, and I wanted to hear Skievaski’s opinion on the state of healthcare as an ecosystem.

The latter is something that we have been investigating here at Healthcare Scene. For more details, see John Lynn’s recent post about MEDITECH’s app development environment (Greenfield) and my article exploring whether EHR companies are difficult to work with.

Skievaski and I had a wide-ranging conversation. I hope you enjoy it.

You and I met briefly at the Redox party at HIMSS18 earlier this year. I just want to thank you for your hospitality.

You’re welcome. We love our taco parties at Redox. I’m glad you enjoyed the fiesta.

I understand that you recently moved from Madison, WI to Boulder, Colorado. Why the move?

I lived in Madison for 10 years. I was working for Epic during that time so it made sense to be there. But I recently decided that I needed a few more mountains in my life, so I moved to Boulder.

All through college I raced mountain bikes and I wanted to get back to that. Madison does have a few rolling hills which are fun to ride down, but there’s no comparison to biking down a mountain. So I moved to Boulder for the mountain biking.

You’re from Canada, right? [Yes] I was up in British Columbia for two months in the summer last year just mountain biking the trails up there. That was my first real experience being in Canada for an extended period of time. It was fun. You guys are really chill up there in Vancouver.

There are many players in the data integration space. Some have been in the business for decades. Why has Redox succeeded in capturing the buzz while others haven’t?

We do things fundamentally differently than existing vendors in the integration space.

In the status quo, you implement an EHR and you need upwards of 400 interfaces to connect it to various other systems in your hospital. So you go out and hire 5-20 interface analysts to sit around all day and code the interfaces you need. You do that a few times, like we did at Epic, and you realize that you are building the same interface over and over again for different health systems. It is literally the same interface.

Redox is based on the premise that you should only have to build an interface once for all healthcare systems. Once it’s built, others can leverage that work too. For example, we connect Brigham and Women’s ADT feed to Redox. We mapped it. We know where all the fields are. And we’ve done the same with hundreds of other health systems. So if there is any reason that Brigham wants to share their info with any of those other health systems, we can facilitate it very easily.
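To make the idea concrete, here is a minimal, hypothetical sketch of what “mapping an ADT feed” involves: parsing an HL7 v2 ADT message and normalizing its patient fields into plain JSON. The sample message and the output field names are invented for illustration; real feeds vary from one health system to the next, which is exactly why doing the mapping once is valuable.

```python
# Toy sketch of normalizing an HL7 v2 ADT^A01 message into JSON.
# Segments are separated by carriage returns, fields by "|", and
# components by "^" (standard HL7 v2 delimiters).
import json

def parse_adt(message: str) -> dict:
    """Normalize the PID segment of an HL7 v2 message into a dict."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    pid = segments["PID"]
    # PID-3 holds the patient identifier list, PID-5 the name (Last^First),
    # PID-7 the date of birth.
    last, first = pid[5].split("^")[:2]
    return {
        "identifier": pid[3].split("^")[0],
        "firstName": first,
        "lastName": last,
        "birthDate": pid[7],
    }

adt = ("MSH|^~\\&|EHR|HOSPITAL|||20181101||ADT^A01|123|P|2.3\r"
       "PID|1||4567^^^MRN||Doe^Jane||19800101|F")

patient = parse_adt(adt)
print(json.dumps(patient))
```

Every health system lays out these fields a little differently, so a real mapping layer is far more involved, but the shape of the work is the same: HL7 in, normalized JSON out.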

Legacy players didn’t grow up in the cloud so they don’t think like we do. They come from a world of on-premise integration and at a time when healthcare organizations wanted to do all the interface work themselves. It’s a different world now.

I guess you can say that we’re getting the attention because we are solving the problem so differently than everyone else.

One of the interesting things about Redox is that you don’t sell to healthcare organizations. Instead you focus exclusively on HealthIT vendors. Why is that?

We started by working with HealthIT startups that knew how to build in the cloud but didn’t know anything about HL7 and didn’t want to. Yet these companies needed to connect to their customers’ EHR systems.

Without that integration, healthcare organizations wouldn’t buy these amazing cloud apps because of the lack of easy connectivity to their existing systems. In that equation, the incentive lies with the HealthIT company. They are the ones that want to solve the issue of connectivity more than the healthcare organization does. So we target companies that need this help, and we go to their customers, get connected to the data and make it easy for the new company to focus on what they do best – which isn’t data integration.

The first project we do with a health system is very much like a standard integration project. The second project is where things get exciting because we use the exact same interface we built the first time. There’s really no work to be done by the organization. That’s how we scale.

Is there an ideal type of HealthIT company that Redox likes to work with?

With certain vendors who have the right multi-tenant architecture, like PointClickCare, we can connect with them once and they can then provision to their customers with the flip of a switch. Any PointClickCare location that wants integration can just click and make it happen. Together we make it very easy for a PointClickCare customer to connect with HIEs and the healthcare organizations that they work with.

Basically any HealthIT vendor that is truly cloud-based and that has embraced the concept of having a single platform for everyone is an ideal fit for Redox. Of course, we’re willing to talk to anyone to try and find a solution, but if you are a cloud-based HealthIT vendor we should really be talking.

Can you give me an example of an advantage Redox enjoys because you are cloud-based?

By being in the cloud we essentially become the cloud interface for health systems to connect to cloud apps. Vendors come to us because we make it easy for them to get the data they need. Healthcare organizations push cloud vendors they want to work with to us because they won’t have to do any work to connect that new app if that vendor signs on with Redox.

Where things get really interesting, and exciting for Redox, is when we can use our cloud platform to facilitate conversations between vendors and their common customers without the need to go all the way back to that customer’s EHR as the focal point of integration.

For example, say there is a cloud-based scheduling app that allows patients to see and book appointments online. Let’s say they are a Redox customer. Now let’s say there is a telemedicine app that allows healthcare organizations to offer telehealth visits and it reads/writes appointment data directly into the organization’s EHR. Say this telemedicine company is a Redox customer too. If the healthcare org wants to offer telemedicine appointments through that scheduling app, the two companies can just integrate through Redox rather than use the EHR as the point of integration, because we have all the necessary information running through our platform. This would speed up the transaction and make the patient experience more seamless.

This level of integration is just not possible without being in the cloud.
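For readers who like a picture in code, the scheduling/telemedicine scenario can be sketched as a tiny publish/subscribe hub. Everything here (the event name, the payload shape) is invented for illustration and is not Redox’s actual API; it simply shows two vendors exchanging a normalized event without the EHR in the loop.

```python
# Toy message hub: a scheduling vendor publishes a normalized appointment
# event and a telemedicine vendor receives it directly, with no EHR
# round-trip in between.
from collections import defaultdict

class Hub:
    """Vendors subscribe to event types; publishes fan out to subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.subscribers[event_type]:
            handler(payload)

hub = Hub()
booked = []

# The telemedicine vendor listens for bookings made in the scheduling app.
hub.subscribe("Scheduling.New", booked.append)

# The scheduling app publishes a normalized appointment; the EHR is not involved.
hub.publish("Scheduling.New", {
    "visitType": "telehealth",
    "patientId": "4567",
    "start": "2018-11-01T09:00:00Z",
})

print(booked[0]["visitType"])
```

The design point is that once both vendors speak the hub’s normalized format, adding a third party to the conversation is a subscription, not a new integration project.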

One of the topics we have explored recently at Healthcare Scene is how difficult it is (or isn’t) to work with EHR companies like Epic, Cerner and Allscripts. What are your thoughts on this? Are EHR companies hard to work with?

I would say, in general, EHR companies get a bad rap. I worked at Epic and I have to say that being inside Epic you don’t realize that people outside think you are difficult to work with. We worked hard to give our customers good service. Epic supports their customers, which are health systems. If a system wants to integrate with an application, then Epic people are more than happy to make it happen. They will put together a project team to support that initiative.

I think that as long as the health system is driving the conversation, EHR companies can be easy to work with.

The challenging part is when there is no customer in between. Say you are a HealthIT vendor and you want to go strike up a deal with an EHR company, like Epic. You have to realize that it’s nearly impossible for that EHR company to assess you as a HealthIT vendor. They can’t tell if you are a good vendor or a bad one, an established player or someone with an idea on the back of a napkin. The only way they can tell is if they go ask their customers – the health systems. Because of this, their traditional response has been: “Yes, happy to work with you, but we need to have one of our customers on board to prove this will work.” This can be perceived as being difficult to work with.

When we started Redox we didn’t go immediately knocking on Epic’s door and asking our friends to partner with us. Instead we went out and found a mutual customer to work with so that we would have a proof point when we did approach them.

I actually think it is easier to work with large EHR companies versus smaller ones. The larger companies have more invested in each of their customers and are more apt to work on projects that their customers want to do. Smaller EHR companies are constrained by resources and often don’t have the infrastructure to support integration projects in a timely manner. The good news is that things are changing. We’re seeing a lot more of the small EHR companies come out with developer programs, APIs and partner exchanges. I think they understand the need for their systems to be open.

Is the lack of interoperability a technological issue or is it simply an unwillingness to collaborate?

Neither. It’s a business model problem.

There is no business model that drives healthcare organizations to share their data. No one bats an eye about the lack of interoperability in the consumer world. Walmart doesn’t share its customer data with Target even though many people buy from both retailers. If they did share data, they would just be stealing each other’s customers. Healthcare organizations are in competition with each other, so they aren’t really incentivized to share data with each other, but give them a useful app in between and all of a sudden they will open up their data.

Interoperability is the right thing to do, but it’s a hard thing to do.

What do you wish you could do with an EHR company that you cannot do today?

The user interfaces (UIs) of EHRs are locked down. I wish EHR companies were more open to changing workflows or adding buttons to their UIs to make things more seamless.

I totally understand why they don’t allow it. The workflow in an EHR has an impact on patient safety as well as on outcomes, so you wouldn’t want just any vendor to be able to make UI changes on a whim. But it would be great if there was a way to do something with the UI to make it easier for the end user.

For example, if you are doing something in the workflow, it would be fantastic if you could add a button to the UI that launched a 3rd party app from within the EHR. Say a clinician is doing a chart review and they want to be able to see the latest data from a remote patient monitoring tool. Imagine if that clinician could click a button and launch the actual monitoring app rather than that app having to ship its data to the EHR and have it stored/rendered in a poor format – like a table of numbers or a rudimentary chart. Why not let the native app show the data in all its glory using an interface designed specifically for it?

What’s next for Redox?

We want to push the healthcare industry to a point where we don’t even think about integration anymore. We want to see an end to integration projects. Think about all the time and resources that would be saved if you didn’t have to build a custom interface each time. If we can do that we can drive down the cost of healthcare for everyone. To do that we just have to keep growing the nodes on our network and be a good partner to everyone.


This may sound like a tall order, but maybe not for someone who rolls over mountains on a bike for fun.

[Update: Niko Skievaski’s title was incorrectly reported as CEO. Skievaski is Redox’s President and Co-Founder.]

Is FHIR Adoption At A Turning Point, Or Is This Just More Hype?

Posted on October 8, 2018 | Written By Anne Zieger

Over the last few years, healthcare industry players have continued to experiment with the use of HL7 FHIR to solve key interoperability problems.

Perhaps the most recent effort to do so is the Da Vinci Project, which brings together a group of payers, health IT vendors, and providers dedicated to fostering value-based care with FHIR. The group has begun work on two test cases, one addressing 30-day medication reconciliation and the other coverage requirements discovery.

This wasn’t big news, as it doesn’t seem to be doing anything that new. In fact, few if any of these projects, of which there have been many, have come close to establishing FHIR firmly as a standard, much less fostering major change in the healthcare industry.

Now, a new analysis by the ONC suggests that we may finally be on the verge of a FHIR breakthrough.

According to ONC’s research, which looked at how health IT developers used FHIR to meet 2015 Edition certification requirements, roughly 32% of the health IT developers certified are using FHIR Release 2, and nearly 51% of health IT developers seem to be using a version of FHIR combined with OAuth 2.0.

While this may not sound very impressive (and at first glance, it didn’t to me), the certified products issued by the top 10 certified health IT developers serve about 82% of hospitals and 64% of clinicians.

Not only that, big tech companies staking out an expanded position in healthcare are leveraging FHIR 2, the ONC notes. For example, Apple is using a FHIR-based client app as part of its healthcare deployment.  Amazon, Alphabet, and Microsoft are working to establish themselves in the healthcare industry as well, and it seems likely that FHIR-based interoperability will come to play a part in their efforts.

In addition, CMS has shown faith in FHIR, investing in it through Blue Button 2.0, a standards-based API allowing Medicare beneficiaries to connect their claims data to applications, services, and research programs.
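For those curious what “using FHIR” looks like in practice: FHIR resources are plain JSON documents fetched over REST, with OAuth 2.0 supplying the bearer token, which is the FHIR-plus-OAuth combination the ONC figures describe. Below is a minimal offline sketch using the R4 shape of a Patient resource; the resource content is made up, and a real client app would fetch it from a server rather than build it locally.

```python
# Offline sketch of a FHIR Patient resource (R4 shape) and how a client
# app would read demographics out of it. A live app would retrieve the
# same JSON over REST with an OAuth 2.0 Bearer token.
import json

patient_json = json.dumps({
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-01-01",
})

resource = json.loads(patient_json)
assert resource["resourceType"] == "Patient"

# In R4, name.family is a single string and name.given is a list.
name = resource["name"][0]
display = f"{name['given'][0]} {name['family']}"
print(display)  # Jane Doe
```

The point of the standard is that the same handful of lines works against any conformant server, which is what would make the adoption numbers above matter.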

That being said, after citing this progress, the agency concedes that FHIR still has a way to go, from standards development to implementation, before it becomes the lingua franca of the industry. In other words, ONC’s definition of “turning point” may be a little different than yours or mine. Have I missed something here?

Look, I don’t like being “that guy,” but how encouraging is this really? By my standards, at least, FHIR uptake is relatively modest for such a hot idea. For example, compare FHIR adoption to that of AI technology or blockchain. Interoperability may be a harder “get” than blockchain or AI in some ways, but one would think it would be further along if it were completely practical. Maybe I’m just a cynic.

Patient Billing And Collections Process Needs A Tune-Up

Posted on October 1, 2018 | Written By Anne Zieger

A new study from a patient payments vendor suggests that many healthcare organizations haven’t optimized their patient billing and collections process, a vulnerability which has persisted despite their efforts to crack the problem.

The survey found that while the entire billing and collections process was flawed, respondents said that collecting patient payments was the toughest problem, followed by the need to deploy better tools and technologies.

Another issue was the nature of their collections efforts. Sixty percent of responding organizations use collections agencies, an approach which can establish an adversarial relationship between patient and provider and perhaps drive consumers elsewhere.

Yet another concern was long delays in issuing bills to patients. The survey found that 65% of organizations take more than 60 days on average to collect patient payments, and 40% wait more than 90 days.

These results align with other studies of patient payments, all of which echo the notion that the patient collection process is far from what it should be.

For example, a study by payment services vendor InstaMed found that more than 90% of consumers would like to know what their payment responsibility is prior to a provider visit. Worse, very few consumers even know what their deductible, co-insurance and out-of-pocket maximums are, making it more likely that they will be hit with a bill they can’t afford.

As with the Cedar study, InstaMed’s research found that providers are waiting a long time to collect patient payments, with three-quarters of organizations waiting a month to close out patient balances.

Not only that, investments in revenue cycle management technology aren’t necessarily enough to kickstart patient payment volumes. A survey done last year by the Healthcare Financial Management Association and vendor Navigant found that while three-quarters of hospitals said that their RCM technology budget was increasing, they weren’t necessarily getting the ROI they’d hoped to see.

According to the survey, 77% of hospitals with fewer than 100 beds and 78% of hospitals with 100 to 500 beds planned to increase their RCM spending. Their areas of investment included business intelligence analytics, EHR-enabled workflow or reporting, revenue integrity, coding and physician/clinician documentation options.

Still, process improvements seem to have had a bigger payoff. These hospitals are placing a lot of faith in revenue integrity programs, with 22% saying that revenue integrity was a top RCM focus area for this year. Those who had already put such a program in place said that it offered significant benefits, including increased net collections (68%), greater charge capture (61%) and reduced compliance risks (61%).

As I see it, the key takeaways here are that making sure patients know what to expect financially and putting programs in place to improve internal processes can have a big impact on patient payments. Still, with consumers financing a lot of their care these days, getting their dollars in the door will continue to be an issue. After all, you can’t get blood from a stone.