
How to Introduce Microservices in a Legacy Healthcare Environment

Posted on May 31, 2018 | Written By

The following is a guest blog post by Nick Vennaro, Co-founder of Capto Consulting.

Healthcare as a whole is finding new ways to use technology to improve population health and patient experience. Population health requires a spectrum of precision in patient and provider data, as well as clinical cost metrics, and the ability to match that data to patient communications and clinical outcomes. Patient experience requires delivering information that is both timely and personalized, which is hard to accomplish with monolithic systems.

A monolithic system is usually one that has grown over many years and performs numerous functions that are not architecturally separated. These systems tend to be brittle and not easily changed. The proliferation of mergers and acquisitions in healthcare further exacerbates the complexity of operating multiple monolithic systems within a healthcare network. It is not unheard of to run 5, 8 or even 12 billing systems in parallel, because consolidating them would take so much time that it is more cost effective to let them operate individually.

An increasingly popular architectural style known as microservices is much better equipped to help healthcare organizations move forward rapidly than the current monolithic, unstructured and difficult-to-maintain systems. While no firm consensus exists on how to define microservices, it's generally agreed that they form an architectural pattern composed of loosely coupled, autonomous, and independently deployable services that communicate using a lightweight mechanism such as HTTP/REST.
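To make that concrete, here is a minimal sketch of what one narrowly scoped service might look like, using only the Python standard library. The "patient demographics" service, its endpoint and the sample data are hypothetical, and a production service would add authentication, logging and a real datastore.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative in-memory data; a real service would own its own datastore.
PATIENTS = {"123": {"id": "123", "name": "Jane Doe", "dob": "1980-04-02"}}

class DemographicsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /patients/123
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "patients" and parts[1] in PATIENTS:
            body = json.dumps(PATIENTS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Each service can be deployed, scaled and replaced independently.
    HTTPServer(("0.0.0.0", 8080), DemographicsHandler).serve_forever()

The point is the shape rather than the code: the service owns one narrow capability, exposes it over HTTP/JSON, and can be changed without redeploying anything else.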

Now is the time for healthcare organizations to be investigating how best to introduce microservices in their legacy environments if they expect to realize a digital transformation. This is particularly important to enterprises that need to make frequent changes to their systems and where time-to-market is paramount.

The benefits and potential hurdles associated with adopting microservices are well documented. On the plus side, the modular and independent nature of microservices enables improvements in efficiency, scalability, speed and flexibility—all the features a nimble healthcare enterprise requires.  Detractors, however, frequently point to management and security challenges, especially when they pertain to customer-facing applications and services.   These challenges can be overcome with due diligence and planning.

Like virtually all technology decisions, it's critical to balance risk with reward, and when it comes to microservices that means embracing an evolutionary approach and process. After all, lessons can be learned from both success and failure, and an incremental adoption of microservices can increase product and service quality, make systems more resilient and secure, and drive revenue growth. This blog post will explain how business and technology leaders can smoothly and successfully introduce microservices in a legacy environment.

It’s all about the monkey

A key requirement of microservices design is to focus service boundaries around application business boundaries. A keen awareness and understanding of service and business boundaries helps right-size services and keeps technology professionals focused on doing one thing and doing it very well.

Astro Teller, the “Captain of Google Moonshots,” humorously advocates that companies “tackle the monkey first,” meaning they should avoid spending all of their resources on the easy stuff and instead start by addressing the hard problems. When deploying microservices in a large, established environment, the monkey is understanding and decomposing the legacy systems.

Decompose the legacy environment by identifying seams

In his book, “Working Effectively with Legacy Code,” Michael Feathers presented the idea of a seam as a way to identify portions of code that can be modified without affecting the rest of the code base. This notion of seams can be extended as a method to divide a monolithic system into bounded contexts from which services can be quickly and seamlessly created.
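As a rough illustration of what a seam looks like in code, the hypothetical Python sketch below puts an abstraction between a checkout workflow and the billing logic it calls. The class and function names are invented for this example; the idea is simply that the implementation behind the seam can later be swapped for a new service without touching its callers.

from abc import ABC, abstractmethod

class BillingGateway(ABC):
    # The seam: callers depend on this abstraction, not on the legacy code.
    @abstractmethod
    def submit_claim(self, claim: dict) -> str:
        ...

class LegacyBillingGateway(BillingGateway):
    def submit_claim(self, claim: dict) -> str:
        # Wraps the existing monolith's billing call (stubbed here).
        return f"LEGACY-{claim['id']}"

class BillingServiceClient(BillingGateway):
    def submit_claim(self, claim: dict) -> str:
        # Later, the same seam can point at a new billing microservice.
        return f"SVC-{claim['id']}"

def check_out_patient(claim: dict, billing: BillingGateway) -> str:
    # This caller never changes when the implementation behind the seam does.
    return billing.submit_claim(claim)

print(check_out_patient({"id": "42"}, LegacyBillingGateway()))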

Uncovering seams in applications and building bounded contexts is an important first step in breaking down the monolith. Here are two steps to identify seams:

  • Interview domain experts. This is a key step to learning where the seams are and identifying bounded contexts. Having domain experts who understand what the business should be doing, not just what the system currently does, is critically important.
  • Understand the organizational structure. Often, the organizational structure will provide clues to where the seams can be found.

Once the boundaries are identified, along with the programming language and environment that support them, the next step is to create packages and sub-packages that contain these bounded contexts. This structure allows a careful analysis of package usage and dependencies, which is essential for understanding the code quickly and for ensuring it is properly tested and instrumented.
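As one hedged example of what that dependency analysis might look like, the short Python sketch below walks a source tree and flags imports that cross context boundaries. The context package names ("billing", "scheduling", "demographics") and the "src" directory are placeholders; real projects would likely lean on existing tooling, but the goal is the same: make cross-context coupling visible.

import ast
import pathlib

CONTEXTS = {"billing", "scheduling", "demographics"}  # assumed package names

def cross_context_imports(root):
    # Yield (file, owning context, imported context) for boundary-crossing imports.
    for path in pathlib.Path(root).rglob("*.py"):
        owner = next((c for c in CONTEXTS if c in path.parts), None)
        if owner is None:
            continue
        tree = ast.parse(path.read_text())
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                names = [alias.name for alias in node.names]
            elif isinstance(node, ast.ImportFrom):
                names = [node.module or ""]
            else:
                continue
            for name in names:
                target = name.split(".")[0]
                if target in CONTEXTS and target != owner:
                    yield (str(path), owner, target)

if __name__ == "__main__":
    for path, owner, target in cross_context_imports("src"):
        print(f"{path}: {owner} depends on {target}")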

Healthcare is a prime candidate for using microservices to find the seams and decompose the monolithic infrastructure. This approach allows modernization, and the merging of technologies, without a complete and disruptive overhaul of the monolith all at once. It gives the healthcare organization more flexibility and a greater ability to compete on many levels, and it is a relatively fast route to more agile delivery of population health and patient experience.

About Nick Vennaro
Nick Vennaro is cofounder of Capto, a management consulting firm. Nick has more than 25 years of experience leading enterprise-scale technology and business management initiatives for Fortune 500 companies. Nick will be presenting May 31 at the Healthcare IT Expo on “Using Outcomes-based Contracts to Increase Performance and Innovation.”

Legacy Health IT Systems – So Old They’re Secure

Posted on April 21, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’ve been thinking quite a bit about the ticking time bomb that is legacy healthcare IT systems. The topic has been top of mind for me ever since Galen Healthcare Solutions wrote their Tackling EHR & EMR Transition series of blog posts. This is an important topic even if it’s not a sexy one.

I don’t think we need to dive into the details of why legacy healthcare IT systems are a security risk for most healthcare organizations. Hospitals and health systems have hundreds of production systems that they’re trying to keep secure. It’s not hard to see why legacy systems get forgotten. Forgotten systems are ripe targets for hackers and others who want to do nefarious things.

That said, I recently heard someone talking about legacy health IT systems who said they had some technology in their organization that was so old it was secure again. I guess there’s something to be said for having systems so old that hackers don’t have tools that can breach them or read their files. Not to mention that many of these older systems weren’t internet connected.

While I find humor in the idea that something could be so old that it’s secure again, that’s not the reality for most legacy systems. Most old systems can be breached and will be breached if they’re not considered “production” when it comes to patching and securing them.

When you think about the cost of updating and securing your legacy systems the same way you would a production system, it’s easy to see why finding a way to sunset these legacy systems is becoming a popular option. Sure, you have to find a way to maintain the integrity of the data, but the tools to do this have come a long way.

The other reason I like the idea of migrating data from a legacy system and sunsetting the old system is that this often opens the door for more users to access the legacy data. When the data is stored on the legacy system, it’s generally not used unless it’s absolutely necessary. If you migrate that legacy data to an archival platform, then the data can be used by more people to influence care. That’s a good thing.
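For illustration only, here is a hedged sketch of what one migration step with an integrity check might look like: records are exported from a legacy store to an archive file along with a checksum manifest, so the archived data can be verified after the old system is retired. The database, table and file names are hypothetical, and real archival platforms handle far more than this.

import csv
import hashlib
import json
import sqlite3

def archive_table(db_path, table, out_csv, manifest_path):
    # Export every row to CSV and record a per-row hash plus a row count.
    conn = sqlite3.connect(db_path)
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    digests = []
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(cols)
        for row in cur:
            writer.writerow(row)
            digests.append(hashlib.sha256(repr(row).encode()).hexdigest())
    # The manifest lets auditors confirm nothing was lost or altered in transit.
    with open(manifest_path, "w") as f:
        json.dump({"table": table, "row_count": len(digests), "row_hashes": digests}, f)
    conn.close()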

Legacy health IT systems are a challenge that isn’t going to go away. In fact, it’s likely to get worse as we transition from one software system to the next. Having a strategy for these legacy systems that ensures security and compliance, and that extracts value from the data, is going to be a key to success for every healthcare organization.