
Providers Tell KLAS That Existing EMRs Can’t Handle Genomic Medicine

Posted on November 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Providers are still in the early stages of applying genomics to patient care. However, at least among providers that can afford the investment, clinical genomics programs are beginning to become far more common, and as a result, we’re beginning to get a sense of what’s involved.

Apparently, one of those things might be creating a new IT infrastructure which bypasses the provider’s existing EMR to support genomics data management.

KLAS recently spoke with a number of providers about the vendors and technologies they were using to implement precision medicine. Along the way, it gathered information on best practices that other providers can use to roll out their own programs.

In its report, “Precision Medicine Provider Validations 2018,” KLAS researchers assert that while precision medicine tools have become increasingly common in oncology settings, they can be useful in many other settings.

Which vendors they should consider depends on what their organization’s precision medicine objectives are, according to one VP interviewed by the research firm. “Organizations need to consider whether they want to target a specific area or expand the solutions holistically,” the VP said. “They [also] need to consider whether they will have transactional relationships with vendors or strategic partnerships.”

Another provider executive suggests that investing in specialty technology might be a good idea. “Precision medicine should really exist outside of EMRs,” one provider president/CEO told KLAS. “We should just use software that comes organically with precision medicine and then integrated with an EMR later.”

At the same time, however, don’t expect any vendor to offer you everything you need for precision medicine, a CMO advised. “We can’t build a one-size-fits-all solution because it becomes reduced to meaninglessness,” the CMO told KLAS. “A hospital CEO thinks about different things than an oncologist.”

Be prepared for a complicated data sharing and standardization process. “We are trying to standardize the genomics data on many different people in our organization so that we can speak a common language and archive data in a common system,” another CMO noted.

At the same time, though, make sure you gather plenty of clinical data with an eye to the future, suggests one clinical researcher. “There are always new drugs and new targets, and if we can’t test patients for them now, we won’t catch things later,” the researcher said.

Finally, and this will be no surprise, brace yourself for massive data storage demands. “Every year, I have to go back to our IT group and tell them that I need another 400 terabytes,” one LIS manager told the research firm. “When we are starting to deal with 400 terabytes here and 400 terabytes there, we’re looking at potentially petabytes of storage after a very short period of time.”
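The LIS manager's math is easy to check. A quick back-of-the-envelope sketch (assuming the 400 TB/year figure quoted above and 1 PB = 1,000 TB; the growth rate is the article's, not a general benchmark):

```python
# Back-of-the-envelope sketch of genomic data storage growth,
# using the 400 TB/year rate quoted by the LIS manager.
TB_PER_YEAR = 400

def cumulative_tb(years: int) -> int:
    """Total accumulated storage in TB after the given number of years."""
    return TB_PER_YEAR * years

for year in (1, 3, 5):
    total = cumulative_tb(year)
    print(f"Year {year}: {total} TB ({total / 1000:.1f} PB)")
```

At this rate an organization crosses the petabyte threshold in its third year, which matches the manager's "very short period of time."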

If you’re like me, you find the suggestion that providers need to build a separate infrastructure outside the EMR for a precision medicine program pretty surprising, but that seems to be the consensus. Almost three-quarters of providers interviewed by KLAS said they don’t believe their EMR will have a primary role in the future of precision medicine, with many suggesting that, as a result, their EMR vendor won’t be viable going forward.

I doubt that this will be an issue in the near term, as the barriers to creating a genomics program are high, especially the capital requirements. However, if I were Epic or Cerner, I’d take this warning seriously. While I doubt that every provider will manage their own genomics program directly, precision medicine will be part of all care at some point and is already having an influence on how a growing number of conditions are treated.

Cybersecurity Confidence and Cybersecurity Maturity

Posted on November 21, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Cybersecurity is the number one topic on most healthcare CIOs’ minds. It’s the number one thing that keeps them up at night. No doubt, it’s become one of the most challenging parts of their job.

These facts were illustrated really well in this chart that CIO David Chou shared on CIOs’ self-reported confidence in IT security.

There’s been a drop in security trust in almost every industry, but the drop in healthcare’s trust in IT security is dramatic. As David Chou mentions, it’s likely due to all the incidents of ransomware and malware that have been all over healthcare.

What then can an organization do to improve this situation? What’s the right approach to improving your confidence in your IT security?

David Chou also offered a great response to these questions in this cybersecurity maturity chart, along with the key to successfully implementing it:


There’s little doubt that effective cybersecurity takes the entire organization being on board. It can’t just be the job of the CIO, CEO, or CISO. If it is, the effort will fail and a breach will occur.

Looking at this chart, how is your organization doing on cybersecurity? How mature are your efforts? Is there room to improve?

Healthcare Interoperability is a Joke

Posted on November 20, 2018 | Written By

John Lynn

Did you see the big news last month about healthcare interoperability? That’s right, Carequality announced support for FHIR. Next thing you know, we’re going to get an announcement that CommonWell is going to support faxing.

Seriously, healthcare interoperability is a joke.

The reality is that no EHR vendor wants to do interoperability. And it’s not saying anything groundbreaking to point out that Carequality and CommonWell are both driven by the EHR vendors. Unfortunately, I see these organizations as almost a smokescreen that lets EHR vendors avoid being interoperable while still claiming they’re working on interoperability.

I’d describe current interoperability efforts as a “just enough” approach. EHR vendors want to do just enough to appease the calls for interoperability from the government and patient organizations. It’s not a real effort to be interoperable. That’s most EHR vendors. A few of them are even using interoperability as a weapon to keep competing vendors out, and some are looking at interoperability as a new business model.

Just to be clear, I’m not necessarily blaming the EHR vendors. They’re doing what their customers are asking them to do, which is their highest priority. Until their customers ask for interoperability, it’s not going to happen. And in many respects, their customers don’t want interoperability. That’s been the real problem with interoperability since the start, and it’s why grand visions of interoperability are unlikely to happen. Micro interoperability, which is how I’d describe what’s happening today, is happening and will continue.

If EHR vendors really cared about being interoperable, they’d spend the time to see where interoperability would lower costs, improve care, and provide a better patient experience. That turns out to be a lot of places. Then, they’d figure out how to make that possible while keeping it secure and safe. Instead, they don’t really do this. The EHR vendors just follow whatever industry standard is out there so they can say they’re working on interoperability. Ironically, many experts say that the industry standards aren’t standard and won’t really make a big impact on interoperability.

There are no leaders in healthcare interoperability. There are just followers of the “just enough” crowd.

Let’s just be honest about what’s really possible when it comes to EHR vendors and healthcare interoperability. There are some point-to-point use cases that are really valuable and already happening (this feels like what FHIR is doing to me). In large health systems, we’re seeing some progress on interoperability within the organization. We’re starting to see inklings of EHR vendors opening up to third-party providers, but that still has a long way to go. Otherwise, we’re exchanging CCDs, faxes, and lab results.
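Part of FHIR's appeal for those point-to-point use cases is that clinical data travels as plain JSON resources over ordinary HTTP. As a rough illustration, here is what one such payload looks like (the field names follow the published FHIR R4 Patient resource; the patient data and the server URL in the comment are invented for the example):

```python
import json

# A minimal FHIR R4 Patient resource. The field names ("resourceType",
# "name", "birthDate") come from the FHIR specification; the values
# below are made up for illustration.
patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-04-01",
}

# In a real point-to-point exchange, this JSON would be fetched from
# or posted to a FHIR server endpoint such as https://<server>/fhir/Patient.
# Here we just round-trip it through serialization to show the shape.
payload = json.dumps(patient)
received = json.loads(payload)
print(received["resourceType"], received["name"][0]["family"])
```

Whether payloads like this add up to real interoperability, rather than another "just enough" checkbox, is exactly the question raised above.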

Will we see anything more beyond this from EHR vendors? I’m skeptical. Let me know what you think in the comments or on Twitter with @HealthcareScene.

Process Re-engineering Can Produce Results, Lumeon Finds

Posted on November 19, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

A rigorous look at organizational processes, perhaps bolstered by new technology, can produce big savings in almost any industry. In health care, Lumeon finds that this kind of process re-engineering can improve outcomes and the patient experience too: the very Triple Aim cited as goals by health care reformers.

A bad process, according to Robbie Hughes, founder and CEO of Lumeon, can be described as “the wrong people have the wrong information at the wrong time.” One example is a surgery unit that Lumeon worked with on scheduling surgeries. The administrative staff scheduled the surgeries based on minimal contact with the clinicians, a common practice throughout the industry that might seem efficient. Unfortunately, people who are uninformed about the clinical aspects of a surgery make sub-optimal plans, directly leading to poorer outcomes. The administrative staff don’t use rooms and other resources effectively, and stumble over risks that the clinicians could have warned them about. Lumeon uncovered the problem during a single morning meeting with this particular hospital. By enabling the clinicians to coordinate better with the scheduling staff, the surgery unit more than doubled its presurgical screening capacity without asking for increased funding.

I recently wrote about a dispute over patient loads that erupted into a major political controversy (rarely a formula for rational process engineering). Thus, when talking to Hughes, I was sensitized to the importance of good processes. The health care field is stuck in the kind of blindness toward process seen in the fictional medieval setting of Monty Python’s Jabberwocky, but some of the more forward-thinking institutions are doing the hard work of streamlining their processes. These include:

  • Cleveland Clinic, which reorganized their recommendations for patient behavior before and after surgery, called Enhanced Recovery After Surgery (ERAS)
  • BUPA, a major British insurer that has a formal process model
  • U.S. giant Kaiser Permanente, which uncovered enormous waste when clinicians search for supplies

The higher you rise above the scene, and the more you can think about the system rather than one silo, the more efficient you can become. The Kaiser inquiry covered the entire supply chain for each hospital. BUPA is fortunate to possess actuarial information that helps it assign a predicted cost and likely outcomes to cancer cases, so the company can assign caretakers to patients as needed throughout the whole recovery process.

Another useful scope is the sequence leading from a patient’s initial contact to a successful outcome, a process or “pathway” that goes far outside the hospital’s walls and beyond the time in the doctor’s office or surgical unit.

Typically, Hughes says, one day is enough to find process improvements. Through interviews and observation (because staff often misunderstand and misrepresent their own processes), Lumeon can develop a process map, expressed visually like the post-operative pathway in the following figure.

Typical pathway, describing post-operative process


The best motivation for taking a longitudinal view, of course, is risk-sharing. A doctor who will be rewarded or penalized for outcomes will be willing to invest in producing better outcomes. Similarly, an insurer such as BUPA will be motivated to reduce readmissions if it has a long-term responsibility for patients. Bundled payments are a round-about, highly diluted approach to risk-sharing.

Fee-for-service models mean having to define a deliverable that everybody can understand and achieve, and a bundled payments model is far from this. UK outcome measures truly place risk on the provider; in the US, bundled payments dilute it.

But Lumeon can find ways to improve processes even within a fee-for-service model by enabling health organizations to guide patients more successfully through their entire health journey. For instance, with the company’s Care Pathway Management solution, doctors can remind patients to come back in five years for a colonoscopy, thus potentially saving lives while ensuring the institution’s own revenue stream under fee-for-service. Other simple goals can be to make sure the patient has a complete list of tasks prior to surgery (such as not to drink water in the morning) in order to eliminate late starts or last-minute cancellations, which are very expensive as well as frustrating.

Predictably, Lumeon finds a certain set of common problems over and over, regardless of medical disciplines or institutions. Hospitals sometimes optimize within each department, but not across multiple departments. Usually this change comes down to maximizing compliance with a known protocol, rather than trying to use sophisticated artificial intelligence techniques to look for new approaches that theoretically offer benefits.

Lumeon also works to minimize disruptions to existing workflows. Large institutions such as Kaiser can tell everybody to adopt a whole new way of doing things, but staff within most institutions might be more resistant. The staff can still be trained to do things like create quality standards and follow them, or call patients at certain intervals or after a procedure, but these processes need training before they become reliable and predictable. Culture and habit, not technology, turn out to be the biggest barriers to process improvement.

Software, too, must be molded to current ways of working. We all have little tolerance, in our work or everyday lives, for non-intuitive computer interfaces that appear to put barriers in our way. For instance, I have never forgiven my phone vendor for changing the most common activity I do on the device (turning airplane mode on and off) from a three-step process to an eight-step process.

The most effective persuasion is evidence-based. If an institution can get one department or doctor to adopt a new process, and can then collect data showing that it improves outcomes and cuts costs, other departments are likely to follow along. In contrast, staff are likely to be oblivious to a study from a journal with statistics from clinical trials, no matter how scientifically valid the study may be. Hughes says that resistance to change is often attributed to doctors, but he thinks that this resistance is primarily caused by change being forced on them without evidence. With proper, objective data supporting a change, doctors are often the first to lead new initiatives in the spirit of delivering better patient care.

New kinds of records are needed to keep track of outcomes and make use of the valuable data they provide. Ideally, Lumeon would integrate with electronic medical records, but the EMRs are rarely set up to hold and provide such information. Instead, Lumeon installs software on top of the EMR, calling their addition an “agility layer.”

Hughes identified two common practices that can interfere with process improvement. The first is the growing focus on “patient engagement,” which can be as superficial as sending reminders for online check-ins or as fundamental as giving patients access to data.

However, patient engagement by itself is not sufficient to deliver meaningful process improvement. Patient engagement measures can make a difference as an integral part of an effective operational process. For instance, there is no point in getting patients to fill in data online if it’s not going to be used by the clinicians.

Second, the focus on compliance with standards, such as meaningful use, often becomes a documentation exercise rather than a way of improving care. Unfortunately, this problem is seen all over the world when well-intentioned governments and funders offer incentives for good behavior by paying for better processes. All too often this ends in additional cost and effort to administer the care, rather than a focus on the basics.

New INFRAM Model Creates Healthcare Infrastructure Benchmarks

Posted on November 14, 2018 | Written By

John Lynn

During the frenzy of healthcare organizations rushing to implement EHRs and chase government money, it was amazing to see so many other projects get left behind. One of the biggest casualties was investment in IT infrastructure. All the budget was going to the EHR, so infrastructure budgets often got cut. There were some exceptions where upgrades to infrastructure were needed to make the EHR work, but most organizations I know chose to limp along on their existing infrastructure and put that money toward the EHR.

Given this background, I was quite intrigued by the recent announcement of HIMSS Analytics’ INFRAM (Infrastructure Adoption Model). This new model focuses on whether a healthcare organization’s infrastructure is stable, manageable, and extensible. I like this idea since it’s part of the practical innovation we talk about in our IT Dev Ops category at the EXPO.health Conference. What we’ve found is that many healthcare organizations are looking for infrastructure innovations, and the benefits are great.

The INFRAM model has 5 main focus areas:

  • Mobility
  • Security
  • Collaboration
  • Transport
  • Data Center

No doubt these are all areas of concern for any healthcare CIO. That said, I wonder whether having all 5 of these in the same model is really the best choice. A healthcare organization might be at level 6 for security but only at level 3 for mobility. Maybe that’s just fine for that organization. At the core of this question is whether all of the stage 7 capabilities are universally needed by all healthcare organizations.

I’m not sure the answer to this, but I think a case can be made that some organizations shouldn’t spend their limited resources to reach stage 7 of the INFRAM benchmark (or even stage 5 for some organizations). If a healthcare organization makes that a priority, it will probably force some purchases that aren’t really needed by the organization. That’s not a great model. If the above 5 focus areas had their own adoption models, then it would avoid some of these issues.

Much like the EMRAM model, the INFRAM model has eight stages (0 through 7), as follows:

STAGE 7
Adaptive And Flexible Network Control With Software Defined Networking; Home-Based Tele-Monitoring; Internet/TV On Demand

STAGE 6
Software Defined Network Automated Validation Of Experience; On-Premise Enterprise/Hybrid Cloud Application And Infrastructure Automation

STAGE 5
Video On Mobile Devices; Location-Based Messaging; Firewall With Advanced Malware Protection; Real-Time Scanning Of Hyperlinks In Email Messages

STAGE 4
Multiparty Video Capabilities; Wireless Coverage Throughout Most Premises; Active/Active High Availability; Remote Access VPN

STAGE 3
Advanced Intrusion Prevention System; Rack/Tower/Blade Server-Based Compute Architecture; End-To-End QoS; Defined Public And Private Cloud Strategy

STAGE 2
Intrusion Detection/Prevention; Informal Security Policy; Disparate Systems Centrally Managed By Multiple Network Management Systems

STAGE 1
Static Network Configurations; Fixed Switch Platform; Active/Standby Failover; LWAP-Only Single Wireless Controller; Ad-Hoc Local Storage Networking; No Data Center Automation

STAGE 0
No VPN, Intrusion Detection/Prevention, Security Policy, Data Center Or Compute Architecture

As this new model was announced, I had a chance to talk with Marlon Harvey, Industry Solutions Group Healthcare Architect at Cisco, about the INFRAM model. It was interesting to hear the genesis of the model, which started as an infrastructure maturity model at Cisco and then evolved into the INFRAM model described above. Marlon shared that about 21-24 assessments and 35 organizations were involved in developing this maturity model. So, the model is still new, but it has had some real-world testing by organizations.

I do have some concern about the deep involvement from vendor companies in this model. On the one hand, they have a ton of expertise and experience in what’s out there and what’s possible. On the other hand, they’re definitely interested in pushing out more infrastructure sales. No doubt, HIMSS Analytics is in a challenging position to balance all of this.

That said, a healthcare CIO doesn’t have to be beholden to any model. They can use the model where it applies and leave it behind where it doesn’t. Sure, I love having models like INFRAM and EMRAM to create a goal and a framework for a healthcare organization. There’s real value in having goals and associated recognition as a way to bring a healthcare IT organization together. Plus, benchmarks like these are also beneficial to a CIO trying to convince their board to spend more money on needed infrastructure. So, there’s no doubt some value in good benchmarking and recognition for high achievement. I’ll be interested to see as more CIOs dive into the details if they find that INFRAM is focused on the things they really need to move their organization forward from an infrastructure perspective.

Value Based Care: Successes, Challenges, and Changes – #HITsm Chat Topic

Posted on November 13, 2018 | Written By

John Lynn

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 11/16 at Noon ET (9 AM PT). This week’s chat will be hosted by Matt Fisher (@Matt_R_Fisher) on the topic of “Value Based Care: Successes, Challenges, and Changes”.

The transition of the healthcare industry from fee for service to value based care (or alternative payment methodologies) garners significant attention from regulators, providers, vendors and many others in the industry. To frame the discussion, value based care generally refers to payment for quality, or in other words trying to focus on outcomes. The change represents a substantial shift in the approach to paying for healthcare services in the United States.

While value based care refers to payment for quality as an overarching concept, there are a multitude of ways to structure payment arrangements around quality. Examples include capitated agreements, bundled payments, pay for quality, and others. A common theme across these structures is that payment is not based on the volume of services, which arguably drives collaboration and breaks down silos.

With a few years of value based care under the industry’s belt, how have efforts gone and where are they heading? Join the chat to weigh in with your thoughts.

Topics for this week’s #HITsm Chat:
T1: Which value based care models have been successful to date and how do you define success? #HITsm

T2: How are new and/or developing #healthIT tools helping or hindering the ability to transition to value based care? #HITsm

T3: What are misperceptions that have developed around value based care models and how are they inaccurate? #HITsm

T4: What role do Medicare and Medicaid programs have in pushing the industry to value based care and how does the recommitment of CMS impact the change? #HITsm

T5: What changes do you see on the horizon for value based care programs? #HITsm

Bonus: What type of value based care program not currently existing should be developed or implemented? #HITsm

Upcoming #HITsm Chat Schedule
11/23 – No Chat – Thanksgiving Break

11/30 – The Global Impact of Health IT
Hosted by Vanessa Carter (@_FaceSA)

12/7 – TBD
Hosted by Michelle Currie (@mshlcurrie)

12/14 – TBD
Hosted by Claire Pfarr (@clairepfarr) from @OneViewHC and the @Savvy_Coop Community

12/21 – Holiday Break

12/28 – Holiday Break

We look forward to learning from the #HITsm community! As always, let us know if you’d like to host a future #HITsm chat or if you know someone you think we should invite to host.

If you’re searching for the latest #HITsm chat, you can always find the latest #HITsm chat and schedule of chats here.

Combatting Communication Problems in Community Healthcare Clinics

Posted on November 7, 2018 | Written By

The following is a guest blog post by Tom Downes, CEO of Quail Digital.

The notion of a community healthcare clinic is constantly evolving from the traditional model of a local clinic staffed by general practitioners and nurses, serving mainly rural populations. There is now a renewed interest in these organisations and their potential to deliver a more integrated care service within the community. However, in order to successfully make this transition, there is a need to better equip these clinics with the tools to ensure they’re able to cope with the extra demand and the ever-evolving medical treatments that are being practised.

With an estimated 33 million people visiting community healthcare clinics each year, these organisations are an essential part of the healthcare system. While they are investing vital time into evolving their structure and delivering a focused range of medical services, without the right technology in place staff productivity will suffer, limiting their ability to make the most of both their current resources and any new, innovative resources they decide to invest in.

A collaborative approach

To foster a more productive, collaborative environment, communication should be implemented across the entire team. From diagnostics to preventive treatment, clinical procedures and rehabilitation, delivering a diverse set of services can create a stressful environment if the team, from receptionists to clinicians, is wasting valuable time trying, without success, to communicate. As services expand, enabling staff to speak easily with one another to get answers to questions, locate the right individual and better manage the flow of patients through the appointments process has become even more important.

Community healthcare clinics traditionally rely on telephones to communicate internally, but these can often go unanswered. Additionally, a telephone typically connects only two people at a time, restricting the ability to send messages, updates and instructions to the whole team. The likelihood of missing key information or mishearing a colleague therefore increases, creating unnecessary stress and delays.

And this dated communication tool cannot support the growing number of staff working in these clinics. With nearly 62 percent of all community healthcare clinics located in urban settings serving extremely dense populations, they require more staff to accommodate the demand. Couple this with the intense competition among the multiple clinics and medical centres serving the same geographic area, and the need for a better communication tool that helps them provide a positive experience is even more pressing.

Clear Communication

Providing clear, discreet communication to all members at reception and in the clinics will have an extremely positive impact on the running of a community healthcare clinic. Lightweight headset technology will help the team reduce unwanted hold-ups, improve workflow and offer a much improved experience for each patient who walks through the door. And with the ability to coordinate easily with one another, the team can become more productive and efficient, ensuring they’re prepared for the demands felt by this expanding healthcare system.

Critically, in this most challenging of jobs, adopting a headset system that operates on a single channel will ensure all members of staff are in permanent communication. This way, doctors, nurses or receptionists are able to approach their colleagues who are working in another part of the clinic with any urgent query or question they may have. This immediate and non-obtrusive communication method is particularly important during times of expansion and innovation, as every team member will be learning and adopting new methods and structures.

Conclusion

Community healthcare clinics are evolving, and there is a growing need for digital solutions that let staff hear everything clearly, at all times. Other daily practices can also help create a calmer environment. Along with headset technology, eliminating unnecessary, frantic noise across the clinic will drastically reduce the distractions doctors, nurses and receptionists face. Not only will this lower stress levels, it will also make it much easier for the team to communicate effectively. Daily team meetings are also vital for every member of staff. With a better understanding of everyone’s workload for the day, the team gains visibility of who is available to assist with other tasks and enquiries.

By implementing communication tools and ensuring greater visibility across the team, clinics can increase operational efficiency, reduce staff stress levels and improve wellbeing.

About Tom Downes
Tom Downes founded Quail Digital in 1995 to design headset systems for ‘team’ communication. The philosophy being that the easier and more freely a team can speak with each other in the workplace, the better their outcomes, wellbeing and productivity. Quail Digital designs and manufactures systems for the healthcare, retail and hospitality sectors, and has offices in Dallas, TX and London UK. Quail Digital is the leading provider of communications systems in the OR, and a sponsor of Healthcare Scene.

AI in Healthcare – #HITsm Chat Topic

Posted on November 6, 2018 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 11/9 at Noon ET (9 AM PT). This week’s chat will be hosted by Jon White @technursejon on the topic of “AI in Healthcare”.

The idea of Artificial Intelligence (AI) isn’t new. We’ve seen robots and intelligent computers in film and on television for decades, and read about them in science fiction novels for even longer. As the processing power of computers and computing devices has taken off, and more and more data is captured from all facets of our lives, the science fiction from our parents’ generation is becoming the reality of today.

Though we may be far from witnessing the androids popularized in film and TV, there are elements of AI that are currently in use in many industries. AI has the potential to drastically change the way we live and work.

In this #HITsm chat, Jon White (@TechNurseJon) will lead a discussion on AI in healthcare, exploring its potential and pitfalls.

Check out the questions for this week’s #HITsm chat below.

Topics for this week’s #HITsm Chat:
T1: Artificial intelligence (AI) is a broad term, covering a variety of technologies. What does “AI” mean to you? How do you define it? #HITsm

T2: What impacts can AI have on healthcare, and how soon do you expect to see it? #HITsm

T3: What impacts do you see AI having on the healthcare and health IT workforce? #HITsm

T4: How can AI be integrated with other technologies to improve the delivery and effectiveness of healthcare? Where would you like to see it integrated? #HITsm

T5: AI relies on a significant amount of data. For many applications in healthcare, much of that data is derived from patient records. How will privacy concerns affect adoption? #HITsm

Bonus: What barriers are there to full-scale AI adoption in the healthcare environment? #HITsm

Upcoming #HITsm Chat Schedule
11/16 – Value Based Care: Successes, Challenges, and Changes
Hosted by Matt Fisher (@Matt_R_Fisher)

11/23 – No Chat – Thanksgiving Break

11/30 – The Global Impact of Health IT
Hosted by Vanessa Carter (@_FaceSA)

12/7 – TBD
Hosted by Michelle Currie (@mshlcurrie)

12/14 – TBD
Hosted by Claire Pfarr (@clairepfarr) from @OneViewHC and the @Savvy_Coop Community

12/21 – Holiday Break

12/28 – Holiday Break

We look forward to learning from the #HITsm community! As always, let us know if you’d like to host a future #HITsm chat or if you know someone you think we should invite to host.

If you’re searching for the latest #HITsm chat, you can always find the latest #HITsm chat and schedule of chats here.

Decommissioning Legacy EHRs

Posted on November 5, 2018 I Written By

The following is a guest blog post by Sudhakar Mohanraj, Founder and CEO, Triyam.

Every product has a lifecycle. The lifecycle of Electronic Health Record (EHR) software begins when it is implemented at your facility and ends when it’s no longer in use. When a facility decides to move to a new EHR, it’s natural to focus planning around the new software system.  However, not considering the legacy EHR can leave you wondering what should happen to all of the historical patient financial and medical data. You have many choices. This article will discuss some of the challenges and options that will influence your cost, legal compliance, and stakeholder satisfaction.

Three common mistakes to avoid when moving to a new EHR

  1. Hanging on to the legacy EHR

Some say: “we will worry about shutting down the old system later after the new EHR is up and going.” Taking that path is risky and expensive.

Consider the cost. Until you get all your historical data off the legacy system, you must keep paying the vendor’s licensing and support fees. You may use the old system only infrequently, which makes these fees particularly hard to justify. In addition, you continue to pay your employees to operate and maintain the old system.

To learn more about retiring legacy EHRs, register for this free live webinar. Industry experts will share key lessons and best practices on data management strategies for EHR system replacements. You can also get answers to questions about your specific requirements.

Some say, “I will stop paying my old vendor. I don’t need any more updates or support.” However, sooner or later, hardware and software will break or become incompatible with newer technology. Older systems are also an easy target for hackers and thieves.

Over time, your employees will forget passwords and how to navigate the old system, or they will leave for other jobs. Then, when you, a patient, or your boss needs a report from the old system, you are caught short. Data retained on an old, unsupported, infrequently used system faces a growing risk of being lost, stolen, corrupted, or left inaccessible to newer technology.

Bottom line: keeping an old, infrequently used system will needlessly eat up your time and money.

  2. Migrating all historical data from the legacy system to the new EHR

Some facilities are surprised to learn that the new EHR vendor will not convert all the historical data to the new computer system.

The new system is organized differently than the legacy system with different data elements and structures. There is never a one-to-one match on data mapping between the old and new systems.

It is difficult to validate the accuracy and completeness of data you want to import from the old system. The new EHR vendor doesn’t want to risk starting with an inaccurate database.
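Because there is never a one-to-one match between the two systems, a migration typically routes each legacy field through an explicit mapping, and anything unmapped goes to the archive rather than being dropped. A minimal sketch of that idea in Python, with purely illustrative field names (not from any real vendor schema):

```python
# Hypothetical mapping from legacy EHR export fields to the new EHR's
# intake schema. Real mappings are built with both vendors' data dictionaries.
FIELD_MAP = {
    "pt_lname": "patient.last_name",
    "pt_fname": "patient.first_name",
    "dob": "patient.birth_date",
    "bal_due": "account.current_balance",
}

def split_record(legacy_record: dict) -> tuple[dict, dict]:
    """Route each legacy field to the new EHR if mapped, else to the archive."""
    migrated, archived = {}, {}
    for field, value in legacy_record.items():
        if field in FIELD_MAP:
            migrated[FIELD_MAP[field]] = value
        else:
            # No one-to-one match in the new system: retain it, don't drop it.
            archived[field] = value
    return migrated, archived

migrated, archived = split_record(
    {"pt_lname": "Smith", "dob": "1980-04-02", "legacy_flag_07": "Y"}
)
```

The point of the sketch is the split itself: the new EHR starts with a clean, validated subset, while everything else stays retrievable in the archive.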

This is a golden opportunity to start with a clean slate. For example, you can take this time to reorganize, re-categorize, and re-word codes and tables. Now is the time to set up master files properly and make the system more efficient.

The new EHR vendor will lobby for you to start with a clean slate and populate the new database with only current patients, current balances, and current information.

  3. Ignoring Legal Compliance Requirements

Federal and state laws require healthcare facilities to retain medical and financial records for 5 to 15 years and to make them available to patients and others upon request. Keeping these records helps you avoid penalties, fines, and loss of certifications. Consult your compliance officer, accountant, and HIPAA director for federal, IRS, and state-specific requirements.
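In practice, a retention policy boils down to simple date arithmetic per record type. A rough sketch, with illustrative retention periods only (actual periods vary by state and record type, so confirm them with your compliance team):

```python
from datetime import date

# Illustrative retention periods in years -- NOT legal guidance.
RETENTION_YEARS = {"adult_medical": 10, "minor_medical": 15, "financial": 7}

def earliest_destroy_date(last_encounter: date, record_type: str) -> date:
    """Earliest date a record may be purged under the assumed policy."""
    years = RETENTION_YEARS[record_type]
    # Note: replace() would raise for a Feb 29 encounter date in a
    # non-leap target year; a production policy engine must handle that.
    return last_encounter.replace(year=last_encounter.year + years)

earliest_destroy_date(date(2018, 11, 5), "adult_medical")  # date(2028, 11, 5)
```

An archive that tags each record with its computed destroy date makes it straightforward to demonstrate compliance on request.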

Use this data retention tool to find the retention requirements for your state.

Why data archival is an excellent choice

What are the best practices to deal with historical data? Data from the old system needs to be organized in a safe, secure place so that the information can be found and made readily available to those who need it in a timely fashion. In other words, it needs to be archived.

An archive is a separate system from your new EHR. It contains all your historical data and reports. When users sign into the archive program, depending on their user rights, they may see all or some of the historical reports. The most common functions of the archive system include:

  • Search and query clinical and financial data for “Continuity of Care.”
  • Download, view, print, and share reports for “Release of Information.”
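The “depending on their user rights” behavior above is essentially a role-scoped lookup over categorized records. A minimal sketch, with hypothetical roles, categories, and document names:

```python
# Toy archive: each record is tagged with a patient ID and a category.
ARCHIVE = [
    {"patient_id": "P001", "category": "clinical", "doc": "2015_chart.pdf"},
    {"patient_id": "P001", "category": "financial", "doc": "2015_claims.pdf"},
]

# Hypothetical role-to-category rights; a real archive would pull these
# from its access-control configuration.
USER_RIGHTS = {
    "nurse": {"clinical"},
    "billing": {"financial"},
    "him": {"clinical", "financial"},
}

def query_archive(role: str, patient_id: str) -> list[str]:
    """Return only the historical documents this role is allowed to see."""
    allowed = USER_RIGHTS.get(role, set())
    return [r["doc"] for r in ARCHIVE
            if r["patient_id"] == patient_id and r["category"] in allowed]

query_archive("nurse", "P001")  # ['2015_chart.pdf']
```

A nurse pulling a chart for continuity of care and a billing clerk fulfilling a release-of-information request thus hit the same archive but see different slices of it.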

Archival is a relatively new concept; KLAS Research is creating a new product category for it. Listen to this on-demand webinar from the head of EHR archive studies at KLAS Research.

In the archive, you can see all patients and their previous charts, medications, treatments, billings, insurance claims, payments, and more.  You will also see the historical vendor, employee, and accounting records.

What type of data goes to the archive? All sorts. You can retain discrete data or non-discrete data, structured data (like SQL, XML, CCDA), or unstructured data that is logically grouped and presented in a human-readable form like pdf reports, Excel spreadsheets, CCD, jpeg, or mp3 files.

Mergers and data consolidation

Archival is essential even when there isn’t a transition to a new EHR. During a merger, the new entity frequently wants to consolidate patient financial and clinical data from multiple legacy systems onto a common platform. Data archiving may be the best solution for dealing with multiple EMR/EHRs: it is less expensive than complex conversion and transformation efforts, and it lets users analyze consolidated data with business intelligence and analytics tools running on one unified database.

Outsourcing and vendor selection

Outsourcing has become an increasingly popular option for archival solutions for three reasons: cost, experience, and convenience. IT managers are already stretched to the limits of their time, resources, and budget. Outside vendors can save the day by offering these services at lower cost.

When searching for an archival vendor, consider the following:

  • Experience extracting data from legacy systems that are no longer supported
  • Complete turnkey solutions – planning, pilot testing, data conversion, user acceptance, and decommissioning
  • Archival product features and ease of use
  • Great customer references
  • Cost: archiving should be only a fraction of the cost of retaining the legacy system

The number one failure when implementing a new EHR is putting off the archival of legacy data. Hopefully, you can use a few of these ideas to maximize the benefits of your historical data, minimize costs, and best serve your stakeholders.

About Triyam
Triyam delivers expert solutions in EMR / EHR Data Conversion and Archival.

Triyam’s data conversion services help hospitals and clinics migrate freely from one EHR vendor to another without losing any historical patient data. Triyam’s EHR archival product, Fovea, is a vendor-neutral, innovative, and intuitive platform for storing all your legacy data. Fovea includes a powerful search engine and extensive reporting for business intelligence and analytics. Triyam is a proud sponsor of Healthcare Scene.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 I Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which collects data on more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important role in the initial wave of AI application rollouts. The company is a leader in high-performance chipsets popular for processor-intensive, high-end gaming, technology it has recently applied to other compute-heavy workloads such as blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough spare IT talent to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.