
Meaningful Use Audit Advice

Posted on January 30, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

In response to my post on Meaningful Use Audits and the Inconsistent Appeals Process, Todd Searls, Executive Director at Wide River LLC, offered this interesting meaningful use audit advice on LinkedIn:

We’ve assisted numerous clinics and hospitals through their audits, and you’re absolutely correct, John. For those clinics that have the people and processes already in place, this ends up (most of the time) being a non-issue, just time consuming. However, we have clients that have undergone significant changes since 2011, and now that they are being audited, the changes are coming back to haunt them since tracking MU documentation through the changes may not have been the highest priority.

Even those clinics that have the right documentation are now finding that they shouldn’t just mail the documents in bulk to the auditors unless they’ve spent time creating a good summary document which clearly defines each and every appendix document being sent. Case in point: we had one clinic call us to help them with their appeal for a failed audit. When we engaged, we spent a few hours trying to determine why they failed the audit, since the documents they had on file to support their attestation were excellent. Then we reviewed how they sent them in: one mass mailing with no cover letter or explanation beyond a title for each document (i.e., “In Reference to Measure 2”).

Once we created a clear cover letter and resubmitted, they were notified very quickly that their appeal was successful. The clinic had mixed feelings – great that they passed, but unhappy about having to ‘mind-read’ the preferred format that the auditor was looking for. Right or wrong, many clinics are in the same place – frustrated with the process.

I don’t know anyone who enjoys an audit. However, an audit can at least be bearable if it’s clear what’s expected in the audit. I think we’re going to have a lot more stories about meaningful use audits coming down the pipe. Hopefully Todd’s advice helps some who run into a meaningful use audit.

CMS Listens to Those Calling for a 90 Day Meaningful Use Reporting Period

Posted on January 29, 2015 | Written By John Lynn

I think that most of us in the industry figured this was just a matter of time, but it’s nice that we were right and CMS is working to modify the requirements and reporting periods for meaningful use. I imagine they heard all the many voices that were calling for a change to meaningful use stage 2 and it’s just taken them this long to work through the government process to make it a reality.

Before I act like this change is already in place, CMS was very specific in the wording of their announcement about their “intent to modify requirements for meaningful use” and their “intent to engage in rulemaking” in order to make these “intended” changes. Basically, they’re saying that they can’t just change the rules; they have to go through the rulemaking process for these changes to go into effect. That said, I don’t think anyone doubts that this will make it through the rulemaking process.

Here are the modifications they’re proposing:

  1. Shortening the 2015 reporting period to 90 days to address provider concerns about their ability to fully deploy 2014 Edition software
  2. Realigning hospital reporting periods to the calendar year to allow eligible hospitals more time to incorporate 2014 Edition software into their workflows and to better align with other quality programs
  3. Modifying other aspects of the programs to match long-term goals, reduce complexity, and lessen providers’ reporting burden

They also added this interesting clarification and information about the meaningful use stage 3 proposed rule:

To clarify, we are working on multiple tracks right now to realign the program to reflect the progress toward program goals and be responsive to stakeholder input. Today’s announcement that we intend to pursue the changes to meaningful use beginning in 2015 through rulemaking, is separate from the forthcoming Stage 3 proposed rule that is expected to be released by early March. CMS intends to limit the scope of the Stage 3 proposed rule to the requirements and criteria for meaningful use in 2017 and subsequent years.

I think everyone will welcome a dramatic simplification of the meaningful use program. The three changes above will be welcomed by everyone I know.

In the email announcement for this, they provided an explanation for why they’re making these changes:

These proposed changes reflect the Department of Health and Human Services’ commitment to creating a health information technology infrastructure that:

  • Elevates patient-centered care
  • Improves health outcomes
  • Supports the providers who care for patients

Personally, I think they saw the writing on the wall and it wasn’t pretty. Many organizations were going to opt out of meaningful use stage 2. These changes were necessary for many organizations to continue participating in meaningful use. They believe meaningful use will elevate patient-centered care, improve health outcomes, and support the providers who care for patients. I’m glad they finally chose to start the rulemaking process to make the changes. I think many that started meaningful use can still benefit from the rest of the incentive money and will be even happier to avoid the penalties.

What Is the Future for Rural Physicians? Is There One?

Posted on January 28, 2015 | Written By

Value-based payments. Value-based care. Meaningful use. Is there a place for an independent doctor in a suburban location? This article says that these, along with all the technology to go with them and physician acceptance, are “Inevitable”.

I have four physicians. I don’t see a place for them long term. My first is my internist. A few years ago he was given a cell phone as a gift. It does all he will ever want. If it rings, he answers it. If he has to make a call, he dials the number. He has no computers in his office. All his files are paper. He is recognized as one of the best doctors in the state. EHR is not in his future. Phone, fax, and copier suit him just fine. The article that raised these questions for me was a report from Deloitte. You might end up with some of the same questions after reading it.

My second physician has been using an EHR for as long as I have known him. He has 2 offices and four other doctors working for him. He needs the technology. He hates it, upgrades only when he has to, and would never do it again. He is also recognized as one of the best in the state. His daughter is now in her residency and will join him next year. My gut feeling is that in 3-4 years he turns the business over to her, lets her worry about it, and sails off into the sunset.

My radiation oncologist was great. He treated me 8 years ago. My last visit with him was 4 years ago. The company he worked for terminated him for not generating enough revenue. His waiting room was always filled, but with little to no wait. His staff was great and could have easily made more money by moving to a large city. They, like him, enjoyed the suburban life. All were dumbfounded when he was terminated. They also learned that for this big-city practice, profit was the only incentive. He’s in Florida now, out in the sticks, and owns his own practice.

Doctor #4 is a general surgeon. He is probably the only one that could/would survive in the “inevitable market”. His office is in the medical arts building at the local hospital. There are 3 other surgeons in his practice. He has a fairly up-to-date computer system, though not in his location and not compatible with the hospital’s new system. I know that his definition of value-based anything and mine differ. On my last visit he kept me waiting for 45 minutes because lunch went longer than scheduled. He’s all business.

For 3 of these 4, I see the choice as conforming and/or selling out. They are all rated in the top 25 physicians in the state. They are not going to increase their patient base to increase revenue.

I am sure that Doctor #4 will succeed. He is all and only business.  He holds the purse strings for his practice and has absolutely no problem in spending whatever it takes for technology to increase profit.  As long as he doesn’t have to use it.

The area that I live in is not unique. The hospital’s area of reach is a bit under 60,000. Since part of that is a resort area, add another 10,000 for the summer months. Is there a future for physicians like this? If so, what will they need to do to stay viable? Hire a business manager? More nurse practitioners? Sell, retire, or join together and form their own physician groups? Any thoughts?

#HITMC Chat and Health IT Marketing and PR Conference Early Bird Registration Ends

Posted on January 27, 2015 | Written By John Lynn

Today we held the first ever #HITMC (Healthcare IT Marketing and PR Community) Twitter chat. The turnout for the chat was amazing and it was so active I don’t think anyone could keep up. That’s pretty amazing for a first time chat. In case you missed it and are interested in health IT marketing and PR, here’s my tweet that links to the transcript:

I’m particularly interested to look back at the answer to question 3 on the chat which talks about the tools that people use to make their lives easier.

Here’s a look at the stats for the first HITMC chat:

All of this tells me that I should have started this Twitter chat sooner. It’s amazing how a Twitter chat can really bring a community together. Plus, it always leads to interesting new connections that wouldn’t have happened otherwise. Tomorrow I’ll be participating in another new Twitter chat that’s focused on Health Information Governance. If that topic interests you, be sure to join us on #InfoTalk at noon ET on January 28th.

We’re also 5 days away from the end of Early Bird Registration for the Health IT Marketing and PR Conference. Register now and save $500 off the registration price. Plus, as a reader of EMR and HIPAA, use the promo code “emrandhipaa” and you’ll save an extra $100. We’ve just started uploading the speaker profiles for those who will be speaking at the event. It’s going to be a fantastic 2+ days of the best in healthcare IT marketing and PR. I can’t wait!

For those not interested in the above topics, tomorrow we’ll be back with our regularly scheduled programming.

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, there are an estimated 8 million deaths from sepsis, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It works by leveraging EHR data and automated surveillance, clinical content and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate specific needs at each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, to improve sensitivity and specificity and reduce alert fatigue by assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities that may mimic sepsis. This helps to ensure alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly specific and highly sensitive in order to minimize alert fatigue. In the case of this specific system, a 95% specificity and sensitivity rating has been achieved by constructing hundreds of variations of sepsis rules. For example, completely different rules are run for patients with liver disease versus those with end-stage renal disease. Doing so ensures clinicians only get alerts that are helpful.
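To make the idea of comorbidity-specific rule variations concrete, here is a minimal sketch in Python. It is illustrative only: POC Advisor’s actual rules and thresholds are proprietary, so every criterion, cutoff value, and name below is a hypothetical assumption, not the vendor’s logic.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: float     # beats per minute
    temperature_c: float  # degrees Celsius
    lactate: float        # mmol/L
    creatinine: float     # mg/dL

def sepsis_alert(vitals: Vitals, comorbidities: set) -> bool:
    """Return True if a sepsis alert should fire (hypothetical criteria)."""
    # Simplified SIRS-style screen: tachycardia plus abnormal temperature.
    suspicious = vitals.heart_rate > 90 and (
        vitals.temperature_c > 38.0 or vitals.temperature_c < 36.0
    )
    if not suspicious:
        return False

    # Comorbidity-specific variations: an elevated creatinine is expected
    # in end-stage renal disease, so it should not count toward the alert,
    # while liver disease may warrant a different lactate cutoff.
    if "end_stage_renal_disease" in comorbidities:
        return vitals.lactate > 2.0   # ignore creatinine entirely
    if "liver_disease" in comorbidities:
        return vitals.lactate > 4.0   # higher lactate threshold
    return vitals.lactate > 2.0 or vitals.creatinine > 1.5
```

The point of branching on comorbidities is exactly what the paragraph above describes: the same raw numbers mean different things in different patients, so running one flat rule set over everyone either misses sepsis or buries clinicians in false alerts.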

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor within two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform, including lab results, pharmacy orders, Admit Discharge Transfer (ADT) and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to create the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alerts based on their clinical training. The system prompted the nursing staff to respond to each one, either through acknowledgement or override. If acknowledged, suggested guidance regarding the appropriate next steps was provided, such as alerting the physician or ordering diagnostic lactate tests, based on the facility’s specific protocols. If alerts were overridden, a reason had to be entered, and all overrides were logged, monitored and reported. If action was not taken, repeat alerts were fired, typically within 10 minutes. If repeat alerts were not acted upon, they were escalated to supervising personnel.
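The acknowledge/override/escalate loop described above can be sketched in a few lines. Again, this is a hedged illustration, not the product’s code: the function names, callback structure, and the handling of the roughly 10-minute repeat interval are all assumptions.

```python
import time
from dataclasses import dataclass
from typing import Callable, Optional

REPEAT_INTERVAL_SECONDS = 10 * 60  # repeat alert after ~10 minutes of inaction

@dataclass
class NurseResponse:
    acknowledged: bool         # True = acknowledged, False = overridden
    override_reason: str = ""  # a reason is required when overriding

def handle_alert(
    alert_id: str,
    get_response: Callable[[str], Optional[NurseResponse]],
    escalate: Callable[[str], None],
    log: Callable[[str, str], None],
) -> None:
    """Drive one alert through the acknowledge/override/escalate workflow."""
    response = get_response(alert_id)  # None means no action was taken
    if response is None:
        time.sleep(REPEAT_INTERVAL_SECONDS)  # wait, then re-fire the alert
        log(alert_id, "repeat alert fired")
        response = get_response(alert_id)
        if response is None:
            escalate(alert_id)  # notify supervising personnel
            log(alert_id, "escalated to supervisor")
            return
    if response.acknowledged:
        log(alert_id, "acknowledged; protocol guidance displayed")
    else:
        # Overrides must carry a documented reason, which is logged,
        # monitored, and reported, as described above.
        log(alert_id, "overridden: " + response.override_reason)
```

Forcing every alert to end in an acknowledgement, a reasoned override, or an escalation is what makes the workflow auditable; nothing can silently disappear.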

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed upon protocols.

Within three months, John Muir experienced significant improvements in key sepsis compliance metrics: 80% compliance with patient screening protocols, lactate tests ordered for 90% of patients who met screening criteria, and early, goal-directed therapy initiated for 75% of patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

The Next Generation Tech Kids

Posted on January 23, 2015 | Written By John Lynn

Today I had the amazing opportunity to volunteer at my kids’ school. They make it a big deal for dads to volunteer at the school, and my kids absolutely adore having their dad at school with them. We have a tradition that I go and spend the day at school with my kids on their birthdays. It’s pretty awesome and I might have even shed a tear or two. (Side Note: Check out my new Daddy Blog for cute pics of my kids)

However, that’s not the point of this post. It turns out today was testing day for a bunch of my kids (I have 3 in elementary school). What was amazing is that all of the tests were administered on a computer. Yes, even my 5-year-old kindergartner was taking his test on the computer. In fact, the teacher told me, “It’s kind of hard because they don’t even really know how to type.”

Whether this is a good idea or not is a topic for an education blog. However, I’ve written before about the next generation of digital natives and the impact they’ll have on healthcare and EHR. If we look a little further out, my 5-year-old won’t even be able to comprehend the idea of a paper chart. It will be so ridiculous to him.

I’m still processing what this will mean to healthcare IT and to society in general. As I think back on the thousands of blog posts I’ve written about adopting EHR, I can think of many that will sound ridiculous even 5-10 years from now. That has me very excited. Not that my content is no longer useful (unless you enjoy Health IT history). I’m excited that a whole sea change is going to happen in how we want technology applied to healthcare.

No doubt, it’s not without some risk. I’ve heard many argue that the next generation doesn’t care about privacy. Personally, I’ve seen quite the opposite. The next generation has a very sophisticated approach to privacy. They know when and where to share something based on who and what they want to see it. It’s the older generation that has a problem knowing exactly where something should be shared and where it shouldn’t. That’s not to say that some young kids don’t make mistakes. They do, but most are quite aware of where something is being shared. It’s why so many kids use Snapchat.

What do you think of the coming generations of technology savvy people? What benefits will they bring? What challenges will we face? Are you excited, scared, nervous?

Beware: Don’t Buy In to Myths about Data Security and HIPAA Compliance

Posted on January 22, 2015 | Written By

The following is a guest blog post by Mark Fulford, Partner in LBMC’s Security & Risk Services practice group.
Myths abound when it comes to data security and compliance. This is not surprising—HIPAA covers a lot of ground and many organizations are left to decide on their own how to best implement a compliant data security solution. A critical first step in putting a compliant data security solution in place is separating fact from fiction. Here are four common misconceptions you’ll want to be aware of:

Myth #1: If we’ve never had a data security incident before, we must be doing OK on compliance with the HIPAA Security Rule.

It’s easy to fall into this trap. Not having had an incident is a good start, but HIPAA requires you to take a more proactive stance. Too often, no one is dedicated to monitoring electronic protected health information (ePHI) as prescribed by HIPAA. Data must be monitored—that is, someone must be actively reviewing data records and security logs to be on the lookout for suspicious activity.

Your current IT framework most likely includes a firewall and antivirus/antimalware software, and all systems have event logs. These tools collect data that too often go unchecked. Simply assigning someone to review the data you already have will greatly improve your compliance with HIPAA monitoring requirements, and more importantly, you may discover events and incidents that require your attention.
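As a concrete (and deliberately simplified) illustration of what reviewing the data you already have can mean, here is a minimal sketch that scans an authentication log for repeated failed logins. The log file name, line format, and threshold are hypothetical assumptions; real ePHI monitoring would cover far more event types and systems.

```python
from collections import Counter

FAILED_LOGIN_THRESHOLD = 5  # hypothetical cutoff for flagging a user

def flag_suspicious_logins(log_lines):
    """Count failed logins per user and flag users over the threshold."""
    failures = Counter()
    for line in log_lines:
        # Assumed line format: "2015-01-22 09:14:02 LOGIN_FAILED user=jdoe"
        if "LOGIN_FAILED" in line:
            user = line.split("user=")[-1].strip()
            failures[user] += 1
    return [u for u, n in failures.items() if n >= FAILED_LOGIN_THRESHOLD]

with open("auth.log") as f:  # hypothetical log file
    for user in flag_suspicious_logins(f):
        print(f"Review account activity for: {user}")
```

Even a simple recurring review like this turns logs that too often go unchecked into an active monitoring control.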

Going beyond your technology infrastructure, your facility security, hardcopy processing, workstation locations, portable media, mobile device usage and business associate agreements all need to be assessed to make sure they are compliant with HIPAA privacy and security regulations. And don’t forget about your employees. HIPAA dictates that your staff be trained (with regularly scheduled reminders) on how to handle PHI appropriately.

Myth #2: Implementing a HIPAA security compliance solution will involve a big technology spend.

This is not necessarily the case. An organization’s investment in data security solutions can vary widely, depending on its size, budget and the nature of its transactions. The Office for Civil Rights (OCR) takes these variables into account—certainly, a private practice will have fewer resources to divert to security compliance than a major corporation. As long as you’ve justified each decision you’ve made about your own approach to compliance with each of the standards, the OCR will take your position into account if you are audited.

Most likely, you already have a number of the technical security tools necessary to meet compliance. The added expense will more likely be associated with administering your data security compliance strategy.

Myth #3: We’ve read the HIPAA guidelines and we’ve put a compliance strategy in place. We must be OK on compliance.

Perhaps your organization is following the letter of the law. Policies and procedures are in place, and your staff is well-trained on how to handle patient data appropriately. By all appearances, you are making a good faith effort to be compliant.

But a large part of HIPAA compliance addresses how the confidentiality, integrity, and availability of ePHI is monitored in the IT department. If no one on the team has been assigned to monitor transactions and flag anomalies, all of your hard work at the front of the office could be for naught.

While a ‘check the box’ approach to HIPAA compliance might help if you get audited, unless it includes the ongoing monitoring of your system, your patient data may actually be exposed.

Myth #4: The OCR won’t waste their time auditing the ‘little guys.’ After all, doesn’t the agency have bigger fish to fry?

This is simply not true. Healthcare organizations of all sizes are eligible for an audit. Consider this cautionary tale: as a result of a reported incident, a dermatologist in Massachusetts was slapped with a $150,000 fine when an employee’s thumb drive was stolen from a car.

Fines for non-compliance can be steep, regardless of an organization’s size. If you haven’t done so already, now might be a good time to conduct a risk assessment and make appropriate adjustments. The OCR won’t grant you concessions just because you’re small, but they will take into consideration a good faith effort to comply.

Data Security and HIPAA Compliance: Make No Assumptions

As a provider, you are probably aware that the audits are starting soon, but perhaps you aren’t quite sure what that means for you. Arm yourself with facts. Consult with outside sources if necessary, but be aware that the OCR is setting the bar higher for healthcare organizations of all sizes. You might want to set the bar higher for your own organization, too. Your business—and your patients—are counting on it.

About Mark Fulford
Mark Fulford is a Partner in LBMC’s Security & Risk Services practice group. He has over 20 years of experience in information systems management, IT auditing, and security. Mark focuses on risk assessments and information systems auditing engagements, including SOC reporting, in the healthcare sector. He is a Certified Information Systems Auditor (CISA) and Certified Information Systems Security Professional (CISSP). LBMC is a top 50 Accounting & Consulting firm based in Brentwood, Tennessee.

Digital Health at CES Wrap Up Video

Posted on January 21, 2015 | Written By John Lynn

CES 2015 is now behind us. One person I talked to said they thought that the event was missing some of the excitement of previous years. I disagreed with him. I thought it was more exciting than previous years, although my excitement comes from the entrepreneurs and the Digital Health space. If you look at the larger CES floor with the massive million-dollar booths, it was lacking some luster. Of course, with the size of CES, it’s easy to understand why two people could have very different experiences.

If you’re interested in what else I found at CES, I sat down with Dr. Nick van Terheyden, CMIO at Nuance, to talk about our experiences at CES 2015 and some of the takeaways from what we saw. I think you’ll enjoy this CES 2015 video chat below:

Defining the Legal Health Record, Ensuring Quality Health Data, and Managing a Part-Paper Part-Electronic Record – Healthcare Information Governance

Posted on January 20, 2015 | Written By John Lynn

This post is part of Iron Mountain’s Healthcare Information Governance: Big Picture Predictions and Perspectives Series which looks at the key trends impacting Healthcare Information Governance. Be sure to check out all the entries in this series.

Healthcare information governance (IG) has been important ever since doctors started tracking their patients in paper charts. However, over the past few years, adoption of EHR and other healthcare IT systems has exploded and provided a myriad of new opportunities and challenges associated with governance of a healthcare organization’s information.

Three of the most important health information governance challenges are:
1. Defining the legal health record
2. Ensuring quality health data
3. Managing a part-paper, part-electronic record

Defining the Legal Health Record
In the paper chart world, defining the legal health record was much easier. As we’ve shifted to an electronic world, the volume of data that’s stored in these electronic systems is so much greater. This has created a major need to define what your organization considers the legal health record.

The reality is that each organization now has to define its own legal health record based on CMS and accreditation guidelines, but also based on the specifics of its operation (state laws, EHR options, number of health IT systems, etc.). The legal health record will only be a subset of the data that’s being stored by an EHR or other IT system, and you’ll need to involve a wide group of people from your organization to define the legal health record.

Doing so is going to become increasingly important. Without a clearly defined legal health record, you’re going to produce an inconsistent release of information. This can lead to major liability issues in court cases where you produce inconsistent records, but it’s also important to be consistent when releasing health information to other doctors or even auditors.

One challenge we face in this regard is ensuring that EHR vendors provide a consistent and usable data output. A lot of thought has been put into how data is inputted into the EHR, but not nearly as much effort has been put into the way an EHR outputs that data. This is a major health information governance challenge that needs to be addressed. Similarly, most EHR vendors haven’t put much thought and effort into data retention either. Retention policies are an important part of defining your legal health record, but your policy is subject to the capabilities of the EHR.

Working with your EHR and other healthcare IT vendors to ensure they can produce a consistent legal health record is one strategic imperative that every healthcare organization should have on their list.

Ensuring Quality Health Data
The future of healthcare is very much going to be data driven. Payments to ACOs are going to depend on data. The quality of care you provide using Clinical Decision Support (CDS) systems is going to rely on the quality of the data being used. Organizations are going to have new liability concerns that revolve around their data quality. Real-time data interoperability is going to become a reality, and everyone’s going to see everyone else’s data without a middleman first checking and verifying the quality of the data before it’s sent.

A great health information governance program led by a clinical documentation improvement (CDI) program is going to be a key first step for every organization. Quality data doesn’t happen overnight, but requires a concerted effort over time. Organizations need to start now if they want to be successful in the coming data-driven healthcare world.

Managing a Part-Paper Part-Electronic Record
The health information world is becoming infinitely more complex. Not only do you have new electronic systems that store massive amounts of data, but you’re still required to maintain legacy systems and those old paper charts. Each of these requires time and attention to manage properly.

While we’d all love to just turn off legacy systems and dispose of old paper charts, data retention laws often mean that both of these will be part of every healthcare organization for many years to come. Unfortunately, most health IT project plans don’t account for ongoing management of these old but important data sources. This inattention often results in increased costs and risks associated with these legacy systems and paper charts.

It should be strategically important for every organization to have a sound governance plan for both legacy IT systems and paper charts. Ignorance is not bliss when one of these information sources is breached because your organization had “forgotten” about them.

The future of reimbursement, costs, quality of care, and liability in healthcare are all going to be linked to an organization’s data. Making sure your data governance house is in order is going to be a major component in the success or failure of your organization. A good place to start is defining the legal health record, ensuring quality health data, and managing a part-paper part-electronic record.

Join our Twitter Chat: “Healthcare IG Predictions & Perspectives”

On January 28th at 12:00 pm Eastern, @IronMtnHealth is hosting a Twitter chat using #InfoTalk to further the dialog. If you have been involved in governance-related projects, we’d love to have you join. What IG initiatives have shown success for you? How have you overcome any obstacles? What do you see as the future of IG? Keep the conversation going during our “Healthcare IG Predictions & Perspectives” #InfoTalk at 12pm Eastern on January 28th.

The Value of an Integrated Specialty EHR Approach

Posted on January 19, 2015 | Written By John Lynn

As many of you know, I’ve long been an advocate for the specialty-specific EHR. There are just tremendous advantages in having an EHR that’s focused only on your specialty. Then you don’t get things like child growth charts cluttering your EHR when you don’t see any children. Or, taken the other way, you have child growth charts that are designed specifically for a pediatrician. The same principle applies across pretty much every industry.

The reason that many organizations don’t go with a specialty-specific EHR is usually that they’re a large multi-specialty organization. These organizations don’t want to have 30 different EHR vendors that they have to support. Therefore, in their RFPs they basically exclude specialty-specific EHR vendors from their EHR selection process.

I understand from an IT support perspective and an EHR implementation perspective how having 30 different EHR implementations would be a major challenge. However, it’s also a challenge to get one EHR vendor to work for 30+ specialties. Plus, the long-term consequence is physician and other EHR user dissatisfaction from using an EHR that wasn’t designed for their specialty. The real decision these organizations are making is whether they want to put the burden on the IT staff (i.e., supporting multiple EHRs) or on the doctors (i.e., using an EHR that doesn’t meet their needs). In large organizations, it seems that they’re making the decision to put the burden on the doctors as opposed to the IT staff, although I don’t think many organizations realize that this is the choice they’re making.

Specialty EHR vendor gMed recently put out a whitepaper with an analysis and a kind of case study on the differences between an integrated GI practice and a non-integrated GI practice. In this case, they’re talking about an EHR that’s integrated with an ambulatory surgery center and one that’s not. That’s a big deal for a specialty like GI. You can download the free whitepaper to get all the juicy details and differences between an integrated GI practice and one that’s not.

I’ve been seeing more and more doctors starting to talk about their displeasure with their EHR. I think much of that displeasure comes thanks to meaningful use and reimbursement requirements, but I also think that many are suffering under an EHR that really doesn’t understand their specialty. In my experience, with EHR vendors that claim to support every specialty, that support usually consists of one support rep for the specialty and a few months’ programming sprint to try to provide something special for it. That’s very different from a whole team of developers and every customer support person at the company devoted to a specialty.

I’m not saying that an EHR can’t do more than one specialty, but doing 5 somewhat related specialties is still very different from trying to do the 40+ medical specialties with one interface. One challenge with the best-of-breed approach is that some specialties don’t have an EHR that’s focused just on them. In that case, you may have to use the every-specialty EHR.

What’s clear to me is that most large multi-specialty organizations are choosing the all-in-one EHR systems for their offices. I wonder if force-feeding an EHR into a specialty where it doesn’t fit is going to eventually lead to a physician revolt back to specialty-specific EHRs. Physician dissatisfaction, liability issues, and improved interoperability could make the best-of-breed approach much more attractive to even the large organizations, even if it means they back into a best-of-breed approach after trying the one-size-fits-all approach to EHR.

I’ll be interested to watch this dynamic play out. Plus, you have the specialty doctors coming together in mega-groups in order to combat this as well. What do you think is going to happen with specialty EHRs? Should organizations take a best-of-breed approach or go with the one-size-fits-all EHR? What are the consequences (good and bad) of either direction?

Full Disclosure: gMed is an advertiser on this site.