
Meaningful Use Audit Advice

Posted on January 30, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

In response to my post on Meaningful Use Audits and the Inconsistent Appeals Process, Todd Searls, Executive Director at Wide River LLC, offered this interesting meaningful use audit advice on LinkedIn:

We’ve assisted numerous clinics and hospitals through their audits, and you’re absolutely correct, John. For those clinics that have the people and processes already in place, this ends up (most of the time) being a non-issue, just time consuming. However, we have clients that have undergone significant changes since 2011, and now that they are being audited, the changes are coming back to haunt them since tracking MU documentation through the changes may not have been the highest priority.

Even those clinics that have the right documentation are now finding that they shouldn’t just mail the documents in bulk to the auditors unless they’ve spent time creating a good summary document which clearly defines each and every appendix document being sent. Case in point: we had one clinic call us to help them with their appeal for a failed audit. When we engaged, we spent a few hours trying to determine why they failed the audit, since the documents they had on file to support their attestation were excellent. Then we reviewed how they sent them in: one mass mailing with no cover letter or explanation beyond a title for each document (i.e., “In Reference to Measure 2”).

Once we created a clear cover letter and resubmitted, they were notified very quickly that their appeal was successful. The clinic had mixed feelings – great that they passed, but unhappy about having to ‘mind-read’ the preferred format that the auditor was looking for. Right or wrong, many clinics are in the same place – frustrated with the process.

I don’t know anyone who enjoys an audit. However, an audit can at least be bearable if it’s clear what’s expected in the audit. I think we’re going to have a lot more stories about meaningful use audits coming down the pipe. Hopefully Todd’s advice helps some who run into a meaningful use audit.

CMS Listens to Those Calling for a 90 Day Meaningful Use Reporting Period

Posted on January 29, 2015 | Written By


I think that most of us in the industry figured this was just a matter of time, but it’s nice that we were right and CMS is working to modify the requirements and reporting periods for meaningful use. I imagine they heard all the many voices that were calling for a change to meaningful use stage 2 and it’s just taken them this long to work through the government process to make it a reality.

Before I act like this change is already in place, CMS was very specific in the wording of their announcement about their “intent to modify requirements for meaningful use” and their “intent to engage in rulemaking” in order to make these “intended” changes. Basically they’re saying that they can’t just change the rules; they have to go through the rulemaking process for these changes to go into effect. That said, I don’t think anyone doubts that this will make it through the rulemaking process.

Here are the modifications they’re proposing:

  1. Shortening the 2015 reporting period to 90 days to address provider concerns about their ability to fully deploy 2014 Edition software
  2. Realigning hospital reporting periods to the calendar year to allow eligible hospitals more time to incorporate 2014 Edition software into their workflows and to better align with other quality programs
  3. Modifying other aspects of the programs to match long-term goals, reduce complexity, and lessen providers’ reporting burden

They also added this interesting clarification and information about the meaningful use stage 3 proposed rule:

To clarify, we are working on multiple tracks right now to realign the program to reflect the progress toward program goals and be responsive to stakeholder input. Today’s announcement that we intend to pursue the changes to meaningful use beginning in 2015 through rulemaking, is separate from the forthcoming Stage 3 proposed rule that is expected to be released by early March. CMS intends to limit the scope of the Stage 3 proposed rule to the requirements and criteria for meaningful use in 2017 and subsequent years.

I think everyone will welcome a dramatic simplification of the meaningful use program. The above three changes will be welcomed by everyone I know.

In the email announcement for this, they provided an explanation for why they’re doing these changes:

These proposed changes reflect the Department of Health and Human Services’ commitment to creating a health information technology infrastructure that:

  • Elevates patient-centered care
  • Improves health outcomes
  • Supports the providers who care for patients

Personally, I think they saw the writing on the wall and it wasn’t pretty. Many organizations were going to opt out of meaningful use stage 2. These changes were necessary for many organizations to continue participating in meaningful use. They believe meaningful use will elevate patient-centered care, improve health outcomes, and support the providers who care for patients. I’m glad they finally chose to start the rulemaking process to make the changes. I think many that started meaningful use can still benefit from the rest of the incentive money and will be even happier to avoid the penalties.

What Is the Future for Rural Physicians? Is There One?

Posted on January 28, 2015 | Written By

Value-based payments. Value-based care. Meaningful use. Is there a place for an independent doctor in a suburban location? This article says that these, all the technology to go with them, and physician acceptance are “inevitable.”

I have four physicians. I don’t see a place for them long term. My first is my internist. A few years ago he was given a cell phone as a gift. It does all he will ever want. If it rings, he answers it. If he has to make a call, he dials the number. He has no computers in his office. All his files are paper. As a doctor he is recognized as one of the best in the state. EHR is not in his future. Phone, fax, and copier suit him just fine. The article that raised these questions for me was a report from Deloitte. You might end up with some of the same questions after reading it.

My second physician has been using an EHR for as long as I have known him. He has two offices and four other doctors working for him. He needs the technology. He hates it, upgrades only when he has to, and would never do it again. He is also recognized as one of the best in the state. His daughter is now in her residency and will join him next year. My gut feel is that in 3-4 years he turns the business over to her, lets her worry about it, and sails off into the sunset.

My radiation oncologist was great. He treated me 8 years ago. My last visit with him was 4 years ago. The company he worked for terminated him for not generating enough revenue. His waiting room was always filled, but with little to no wait. His staff was great and could have easily made more money by moving to a large city. They, like him, enjoyed the suburban life. All were dumbfounded when he was terminated. They also learned that for this big-city practice, profit was the only incentive. He’s in FL now, out in the sticks, and owns his own practice.

Doctor #4 is a general surgeon. He is probably the only one that could/would survive in the “inevitable market.” His office is in the medical arts building at the local hospital. There are 3 other surgeons in his practice. He has a fairly up-to-date computer system, though not in his location and not compatible with the hospital’s new system. I know that his definition of value-based anything and mine differ. On my last visit he kept me waiting for 45 minutes because lunch went longer than scheduled. He’s all business.

For 3 of these 4, I see the choice as conforming and/or selling out. They are all rated in the top 25 physicians in the state. They are not going to increase their patient base to increase revenue.

I am sure that Doctor #4 will succeed. He is all and only business.  He holds the purse strings for his practice and has absolutely no problem in spending whatever it takes for technology to increase profit.  As long as he doesn’t have to use it.

The area that I live in is not unique. The hospital’s area of reach is a bit under 60,000. Since part of that is a resort area, add another 10K for the summer months. Is there a future for physicians like this? If so, what will they need to do to stay viable? Hire a business manager? More nurse practitioners? Sell, retire, or join together to form their own physician groups? Any thoughts?

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, there are an estimated 8 million deaths from sepsis, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It works by leveraging EHR data and automated surveillance, clinical content and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate specific needs at each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, to improve sensitivity and specificity and reduce alert fatigue by assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities that may mimic sepsis. This helps to ensure alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly specific and highly sensitive in order to minimize alert fatigue. In the case of this specific system, a 95% specificity and sensitivity rating has been achieved by constructing hundreds of variations of sepsis rules. For example, completely different rules are run for patients with liver disease versus those with end-stage renal disease. Doing so ensures clinicians only get alerts that are helpful.
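To make the idea concrete, here is a minimal sketch of comorbidity-specific rule selection. This is purely illustrative: POC Advisor’s actual rules are proprietary, and the condition names, lab markers, and thresholds below are invented for the example.

```python
# Illustrative only: real sepsis screening rules are far more involved,
# and these thresholds are invented for the sketch.

def default_rule(labs):
    # Flag elevated lactate or creatinine as possible organ dysfunction.
    return labs["lactate"] >= 2.0 or labs["creatinine"] >= 1.5

def esrd_rule(labs):
    # Creatinine is chronically elevated in end-stage renal disease,
    # so it is ignored to avoid false alerts; rely on lactate instead.
    return labs["lactate"] >= 2.0

def liver_rule(labs):
    # Liver disease can skew single markers at baseline, so require
    # two abnormal values before alerting.
    return labs["lactate"] >= 2.0 and labs["creatinine"] >= 1.5

def select_rule(comorbidities):
    """Pick the rule variant that matches the patient's comorbidities."""
    if "end_stage_renal_disease" in comorbidities:
        return esrd_rule
    if "liver_disease" in comorbidities:
        return liver_rule
    return default_rule
```

The payoff is specificity: the same creatinine value fires the default rule but not the ESRD variant, and that kind of distinction is what keeps alert fatigue down.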

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor within two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform, including lab results, pharmacy orders, Admit Discharge Transfer (ADT) and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to create the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alerts based on their clinical training. The system prompted the nursing staff to respond to each one, either through acknowledgement or override. If acknowledged, suggested guidance regarding the appropriate next steps was provided, such as alerting the physician or ordering diagnostic lactate tests, based on the facility’s specific protocols. If alerts were overridden, a reason had to be entered, all of which were logged, monitored and reported. If action was not taken, repeat alerts were fired, typically within 10 minutes. If repeat alerts were not acted upon, they were escalated to supervising personnel.
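The acknowledge/override/repeat/escalate flow can be sketched as a small state machine. This is a hypothetical illustration of the workflow as described at John Muir, not the vendor’s implementation; the class and method names are assumptions, and only the roughly 10-minute repeat interval comes from the pilot description.

```python
# Hypothetical sketch of the alert workflow: names are invented, and the
# 10-minute repeat interval is taken from the pilot description above.

REPEAT_AFTER_MINUTES = 10

class SepsisAlert:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.state = "fired"          # fired -> acknowledged | overridden
        self.override_reason = None
        self.escalated = False

    def acknowledge(self):
        # Nurse agrees; protocol guidance (notify physician, order a
        # lactate test, etc.) would be presented next.
        self.state = "acknowledged"

    def override(self, reason):
        # Overrides require a documented reason, which is logged for review.
        if not reason:
            raise ValueError("an override reason must be recorded")
        self.state = "overridden"
        self.override_reason = reason

    def on_timer(self, minutes_since_fired):
        # No response yet: re-fire after ~10 minutes; if the repeat also
        # goes unanswered, escalate to supervising personnel.
        if self.state != "fired":
            return "no_action"
        if minutes_since_fired >= 2 * REPEAT_AFTER_MINUTES:
            self.escalated = True
            return "escalate"
        if minutes_since_fired >= REPEAT_AFTER_MINUTES:
            return "repeat"
        return "wait"
```

The key design point the article describes is that every path is accounted for: an alert cannot simply be dismissed without a logged reason, and silence is treated as a trigger for escalation rather than as agreement.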

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed upon protocols.

Within three months, John Muir experienced significant improvements in key sepsis compliance metrics: 80% compliance with patient screening protocols, lactate tests ordered for 90% of patients who met screening criteria, and early, goal-directed therapy initiated for 75% of patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

Defining the Legal Health Record, Ensuring Quality Health Data, and Managing a Part-Paper Part-Electronic Record – Healthcare Information Governance

Posted on January 20, 2015 | Written By


This post is part of Iron Mountain’s Healthcare Information Governance: Big Picture Predictions and Perspectives Series which looks at the key trends impacting Healthcare Information Governance. Be sure to check out all the entries in this series.

Healthcare information governance (IG) has been important ever since doctors started tracking their patients in paper charts. However, over the past few years, adoption of EHR and other healthcare IT systems has exploded and provided a myriad of new opportunities and challenges associated with governance of a healthcare organization’s information.

Three of the most important health information governance challenges are:
1. Defining the legal health record
2. Ensuring quality health data
3. Managing a part-paper, part-electronic record

Defining the Legal Health Record
In the paper chart world, defining the legal health record was much easier. As we’ve shifted to an electronic world, the volume of data that’s stored in these electronic systems is so much greater. This has created a major need to define what your organization considers the legal health record.

The reality is that each organization now has to define its own legal health record based on CMS and accreditation guidelines, but also based on the specifics of their operation (state laws, EHR options, number of health IT systems, etc.). The legal health record will only be a subset of the data that’s being stored by an EHR or other IT system, and you’ll need to involve a wide group of people from your organization to define the legal health record.

Doing so is going to become increasingly important. Without a clearly defined legal health record, you’re going to produce an inconsistent release of information. This can lead to major liability issues in court cases where you produce inconsistent records, but it’s also important to be consistent when releasing health information to other doctors or even auditors.

One challenge we face in this regard is ensuring that EHR vendors provide a consistent and usable data output. A lot of thought has been put into how data is entered into the EHR, but not nearly as much effort has been put into the way an EHR outputs that data. This is a major health information governance challenge that needs to be addressed. Similarly, most EHR vendors haven’t put much thought and effort into data retention either. Retention policies are an important part of defining your legal health record, but your policy is subject to the capabilities of the EHR.

Working with your EHR and other healthcare IT vendors to ensure they can produce a consistent legal health record is one strategic imperative that every healthcare organization should have on their list.

Ensuring Quality Health Data
The future of healthcare is very much going to be data driven. Payments to ACO organizations are going to depend on data. The quality of care you provide using Clinical Decision Support (CDS) systems is going to rely on the quality of the data being used. Organizations are going to have new liability concerns that revolve around their data quality. Real-time data interoperability is going to become a reality, and everyone’s going to see everyone else’s data without a middleman first checking and verifying the quality of the data before it’s sent.

A great health information governance program led by a clinical documentation improvement (CDI) program is going to be a key first step for every organization. Quality data doesn’t happen overnight, but requires a concerted effort over time. Organizations need to start now if they want to be successful in the coming data-driven healthcare world.

Managing a Part-Paper Part-Electronic Record
The health information world is becoming infinitely more complex. Not only do we have new electronic systems that store massive amounts of data, but we’re still required to maintain legacy systems and those old paper charts. Each of these requires time and attention to manage properly.

While we’d all love to just turn off legacy systems and dispose of old paper charts, data retention laws often mean that both of these will be part of every healthcare organization for many years to come. Unfortunately, most health IT project plans don’t account for ongoing management of these old but important data sources. This inattention often results in increased costs and risks associated with these legacy systems and paper charts.

It should be strategically important for every organization to have a sound governance plan for both legacy IT systems and paper charts. Ignorance is not bliss when one of these information sources is breached because your organization had “forgotten” about them.

The future of reimbursement, costs, quality of care, and liability in healthcare are all going to be linked to an organization’s data. Making sure your data governance house is in order is going to be a major component in the success or failure of your organization. A good place to start is defining the legal health record, ensuring quality health data, and managing a part-paper part-electronic record.

Join our Twitter Chat: “Healthcare IG Predictions & Perspectives”

On January 28th at 12:00 pm Eastern, @IronMtnHealth is hosting a Twitter chat using #InfoTalk to further the dialog. If you have been involved in governance-related projects, we’d love to have you join. What IG initiatives have shown success for you? How have you overcome any obstacles? What do you see as the future of IG? Keep the conversation going during our “Healthcare IG Predictions & Perspectives” #InfoTalk at 12pm Eastern on January 28th.

The Value of an Integrated Specialty EHR Approach

Posted on January 19, 2015 | Written By


As many of you know, I’ve long been an advocate for the specialty-specific EHR. There are just tremendous advantages in having an EHR that’s focused only on your specialty. Then you don’t get things like child growth charts cluttering your EHR when you don’t see any children. Or, taken the other way, you have child growth charts that are designed specifically for a pediatrician. This applies across pretty much every specialty.

The reason that many organizations don’t go with a specialty-specific EHR is usually that they’re a large multi-specialty organization. These organizations don’t want to have 30 different EHR vendors that they have to support. Therefore, in their RFP they basically exclude specialty-specific EHR vendors from their EHR selection process.

I understand from an IT support perspective and an EHR implementation perspective how having 30 different EHR implementations would be a major challenge. However, it’s also a challenge to try to get one EHR vendor to work for 30+ specialties. Plus, the long-term consequence is physician and other EHR user dissatisfaction with an EHR that wasn’t designed for their specialty. The real decision these organizations are making is whether they want to put the burden on the IT staff (i.e., supporting multiple EHRs) or on the doctors (i.e., using an EHR that doesn’t meet their needs). In large organizations, it seems that they’re making the decision to put the burden on the doctors as opposed to the IT staff, although I don’t think many organizations realize that this is the choice they’re making.

Specialty EHR vendor gMed recently put out a whitepaper which does an analysis and a kind of case study on the differences between an integrated GI practice and a non-integrated GI practice. In this case, they’re talking about an EHR that’s integrated with an ambulatory surgery center and one that’s not. That’s a big deal for a specialty like GI. You can download the free whitepaper to get all the juicy details and differences between an integrated GI practice and one that’s not.

I’ve been seeing more and more doctors starting to talk about their displeasure with their EHR. I think much of that displeasure comes thanks to meaningful use and reimbursement requirements, but I also think that many are suffering under an EHR that really doesn’t understand their specialty. From my experience, those EHR vendors that claim to support every specialty usually offer one support rep for that specialty and a few months’ programming sprint to try to provide something special for that specialty. That’s very different from a whole team of developers and every customer support person at the company devoted to a specialty.

I’m not saying that an EHR can’t do more than one specialty, but doing 5 somewhat related specialties is still very different than trying to do the 40+ medical specialties with one interface. One challenge with the best-of-breed approach is that there are some specialties which don’t have an EHR that’s focused just on them. In that case, you may have to use the every-specialty EHR.

What’s clear to me is that most large multi-specialty organizations are choosing the all-in-one EHR systems in their offices. I wonder if force feeding an EHR into a specialty where it doesn’t fit is going to eventually lead to a physician revolt back to specialty-specific EHRs. Physician dissatisfaction, liability issues, and improved interoperability could make the best-of-breed approach much more attractive to even the large organizations, even if it means they back into a best-of-breed approach after trying the one-size-fits-all approach to EHR.

I’ll be interested to watch this dynamic play out. Plus, you have the specialty doctors coming together in mega groups in order to combat this as well. What do you think is going to happen with specialty EHR? Should organizations be doing a best-of-breed approach or the one-size-fits-all EHR? What are the consequences (good and bad) of either direction?

Full Disclosure: gMed is an advertiser on this site.

Never Sell Your EHR Company – According to eCW Founder

Posted on January 16, 2015 | Written By


I recently came across an interesting article in Entrepreneur magazine authored by Girish Navani, CEO and co-founder of eClinicalWorks. If you read this site, you no doubt are familiar with the quite popular eCW EHR software. In this article Girish gives some interesting insight into the future of eCW as a company:

After grad school, I set out to create my own version of my father’s bridge. After working many odd jobs developing software, I created credit check software for an acquaintance’s business. This made him a lot of money, which prompted me to ask (perhaps naively) for a share of the profit. I had developed a very successful facet of the company – didn’t I deserve it? His response surprised me, but I will never forget it. He said, “If you build something you like, don’t sell it.”

Twenty years later, I still remember my acquaintance’s advice. For that reason, my company, eClinicalWorks is, and always will be, a privately-held company. I have no interest in selling it, regardless of any offer I may get. In addition, we don’t use investor cash or spend money we don’t have.

This is not a philosophy that is unique to eCW. #1 on Epic’s list of principles is “Do not go public.” I imagine that Judy Faulkner (CEO of Epic) has a somewhat similar philosophy to Girish. There are certainly a lot of advantages to not going public, and most of them come down to control. I’ll never forget when I heard one of the Marriott children talk about their decision to stay a private company. He said that Marriott would likely be a lot bigger if it had become a public company, but it would have lost a lot of the company culture if they’d chosen to do so.

I imagine Epic and eCW share a similar feeling. However, some accountability also comes with being a public company. It’s not easy for an outside organization to assess the financial well-being of a private company. During the golden age of EHR that we just experienced, that hasn’t been an issue for either eCW or Epic. However, as we exit this golden age of EHR, which was propped up by $36 billion in government stimulus money, the financial future may look quite different.

As with most things in life, there are pros and cons to staying private or going public. It’s interesting that two of the major EHR players (eCW and Epic) have made it clear that they have no interest in ever going public. We’ll see how that plays out long term.

Top 4 HIT Challenges and Opportunities for Healthcare Organizations in 2015 – Breakaway Thinking

Posted on January 15, 2015 | Written By

The following is a guest blog post by Mitchell Woll, Instructional Designer at The Breakaway Group (A Xerox Company). Check out all of the blog posts in the Breakaway Thinking series.
Mitchell Woll - The Breakaway Group
Healthcare organizations face numerous challenges in 2015: ICD-10 implementation, HIPAA compliance, new Meaningful Use objectives, and the Office of the National Coordinator’s (ONC) interoperability roadmap. To adapt successfully, organizations must take advantage of numerous opportunities to prepare.

Healthcare leaders must thoroughly assess, prioritize, prepare, and execute in each area:

  1. Meaningful Use Stage 2 objectives require increased patient engagement and reporting for a full year before earning incentives.
  2. The ONC’s interoperability roadmap demands a new framework to achieve successful information flow between healthcare systems over the next ten years.
  3. There are 10 months left in which to prepare for the October 1 ICD-10 deadline.
  4. HIPAA compliance will be audited.

1. Meaningful Use
For those who have already implemented an EHR, Meaningful Use Stage 2 focuses new efforts on patient access to personal health data and emphasizes the exchange of health information between patient and providers. Stage 2 also imposes financial penalties for failure to meet requirements.

CMS’s latest deadline extends Stage 2 through 2016, so healthcare organizations have additional time to fulfill Stage 2 requirements. Stage 3 requirements begin in 2017, so healthcare organizations should use the extra time to build interoperability and foster an internal culture of collaboration between providers and patients. In 2017, Medicare incentives will no longer apply and EHR penalties will rise to 3 percent.

CMS has also proposed a 2015 EHR certification, which requests enhanced interoperability to support transitions of care. Complying with this certification is voluntary, but it provides the opportunity to become certified for the Medicare and Medicaid EHR incentive programs at the same time.

Meaningful Use Stage 2 and the ONC roadmap require that 2015 efforts concentrate on interoperability. Healthcare organizations should prepare for health information exchange by focusing efforts on building patient portals and integrating communications by automating phone, text, and e-mail messages. After setting up successful exchange methods, healthcare organizations should train staff to use patient portals. The delay in Stage 2 means providers have more time to become comfortable using the technology to correspond with patients. Hospitals should also educate patients about these resources, describing the benefits of collaboration between providers and patients. Positive collaboration and successful data exchange help achieve desired health outcomes faster.

2. Interoperability
The three-year goal of the ONC’s 10-year roadmap is for providers and patients to be able to send, receive, find, and use basic health information. The six and ten-year goals then build on the initial objectives, improving interoperability into the future.

Congress has also shown initiative in promoting interoperability, asking the ONC to investigate information blocking by EHR vendors. Most of the ONC’s roadmap for the next three years is similar to the Meaningful Use Stage 2 goals.

Sixty-four percent of Americans do not use patient portals, so for 2015 healthcare organizations should focus on creating them, refining their workflows, and encouraging patients to use them. Additionally, 35 percent of patients said they are unaware of patient portals, while 31 percent said their physician has never mentioned them. Fifty-six percent of patients ages 55-64, and 46 percent of patients 65 and older, said they would access medical information more if it were available online. Hospitals need their own staff to use and promote patient portals in order to conquer the challenges of interoperability and Stage 2.

3. HIPAA Compliance
In 2015, the Office of the Inspector General (OIG) will audit EHR use, looking closely at HIPAA security, incentive payments, possible fraud, and contingency plan requirements. As part of the HIPAA compliance audits, the Office for Civil Rights (OCR) will confirm whether hospitals’ policies and procedures meet updated security criteria. Healthcare organizations should take this opportunity to verify compliance with the 2013 HIPAA standards to prepare for upcoming audits. Many helpful resources exist, including HIPAA compliance toolkits available from several publishers. These kits include advice on privacy and security models. Healthcare organizations and leaders can also take advantage of online education, or hire consultants to help review and implement the necessary measures. It’s important that action be taken now to educate staff about personal health information security and how to remain HIPAA compliant.

4. ICD-10 Deadline
The new ICD-10 deadline comes as no surprise, given that the transition has been delayed several times. In July 2014, the US Department of Health and Human Services (HHS) implemented the most recent delay, setting a new date of Oct. 1, 2015 and giving hospitals a 10-month window to prepare for the eventual ICD-10 rollout. Because healthcare organizations are more adaptable than ever, they can use their practiced flexibility and experience to meet these demands successfully.

As the Healthcare Information and Management Systems Society (HIMSS) suggests, communication, education, and testing must be part of an ICD-10 implementation plan. Informing internal staff and external partners of the transition is a crucial first step. ICD-10 should be tested internally and externally to verify the system works with the new codes before the transition. Healthcare organizations should outline and develop an ICD-10 training program by selecting a training team and assessing the populations who need ICD-10 education. They should perform a gap analysis to understand the training needed and utilize role-based training to educate the proper populations. Finally, organizations should establish the training delivery method, whether online, in the classroom, one-on-one, or some combination of these to teach different topics or levels of proficiency. In my experience at The Breakaway Group, I’ve seen that the most effective and efficient education is role-based, readily accessible, and offers learners hands-on experience performing tasks essential to their role. This type of targeted education ensures learners are proficient before the implementation. As with any go-live event, healthcare organizations must prepare and deliver the new environment, providing support throughout the event and beyond.

Facing 2015
These challenges require the same preparation, willingness, and audacity needed for prior HIT successes, including EHR implementation and meeting Meaningful Use Stage 1 requirements. ICD-10, HIPAA compliance, Stage 2, and interoperability all have the element of education in common. Healthcare organizations and leaders should apply the same tenacity and discipline to inform, educate, and prepare clinicians for upcoming obligations.

Targeted, role-based education will best ensure proficiency while avoiding comprehensive, costly, and time-consuming system training. Through role-based education, healthcare organizations gain more knowledgeable personnel who are up to speed on new applications. These organizations probably already have at least a foundation for 2015 expectations, and they should continue to recall the strategies used for prior go-live events. What was successful? It’s important to plan to replicate successful strategies while eliminating processes that caused problems. This is a great opportunity to capitalize on these efforts for organizational improvement. Healthcare leaders must let the necessity of 2015 government requirements inspire invention and innovation, ultimately strengthening their organizations.

Xerox is a sponsor of the Breakaway Thinking series of blog posts.

De-Identification of Data in Healthcare

Posted on January 14, 2015 | Written By John Lynn

Today I had a chance to sit down with Khaled El Emam, PhD, CEO and Founder of Privacy Analytics, to talk about healthcare data and the de-identification of that healthcare data. Data is at the center of the future of healthcare IT and so I was interested to hear Khaled’s perspectives on how to manage the privacy and security of that data when you’re working with massive healthcare data sets.

Khaled and I started off the conversation talking about whether healthcare data could indeed be de-identified or not. My favorite Patient Privacy Rights advocate, Deborah C. Peel, MD, has often made the case for why supposedly de-identified healthcare data is not really private or secure since it can be re-identified. So, I posed that question to Khaled and he suggested that Dr. Peel is only telling part of the story when she references stories where healthcare data has been re-identified.

Khaled makes the argument that in all of the cases where healthcare data has been re-identified, it was because those organizations did a poor job of de-identifying the data. He acknowledges that many healthcare organizations don’t do a good job de-identifying healthcare data, so it is a major problem that Dr. Peel should be highlighting. However, just because one organization does a poor job de-identifying data, that doesn’t mean that proper de-identification of healthcare data should be thrown out.

This kind of reminds me of when people ask me if EHR software is secure. My answer is always that EHR software can be more secure than paper charts. However, it depends on how well the EHR vendor and the healthcare organization’s staff have implemented security procedures. When it’s done right, an EHR is very secure. When it’s done wrong, an EHR can be very insecure. Khaled is making a similar argument when it comes to de-identified health data.

Khaled did acknowledge that the risk will never be zero. However, if you de-identify healthcare data using proper techniques, the risks are small enough that they are similar to the risks we take every day with our healthcare data. I think this is an important point, since the reality is that organizations are going to access and use healthcare data. That is not going to stop; I really don’t think there’s any debate on this. Therefore, our focus should be on minimizing the risks associated with this healthcare data sharing. Plus, we should hold organizations accountable for the healthcare data sharing they’re doing.
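To make "proper techniques" a little more concrete, here is a minimal sketch of a few transformations drawn from the HIPAA Safe Harbor method: removing direct identifiers, reducing dates to the year, aggregating ages 90 and over, and truncating ZIP codes to three digits. The field names and record shape are hypothetical, and this is not Privacy Analytics’ method; real de-identification covers all 18 Safe Harbor identifier categories (or uses expert determination) and considers re-identification risk across the whole data set, not record by record.

```python
# Hypothetical sketch of a few HIPAA Safe Harbor-style transformations.
# Field names here are made up for illustration.

DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "mrn"}

def deidentify(record):
    """Return a copy of the record with direct identifiers removed
    and common quasi-identifiers generalized."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Safe Harbor removes all date elements more specific than the year.
    if "birth_date" in out:
        out["birth_year"] = out.pop("birth_date")[:4]  # "1958-03-14" -> "1958"

    # Ages 90 and over are aggregated into a single category.
    if "age" in out and out["age"] >= 90:
        out["age"] = "90+"

    # Keep only the first three ZIP digits (Safe Harbor additionally
    # requires zeroing ZIP3 areas with fewer than 20,000 residents).
    if "zip" in out:
        out["zip"] = out["zip"][:3]

    return out

patient = {
    "name": "Jane Doe", "ssn": "123-45-6789", "mrn": "A-1001",
    "birth_date": "1958-03-14", "age": 92, "zip": "89123",
    "diagnosis": "I10",
}
print(deidentify(patient))
# {'age': '90+', 'zip': '891', 'diagnosis': 'I10', 'birth_year': '1958'}
```

Even done correctly, rule-based suppression like this only addresses the obvious identifiers, which is exactly why Khaled’s point about skilled professionals matters: judging whether the remaining quasi-identifiers still make records unique is the hard part.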

Khaled also suggested that one of the challenges the healthcare industry faces with de-identifying healthcare data is that there’s a shortage of skilled professionals who know how to do it properly. I’d suggest that many who are faced with de-identifying data have the right intent, but likely lack the skills needed to ensure that the healthcare data de-identification is done properly. This isn’t a problem that will be solved easily, but should be helped as data security and privacy become more important.

What do you think of de-identification in healthcare? Is the way it’s being done a problem today? I see no end to the use of data in healthcare, and so we really need to make sure we’re de-identifying healthcare data properly.

EHR Computer Setup

Posted on January 6, 2015 | Written By John Lynn

I recently had a doctor’s visit at a local quick care clinic. When I go to these visits, it almost feels like work, since I’m interested in what EHR they’re using and what they think of the EHR, meaningful use, government money, ICD-10, etc.

In this case, the organization had an EHR for half of the work they did but was still on paper for the other half. However, they were switching all of their work over to a new EHR the next week. I believe they told me staff were given a couple hours of training to learn the new system (good luck with that).

While I was waiting in the exam room, I saw this wall mounted computer setup (pictured below):

EHR Wall Computer Setup

Obviously, you can tell that this wall-mounted computer wasn’t being used yet. It must have come with the new EHR rollout. I’ll be interested to go back in the future and see how this computer is used. I’m a big proponent of computers in the exam room. Plus, this looks like a pretty good setup that stays out of the way when not needed. Although, I wonder if the ergonomics of this setup will catch up with the clinic.

How do you have the computers set up in your exam rooms? I’d love to hear what you’re doing, or even see pictures of your exam room computer setup. Do you just use a tablet or laptop that you carry around with you? Let’s see some more examples.