
#HITMC Chat and Health IT Marketing and PR Conference Early Bird Registration Ends

Posted on January 27, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Today we held the first ever #HITMC (Healthcare IT Marketing and PR Community) Twitter chat. The turnout for the chat was amazing and it was so active I don’t think anyone could keep up. That’s pretty amazing for a first time chat. In case you missed it and are interested in health IT marketing and PR, here’s my tweet that links to the transcript:

I’m particularly interested to look back at the answer to question 3 on the chat which talks about the tools that people use to make their lives easier.

Here’s a look at the stats for the first HITMC chat:

All of this tells me that I should have started this Twitter chat sooner. It’s amazing how a Twitter chat can really bring a community together. Plus, it always leads to interesting new connections that wouldn’t have happened otherwise. Tomorrow I’ll be participating in another new Twitter chat that’s focused on Health Information Governance. If that topic interests you, be sure to join us on #InfoTalk at Noon ET on January 28th.

We’re also 5 days away from the end of Early Bird Registration for the Health IT Marketing and PR Conference. Register now and save $500 off the registration price. Plus, as a reader of EMR and HIPAA, use the promo code “emrandhipaa” and you’ll save an extra $100. We’ve just started uploading the speaker profiles for those who will be speaking at the event. It’s going to be a fantastic 2+ days of the best in healthcare IT marketing and PR. I can’t wait!

For those not interested in the above topics, tomorrow we’ll be back with our regularly scheduled programming.

Speeding Sepsis Response by Integrating Key Technology

Posted on January 26, 2015 | Written By

Stephen Claypool, M.D., is Vice President of Clinical Development & Informatics, Clinical Solutions, with Wolters Kluwer Health and Medical Director of its Innovation Lab. He can be reached at steve.claypool@wolterskluwer.com.
Three-week-old Jose Carlos Romero-Herrera was rushed to the ER, lethargic and unresponsive with a fever of 102.3. His mother watched helplessly as doctors, nurses, respiratory therapists and assorted other clinicians frantically worked to determine what was wrong with an infant who just 24 hours earlier had been healthy and happy.

Hours later, Jose was transferred to the PICU where his heart rate remained extremely high and his blood pressure dangerously low. He was intubated and on a ventilator. Seizures started. Blood, platelets, plasma, IVs, and multiple antibiotics were given. Still, Jose hovered near death.

CT scans, hourly blood draws and EEGs brought no answers. Despite all the data and knowledge available to the clinical team fighting for Jose’s life, it was two days before the word “sepsis” was uttered. By then, his tiny body was in septic shock. It had swelled to four times the normal size. The baby was switched from a ventilator to an oscillator. He received approximately 16 different IV antibiotics, along with platelets, blood, plasma, seizure medications and diuretics.

“My husband and I were overwhelmed at the equipment in the room for such a tiny little person. We were still in shock about how we’d just sat there and enjoyed him a few hours ago and now were being told that we may not be bringing him back home with us,” writes Jose’s mother, Edna, who shared the story of her baby’s 30-day ordeal as part of the Sepsis Alliance’s “Faces of Sepsis” series.

Jose ultimately survived. Many do not. Three-year-old Ivy Hayes went into septic shock and died after being sent home from the ER with antibiotics for a UTI. Larry Przybylski’s mother died just days after complaining of a “chill” that she suspected was nothing more than a 24-hour bug.

Sepsis is the body’s overwhelming, often-fatal immune response to infection. Worldwide, there are an estimated 8 million deaths from sepsis, including 750,000 in the U.S. At $20 billion annually, sepsis is the single most expensive condition treated in U.S. hospitals.

Hampering Efforts to Fight Sepsis

Two overarching issues hamper efforts to drive down sepsis mortality and severity rates.

First, awareness among the general population is surprisingly low. A recent study conducted by The Harris Poll on behalf of Sepsis Alliance found that just 44% of Americans had ever even heard of sepsis.

Second, the initial presentation of sepsis can be subtle and its common signs and symptoms are shared by multiple other illnesses. Therefore, along with clinical acumen, early detection requires the ability to integrate and track multiple data points from multiple sources—something many hospitals cannot deliver due to disparate systems and siloed data.

While the Sepsis Alliance focuses on awareness through campaigns including Faces of Sepsis and Sepsis Awareness Month, hospitals and health IT firms are focused on reducing rates by arming clinicians with the tools necessary to rapidly diagnose and treat sepsis at its earliest stages.

A primary clinical challenge is that sepsis escalates rapidly, leading to organ failure and septic shock, resulting in death in nearly 30 percent of patients. Every hour without treatment significantly raises the risk of death, yet early screening is problematic. Though much of the data needed to diagnose sepsis already reside within EHRs, most systems don’t have the necessary clinical decision support content or informatics functionality.

There are also workflow issues. Inadequate cross-shift communication, challenges in diagnosing sepsis in lower-acuity areas, limited financial resources and a lack of sepsis protocols and sepsis-specific quality metrics all contribute to this intractable issue.

Multiple Attack Points

Recognizing the need to attack sepsis from multiple angles, our company is testing a promising breakthrough in the form of POC Advisor™. The program is a holistic approach that integrates advanced technology with clinical change management to prevent the cascade of adverse events that occur when sepsis treatment is delayed.

This comprehensive platform is currently being piloted at Huntsville Hospital in Alabama and John Muir Medical Center in California. It works by leveraging EHR data and automated surveillance, clinical content and a rules engine driven by proprietary algorithms to begin the sepsis evaluation process. Mobile technology alerts clinical staff to evaluate potentially septic patients and determine a course of treatment based on their best clinical judgment.

For a truly comprehensive solution, it is necessary to evaluate specific needs at each hospital. That information is used to expand sepsis protocols and add rules, often hundreds of them, to improve sensitivity and specificity and reduce alert fatigue by assessing sepsis in complex clinical settings. These additional rules take into account comorbid medical conditions and medications that can cause lab abnormalities that may mimic sepsis. This helps to ensure alerts truly represent sepsis.

The quality of these alerts is crucial to clinical adoption. They must be both highly specific and highly sensitive in order to minimize alert fatigue. In the case of this specific system, a 95% specificity and sensitivity rating has been achieved by constructing hundreds of variations of sepsis rules. For example, completely different rules are run for patients with liver disease versus those with end-stage renal disease. Doing so ensures clinicians only get alerts that are helpful.
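The idea of running different rule variants for different comorbidities can be sketched roughly as follows. This is a hypothetical illustration only; the thresholds, flag names, and patient fields are invented and bear no relation to POC Advisor’s actual proprietary algorithms:

```python
# Hypothetical sketch of comorbidity-aware sepsis screening rules.
# All thresholds and names below are invented for illustration.

def base_rules(p):
    """Generic screening criteria applied to every patient."""
    flags = []
    if p["temp_c"] > 38.3 or p["temp_c"] < 36.0:
        flags.append("abnormal temperature")
    if p["heart_rate"] > 90:
        flags.append("tachycardia")
    if p["creatinine"] > 1.2:
        flags.append("elevated creatinine")
    return flags

def rules_for(p):
    """Adjust the base rules so chronically abnormal values for this
    patient's comorbidities don't fire false alerts."""
    flags = base_rules(p)
    if "end_stage_renal_disease" in p["comorbidities"]:
        # ESRD patients have chronically elevated creatinine; suppress
        # that flag so it doesn't masquerade as new organ dysfunction.
        flags = [f for f in flags if f != "elevated creatinine"]
    if "liver_disease" in p["comorbidities"]:
        # Cirrhosis can depress platelets chronically; look for a
        # relative drop from baseline instead of an absolute cutoff.
        if p["platelets"] < 0.5 * p["baseline_platelets"]:
            flags.append("acute platelet drop")
    return flags

patient = {
    "temp_c": 38.9, "heart_rate": 112, "creatinine": 0.9,
    "platelets": 80_000, "baseline_platelets": 200_000,
    "comorbidities": ["liver_disease"],
}
print(rules_for(patient))
```

The point of the sketch is the structure, not the medicine: rule selection happens per patient, so an alert that fires means something against that patient’s own baseline.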

Alerts are also coupled with the best evidence-based recommendations so the clinical staff can decide which treatment path is most appropriate for a specific patient.

The Human Element

To address the human elements impacting sepsis rates, the system in place includes clinical change management to develop best practices, including provider education and screening tools and protocols for early sepsis detection. Enhanced data analytics further manage protocol compliance, public reporting requirements and real-time data reporting, which supports system-wide best practices and performance improvement.

At John Muir, the staff implemented POC Advisor within two medical/surgical units for patients with chronic kidney disease and for oncology patient populations. Four MEDITECH interfaces sent data to the platform, including lab results, pharmacy orders, Admit Discharge Transfer (ADT) and vitals/nursing documentation. A clinical database was created from these feeds, and rules were applied to create the appropriate alerts.

Nurses received alerts on a VoIP phone and then logged into the solution to review the specifics and determine whether they agreed with the alerts based on their clinical training. The system prompted the nursing staff to respond to each one, either through acknowledgement or override. If acknowledged, suggested guidance regarding the appropriate next steps was provided, such as alerting the physician or ordering diagnostic lactate tests, based on the facility’s specific protocols. If alerts were overridden, a reason had to be entered, all of which were logged, monitored and reported. If action was not taken, repeat alerts were fired, typically within 10 minutes. If repeat alerts were not acted upon, they were escalated to supervising personnel.
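The acknowledge/override/escalate loop described above can be sketched in a few lines. This is a hypothetical illustration, not the actual system’s implementation; the states, class names, and escalation window are invented, with only the roughly 10-minute repeat interval taken from the text:

```python
# Hypothetical sketch of an acknowledge/override/escalate alert loop.
import time

REPEAT_AFTER_SECONDS = 10 * 60  # re-fire if no action within ~10 minutes

class SepsisAlert:
    def __init__(self, patient_id, now=time.time):
        self.patient_id = patient_id
        self.now = now                # injectable clock for testing
        self.fired_at = now()
        self.state = "pending"        # pending -> acknowledged | overridden
        self.override_reason = None
        self.escalated = False

    def acknowledge(self):
        # Nurse agrees with the alert; protocol guidance is shown next.
        self.state = "acknowledged"

    def override(self, reason):
        # Overrides require a reason, which is logged and reported.
        if not reason:
            raise ValueError("an override reason must be recorded")
        self.state = "overridden"
        self.override_reason = reason

    def tick(self):
        """Called periodically: re-fire stale alerts, then escalate."""
        if self.state != "pending":
            return "done"
        elapsed = self.now() - self.fired_at
        if elapsed > 2 * REPEAT_AFTER_SECONDS:
            self.escalated = True     # notify supervising personnel
            return "escalated"
        if elapsed > REPEAT_AFTER_SECONDS:
            return "repeat"           # re-alert the same nurse
        return "waiting"
```

The forced choice is the design point: an alert can never be silently dismissed, so every firing ends in an acknowledgement, a logged override, or an escalation.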

Over the course of the pilot, the entire John Muir organization benefited from significant improvements on several fronts:

  • Nurses were able to see how data entered into the EHR was used to generate alerts
  • Data could be tracked to identify clinical process problems
  • Access to clinical data empowered the quality review team
  • Nurses reported being more comfortable communicating quickly with physicians based on guidance from the system and from John Muir’s standing policies

Finally, physicians reported higher confidence in the validity of information relayed to them by the nursing staff because they knew it was being communicated based on agreed upon protocols.

Within three months, John Muir saw significant improvements in key sepsis compliance metrics: 80% compliance with patient screening protocols, 90% of lactate tests ordered for patients who met screening criteria, and 75% initiation of early, goal-directed therapy for patients with severe sepsis.

Early data from Huntsville Hospital is equally promising, including a 37% decline in mortality on patient floors where POC Advisor was implemented. Thirty-day readmissions have declined by 22% on screening floors, and data suggest documentation improvements resulting from the program may positively impact reimbursement levels.

This kind of immediate outcome is generating excitement at the pilot hospitals. Though greater data analysis is still necessary, early indications are that a multi-faceted approach to sepsis holds great promise for reducing deaths and severity.

The Next Generation Tech Kids

Posted on January 23, 2015 | Written By


Today I had the amazing opportunity to volunteer at my kids’ school. They make it a big deal for dads to volunteer at the school, and my kids absolutely adore having their dad at school with them. We have a tradition that I go and spend the day at school with my kids on their birthdays. It’s pretty awesome and I might have even shed a tear or two. (Side Note: Check out my new Daddy Blog for cute pics of my kids)

However, that’s not the point of this post. It turns out today was testing day for a bunch of my kids (I have 3 in elementary school). What was amazing is that all of the tests were administered on a computer. Yes, even my 5 year old kindergartner was taking his test on the computer. In fact, the teacher told me, “It’s kind of hard because they don’t even really know how to type.”

Whether this is a good idea or not is a topic for an education blog. However, I’ve written before about the next generation of digital natives and the impact they’ll have on healthcare and EHR. If we look a little further out, my 5 year old won’t even be able to comprehend the idea of a paper chart. It will be so ridiculous to him.

I’m still processing what this will mean to healthcare IT and to society in general. As I think back on the thousands of blog posts I’ve written about adopting EHR, I can think of many that will sound ridiculous even 5-10 years from now. That has me very excited. Not that my content is no longer useful (unless you enjoy Health IT history). I’m excited that a whole sea change is going to happen in how we want technology applied to healthcare.

No doubt, it’s not without some risk. I’ve heard many argue that the next generation doesn’t care about privacy. Personally, I’ve seen quite the opposite. The next generation has a very sophisticated approach to privacy. They know when and where to share something based on who they want to see it. It’s the older generation that has a problem knowing exactly where something should be shared and where it shouldn’t. That’s not to say that some young kids don’t make mistakes. They do, but most are quite aware of where something is being shared. It’s why so many kids use Snapchat.

What do you think of the coming generations of technology savvy people? What benefits will they bring? What challenges will we face? Are you excited, scared, nervous?

Beware: Don’t Buy In to Myths about Data Security and HIPAA Compliance

Posted on January 22, 2015 | Written By

The following is a guest blog post by Mark Fulford, Partner in LBMC’s Security & Risk Services practice group.
Myths abound when it comes to data security and compliance. This is not surprising—HIPAA covers a lot of ground, and many organizations are left to decide on their own how to best implement a compliant data security solution. A critical first step in putting a compliant data security solution in place is separating fact from fiction. Here are four common misconceptions you’ll want to be aware of:

Myth #1: If we’ve never had a data security incident before, we must be doing OK on compliance with the HIPAA Security Rule.

It’s easy to fall into this trap. Not having had an incident is a good start, but HIPAA requires you to take a more proactive stance. Too often, no one is dedicated to monitoring electronic protected health information (ePHI) as prescribed by HIPAA. Data must be monitored—that is, someone must be actively reviewing data records and security logs to be on the lookout for suspicious activity.

Your current IT framework most likely includes a firewall and antivirus/antimalware software, and all systems have event logs. These tools collect data that too often go unchecked. Simply assigning someone to review the data you already have will greatly improve your compliance with HIPAA monitoring requirements, and more importantly, you may discover events and incidents that require your attention.
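As a toy illustration of what actively reviewing that data might look like, the sketch below flags two simple patterns in an access log. The log format, thresholds, and rules are all invented for illustration; real monitoring would rely on your audit and SIEM tooling, not a script like this:

```python
# Hypothetical sketch: flag suspicious entries in an ePHI access log.
# The log format and both rules are invented for illustration only.
from collections import Counter
from datetime import datetime

def suspicious_events(log_lines, after_hours=(20, 6), bulk_threshold=50):
    """Flag after-hours access and users touching unusually many records."""
    flagged, per_user = [], Counter()
    for line in log_lines:
        # Assumed format: "2015-01-22T02:14:00 jdoe patient-1234 VIEW"
        timestamp, user, record, action = line.split()
        per_user[user] += 1
        hour = datetime.fromisoformat(timestamp).hour
        if hour >= after_hours[0] or hour < after_hours[1]:
            flagged.append((user, record, "after-hours access"))
    for user, count in per_user.items():
        if count > bulk_threshold:
            flagged.append((user, "*", f"{count} records accessed"))
    return flagged

log = [
    "2015-01-22T02:14:00 jdoe patient-1234 VIEW",
    "2015-01-22T10:30:00 asmith patient-5678 VIEW",
]
print(suspicious_events(log))
```

Even something this simple makes the point: the log data already exists; compliance comes from someone (or something) actually looking at it on a schedule.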

Going beyond your technology infrastructure, your facility security, hardcopy processing, workstation locations, portable media, mobile device usage and business associate agreements all need to be assessed to make sure they are compliant with HIPAA privacy and security regulations. And don’t forget about your employees. HIPAA dictates that your staff be trained (with regularly scheduled reminders) on how to handle PHI appropriately.

Myth #2: Implementing a HIPAA security compliance solution will involve a big technology spend.

This is not necessarily the case. An organization’s investment in data security solutions can vary widely, depending on its size, budget and the nature of its transactions. The Office for Civil Rights (OCR) takes these variables into account—certainly, a private practice will have fewer resources to divert to security compliance than a major corporation. As long as you’ve justified each decision you’ve made about your own approach to compliance with each of the standards, the OCR will take your position into account if you are audited.

Most likely, you already have a number of appropriate technical security tools in place necessary to meet compliance. The added expense will more likely be associated with administering your data security compliance strategy.

Myth #3: We’ve read the HIPAA guidelines and we’ve put a compliance strategy in place. We must be OK on compliance.

Perhaps your organization is following the letter of the law. Policies and procedures are in place, and your staff is well-trained on how to handle patient data appropriately. By all appearances, you are making a good faith effort to be compliant.

But a large part of HIPAA compliance addresses how the confidentiality, integrity, and availability of ePHI is monitored in the IT department. If no one on the team has been assigned to monitor transactions and flag anomalies, all of your hard work at the front of the office could be for naught.

While a ‘check the box’ approach to HIPAA compliance might help if you get audited, unless it includes the ongoing monitoring of your system, your patient data may actually be exposed.

Myth #4: The OCR won’t waste their time auditing the ‘little guys.’ After all, doesn’t the agency have bigger fish to fry?

This is simply not true. Healthcare organizations of all sizes are eligible for an audit. Consider this cautionary tale: as a result of a reported incident, a dermatologist in Massachusetts was slapped with a $150,000 fine when an employee’s thumb drive was stolen from a car.

Fines for non-compliance can be steep, regardless of an organization’s size. If you haven’t done so already, now might be a good time to conduct a risk assessment and make appropriate adjustments. The OCR won’t grant you concessions just because you’re small, but they will take into consideration a good faith effort to comply.

Data Security and HIPAA Compliance: Make No Assumptions

As a provider, you are probably aware that the audits are starting soon, but perhaps you aren’t quite sure what that means for you. Arm yourself with facts. Consult with outside sources if necessary, but be aware that the OCR is setting the bar higher for healthcare organizations of all sizes, and you should be raising your own bar to match. Your business—and your patients—are counting on it.

About Mark Fulford
Mark Fulford is a Partner in LBMC’s Security & Risk Services practice group. He has over 20 years of experience in information systems management, IT auditing, and security. Mark focuses on risk assessments and information systems auditing engagements, including SOC reporting, in the healthcare sector. He is a Certified Information Systems Auditor (CISA) and Certified Information Systems Security Professional (CISSP). LBMC is a top 50 Accounting & Consulting firm based in Brentwood, Tennessee.

Digital Health at CES Wrap Up Video

Posted on January 21, 2015 | Written By


CES 2015 is now in the rearview mirror. One person I talked to said they thought the event was missing some of the excitement of previous years. I disagreed with him. I thought it was more exciting than previous years, although my excitement comes from the entrepreneurs and the Digital Health space. If you look at the larger CES floor with the massive million dollar booths, it was lacking some luster. Of course, with the size of CES, it’s easy to understand why two people could have very different experiences.

If you’re interested in what else I found at CES, I sat down with Dr. Nick van Terheyden, CMIO at Nuance, to talk about our experiences at CES 2015 and some of the takeaways from what we saw. I think you’ll enjoy this CES 2015 video chat below:

Defining the Legal Health Record, Ensuring Quality Health Data, and Managing a Part-Paper Part-Electronic Record – Healthcare Information Governance

Posted on January 20, 2015 | Written By


This post is part of Iron Mountain’s Healthcare Information Governance: Big Picture Predictions and Perspectives Series which looks at the key trends impacting Healthcare Information Governance. Be sure to check out all the entries in this series.

Healthcare information governance (IG) has been important ever since doctors started tracking their patients in paper charts. However, over the past few years, adoption of EHR and other healthcare IT systems has exploded and provided a myriad of new opportunities and challenges associated with governance of a healthcare organization’s information.

Three of the most important health information governance challenges are:
1. Defining the legal health record
2. Ensuring quality health data
3. Managing a part-paper, part-electronic record

Defining the Legal Health Record
In the paper chart world, defining the legal health record was much easier. As we’ve shifted to an electronic world, the volume of data that’s stored in these electronic systems is so much greater. This has created a major need to define what your organization considers the legal health record.

The reality is that each organization now has to define its own legal health record based on CMS and accreditation guidelines, but also based on the specifics of their operation (state laws, EHR options, number of health IT systems, etc). The legal health record will only be a subset of the data that’s being stored by an EHR or other IT system and you’ll need to involve a wide group of people from your organization to define the legal health record.

Doing so is going to become increasingly important. Without a clearly defined legal health record, you’re going to produce an inconsistent release of information. This can lead to major liability issues in court cases where you produce inconsistent records, but it’s also important to be consistent when releasing health information to other doctors or even auditors.

One challenge we face in this regard is ensuring that EHR vendors provide a consistent and usable data output. A lot of thought has been put into how data is inputted into the EHR, but not nearly as much effort has been put into the way an EHR outputs that data. This is a major health information governance challenge that needs to be addressed. Similarly, most EHR vendors haven’t put much thought and effort into data retention either. Retention policies are an important part of defining your legal health record, but your policy is subject to the capabilities of the EHR.

Working with your EHR and other healthcare IT vendors to ensure they can produce a consistent legal health record is one strategic imperative that every healthcare organization should have on their list.

Ensuring Quality Health Data
The future of healthcare is very much going to be data driven. Payments to ACOs are going to depend on data. The quality of care you provide using Clinical Decision Support (CDS) systems is going to rely on the quality of the data being used. Organizations are going to face new liability concerns that revolve around their data quality. Real-time data interoperability is going to become a reality, and everyone’s going to see everyone else’s data without a middleman first checking and verifying its quality before it’s sent.

A great health information governance program led by a clinical documentation improvement (CDI) program is going to be a key first step for every organization. Quality data doesn’t happen overnight, but requires a concerted effort over time. Organizations need to start now if they want to be successful in the coming data-driven healthcare world.

Managing a Part-Paper Part-Electronic Record
The health information world is becoming infinitely more complex. Not only do you have new electronic systems that store massive amounts of data, but we’re still required to maintain legacy systems and those old paper charts. Each of these requires time and attention to manage properly.

While we’d all love to just turn off legacy systems and dispose of old paper charts, data retention laws often mean that both of these will be part of every healthcare organization for many years to come. Unfortunately, most health IT project plans don’t account for ongoing management of these old but important data sources. This inattention often results in increased costs and risks associated with these legacy systems and paper charts.

It should be strategically important for every organization to have a sound governance plan for both legacy IT systems and paper charts. Ignorance is not bliss when one of these information sources is breached because your organization had “forgotten” about them.

The future of reimbursement, costs, quality of care, and liability in healthcare are all going to be linked to an organization’s data. Making sure your data governance house is in order is going to be a major component in the success or failure of your organization. A good place to start is defining the legal health record, ensuring quality health data, and managing a part-paper part-electronic record.

Join our Twitter Chat: “Healthcare IG Predictions & Perspectives”

On January 28th at 12:00 pm Eastern, @IronMtnHealth is hosting a Twitter chat using #InfoTalk to further the dialog. If you have been involved in governance-related projects, we’d love to have you join. What IG initiatives have shown success for you? How have you overcome any obstacles? What do you see as the future of IG? Keep the conversation going during our “Healthcare IG Predictions & Perspectives” #InfoTalk at 12pm Eastern on January 28th.

The Value of an Integrated Specialty EHR Approach

Posted on January 19, 2015 | Written By


As many of you know, I’ve long been an advocate for the specialty-specific EHR. There are just tremendous advantages in having an EHR that’s focused only on your specialty. Then you don’t get things like child growth charts cluttering your EHR when you don’t see any children. Or, taken the other way, you have child growth charts that are designed specifically for a pediatrician. This applies across pretty much every specialty.

The reason many organizations don’t go with a specialty-specific EHR is usually that they’re a large multi-specialty organization. These organizations don’t want to have 30 different EHR vendors that they have to support. Therefore, in their RFPs they basically exclude specialty-specific EHR vendors from their EHR selection process.

I understand, from an IT support perspective and an EHR implementation perspective, how having 30 different EHR implementations would be a major challenge. However, it’s also a challenge to try and get one EHR vendor to work for 30+ specialties. Plus, the long-term consequence is physician and other EHR user dissatisfaction with an EHR that wasn’t designed for their specialty. The real decision these organizations are making is whether they want to put the burden on the IT staff (i.e., supporting multiple EHRs) or on the doctors (i.e., using an EHR that doesn’t meet their needs). In large organizations, it seems they’re choosing to put the burden on the doctors as opposed to the IT staff, although I don’t think many organizations realize that this is the choice they’re making.

Specialty EHR vendor gMed recently put out a whitepaper that analyzes, as a kind of case study, the differences between an integrated GI practice and a non-integrated GI practice. In this case, they’re talking about an EHR that’s integrated with an ambulatory surgery center versus one that’s not. That’s a big deal for a specialty like GI. You can download the free whitepaper to get all the juicy details and differences between an integrated GI practice and one that’s not.

I’ve been seeing more and more doctors starting to talk about their displeasure with their EHR. I think much of that displeasure comes thanks to meaningful use and reimbursement requirements, but I also think that many are suffering under an EHR that really doesn’t understand their specialty. In my experience, when EHR vendors claim to support every specialty, that support usually consists of one support rep for the specialty and a few months’ programming sprint to try and provide something special for it. That’s very different from a whole team of developers and every customer support person at the company devoted to a specialty.

I’m not saying that an EHR can’t do more than one specialty, but doing 5 somewhat related specialties is still very different than trying to do the 40+ medical specialties with one interface. One challenge with the best-of-breed approach is that some specialties don’t have an EHR focused just on them. In that case, you may have to use an every-specialty EHR.

What’s clear to me is that most large multi-specialty organizations are choosing the all-in-one EHR systems in their offices. I wonder if force-feeding an EHR into a specialty where it doesn’t fit is going to eventually lead to a physician revolt back to specialty-specific EHRs. Physician dissatisfaction, liability issues, and improved interoperability could make the best-of-breed approach much more attractive to even the large organizations, even if it means they back into best of breed after trying the one-size-fits-all approach to EHR.

I’ll be interested to watch this dynamic play out. Plus, you have the specialty doctors coming together in mega groups to combat this as well. What do you think is going to happen with specialty EHR? Should organizations take a best of breed approach or go with a one-size-fits-all EHR? What are the consequences (good and bad) of either direction?

Full Disclosure: gMed is an advertiser on this site.

Never Sell Your EHR Company – According to eCW Founder

Posted on January 16, 2015 | Written By John Lynn

I recently came across an interesting article in Entrepreneur magazine authored by Girish Navani, CEO and co-founder of eClinicalWorks. If you read this site, you no doubt are familiar with the quite popular eCW EHR software. In the article, Girish gives some interesting insight into the future of eCW as a company:

After grad school, I set out to create my own version of my father’s bridge. After working many odd jobs developing software, I created credit check software for an acquaintance’s business. This made him a lot of money, which prompted me to ask (perhaps naively) for a share of the profit. I had developed a very successful facet of the company – didn’t I deserve it? His response surprised me, but I will never forget it. He said, “If you build something you like, don’t sell it.”

Twenty years later, I still remember my acquaintance’s advice. For that reason, my company, eClinicalWorks is, and always will be, a privately-held company. I have no interest in selling it, regardless of any offer I may get. In addition, we don’t use investor cash or spend money we don’t have.

This is not a philosophy that is unique to eCW. #1 on Epic’s list of principles is “Do not go public.” I imagine that Judy Faulkner (CEO of Epic) has a somewhat similar philosophy to Girish’s. There are certainly a lot of advantages to not going public, and most of them come down to control. I’ll never forget hearing one of the Marriott children talk about their decision to stay a private company. He said that Marriott would likely be a lot bigger if it had become a public company, but it would have lost a lot of the company culture if it had chosen to do so.

I imagine Epic and eCW share a similar feeling. However, there’s also some accountability that comes with being a public company. It’s not easy for an outsider to assess the financial well-being of a private company. During the golden age of EHR we just experienced, that wasn’t an issue for either eCW or Epic. However, as we exit this golden age of EHR that was propped up by $36 billion in government stimulus money, the financial future may be quite different.

As in most things in life, there are pros and cons to staying private or going public. It’s interesting that two of the major EHR players (eCW and Epic) have made it clear that they have no interest in ever going public. We’ll see how that plays out long term.

Top 4 HIT Challenges and Opportunities for Healthcare Organizations in 2015 – Breakaway Thinking

Posted on January 15, 2015 | Written By

The following is a guest blog post by Mitchell Woll, Instructional Designer at The Breakaway Group (A Xerox Company). Check out all of the blog posts in the Breakaway Thinking series.
Mitchell Woll - The Breakaway Group
Healthcare organizations face numerous challenges in 2015: ICD-10 implementation, HIPAA compliance, new Meaningful Use objectives, and the Office of the National Coordinator’s (ONC) interoperability roadmap. To adapt successfully, organizations must take advantage of numerous opportunities to prepare.

Healthcare leaders must thoroughly assess, prioritize, prepare, and execute in each area:

  1. Meaningful Use Stage 2 objectives require increased patient engagement and reporting for a full year before earning incentives.
  2. The ONC’s interoperability roadmap demands a new framework to achieve successful information flow between healthcare systems over the next ten years.
  3. There are 10 months left in which to prepare for the October 1 ICD-10 deadline.
  4. HIPAA compliance will be audited.

1. Meaningful Use
For those who have already implemented an EHR, Meaningful Use Stage 2 focuses new efforts on patient access to personal health data and emphasizes the exchange of health information between patients and providers. Stage 2 also imposes financial penalties for failure to meet requirements.

CMS’s latest deadline for Stage 2 extends through 2016, so healthcare organizations have additional time to fulfill Stage 2 requirements. Stage 3 requirements begin in 2017, so healthcare organizations should take the extra time to build interoperability and foster an internal culture of collaboration between providers and patients. For Stage 3, Medicare incentives will not apply in 2017 and EHR penalties will rise to 3 percent.

CMS has also proposed a 2015 EHR certification, which calls for interoperability enhancements to support transitions of care. Complying with this certification is voluntary, but it provides the opportunity to become certified for the Medicare and Medicaid EHR incentive programs at the same time.

Meaningful Use Stage 2 and the ONC roadmap require that 2015 efforts concentrate on interoperability. Healthcare organizations should prepare for health information exchange by focusing efforts on building patient portals and integrating communications by automating phone, text, and e-mail messages. After setting up successful exchange methods, healthcare organizations should train staff on how to use patient portals. The delay in Stage 2 means providers have more time to become comfortable using the technology to correspond with patients. Hospitals should also educate patients about these resources, describing the benefits of collaboration between providers and patients. Positive collaboration and successful data exchange help achieve desired health outcomes faster.

2. Interoperability
The three-year goal of the ONC’s 10-year roadmap is for providers and patients to be able to send, receive, find, and use basic health information. The six and ten-year goals then build on the initial objectives, improving interoperability into the future.

Congress has also shown initiative in promoting interoperability, asking the ONC to investigate information blocking by EHRs. Most of the ONC’s roadmap for the next three years is similar to the Meaningful Use Stage 2 goals.

Sixty-four percent of Americans do not use patient portals, so for 2015 healthcare organizations should focus on creating them, refining their workflows, and encouraging patients to use them. Additionally, 35 percent of patients said they are unaware of patient portals, while 31 percent said their physician has never mentioned them. Fifty-six percent of patients ages 55-64, and 46 percent of patients 65 and older, said they would access medical information more if it were available online. Hospitals need their own staff to use and promote patient portals in order to conquer the challenges of interoperability and Stage 2.

3. HIPAA Compliance
In 2015, the Office of the Inspector General (OIG) will audit EHR use, looking closely at HIPAA security, incentive payments, possible fraud, and contingency plan requirements. During the HIPAA compliance audits, the Office for Civil Rights (OCR) will also confirm whether hospitals’ policies and procedures meet updated security criteria. Healthcare organizations should take this opportunity to verify compliance with the 2013 HIPAA standards to prepare for upcoming audits. Many helpful resources exist, including HIPAA compliance toolkits available from several publishers; these kits include advice on privacy and security models. Healthcare organizations and leaders can also take advantage of online education or hire consultants to help review and implement the necessary measures. It’s important to act now to educate staff about personal health information security and how to remain HIPAA compliant.

4. ICD-10 Deadline
The new ICD-10 deadline comes as no surprise, given that the transition has been delayed several times. In July 2014, the US Department of Health and Human Services (HHS) implemented the most recent delay and set a new date of October 1, 2015, giving hospitals a 10-month window to prepare for the eventual ICD-10 rollout. Because healthcare organizations are more adaptable than ever, they can use their practiced flexibility and experience to meet these demands successfully.

As the Healthcare Information and Management Systems Society (HIMSS) suggests, communication, education, and testing must be part of an ICD-10 implementation plan. Informing internal staff and external partners of the transition is a crucial first step. ICD-10 should be tested internally and externally to verify the system works with the new codes before the transition. Healthcare organizations should outline and develop an ICD-10 training program by selecting a training team and assessing the populations who need ICD-10 education. They should perform a gap analysis to understand the training needed and utilize role-based training to educate the proper populations. Finally, organizations should establish the training delivery method, whether online, in the classroom, one-on-one, or some combination of these to teach different topics or levels of proficiency. In my experience at The Breakaway Group, I’ve seen that the most effective and efficient education is role-based, readily accessible, and offers learners hands-on experience performing tasks essential to their role. This type of targeted education ensures learners are proficient before the implementation. As with any go-live event, healthcare organizations must prepare and deliver the new environment, providing support throughout the event and beyond.
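As a small illustration of the internal testing described above, a basic sanity check can flag legacy ICD-9 codes that slip into an ICD-10 field before a claim goes out. The sketch below is a simplified, hypothetical format check only; it validates the general shape of an ICD-10-CM code, not whether the code exists in the official code set.

```python
import re

# Simplified structural pattern for ICD-10-CM codes: a letter (U is
# reserved), two alphanumerics, then an optional decimal point followed
# by up to four more characters. ICD-9 codes, which are mostly numeric,
# fail this check.
ICD10_PATTERN = re.compile(r"^[A-TV-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")

def looks_like_icd10(code: str) -> bool:
    """Return True if the string matches the basic ICD-10-CM shape."""
    return bool(ICD10_PATTERN.match(code.strip().upper()))
```

For example, a diabetes diagnosis coded the old way ("250.00") would be flagged, while its ICD-10 counterpart ("E11.9") would pass. A real testing effort would go further and validate codes against the published code set.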

Facing 2015
These challenges require the same preparation, willingness, and audacity needed for prior HIT successes, including EHR implementation and meeting Meaningful Use Stage 1 requirements. ICD-10, HIPAA compliance, Stage 2, and interoperability all have the element of education in common. Healthcare organizations and leaders should apply the same tenacity and discipline to inform, educate, and prepare clinicians for upcoming obligations.

Targeted role-based education will best ensure proficiency and avoid comprehensive, costly, and time-consuming system training. Through role-based education, healthcare organizations gain more knowledgeable personnel who are up to speed on new applications. These organizations probably already have at least a foundation for 2015 expectations, and they should continue to recall the strategies used for prior go-live events. What was successful? It’s important to plan to replicate successful strategies and eliminate processes that caused problems. This is a great opportunity to capitalize on these efforts for organizational improvement. Healthcare leaders must let the necessity of 2015 government requirements inspire invention and innovation, ultimately strengthening their organizations.

Xerox is a sponsor of the Breakaway Thinking series of blog posts.

De-Identification of Data in Healthcare

Posted on January 14, 2015 | Written By John Lynn

Today I had a chance to sit down with Khaled El Emam, PhD, CEO and Founder of Privacy Analytics, to talk about healthcare data and the de-identification of that healthcare data. Data is at the center of the future of healthcare IT and so I was interested to hear Khaled’s perspectives on how to manage the privacy and security of that data when you’re working with massive healthcare data sets.

Khaled and I started off the conversation talking about whether healthcare data could indeed be de-identified or not. My favorite Patient Privacy Rights advocate, Deborah C. Peel, MD, has often made the case for why supposedly de-identified healthcare data is not really private or secure since it can be re-identified. So, I posed that question to Khaled and he suggested that Dr. Peel is only telling part of the story when she references stories where healthcare data has been re-identified.

Khaled makes the argument that in all of the cases where healthcare data has been re-identified, it was because those organizations did a poor job of de-identifying the data. He acknowledges that many healthcare organizations don’t do a good job de-identifying healthcare data, so it is a major problem that Dr. Peel is right to highlight. However, just because one organization does a poor job de-identifying data doesn’t mean that proper de-identification of healthcare data should be thrown out.

This kind of reminds me of when people ask me if EHR software is secure. My answer is always that EHR software can be more secure than paper charts. However, it depends on how well the EHR vendor and the healthcare organization’s staff have implemented security procedures. When it’s done right, an EHR is very secure. When it’s done wrong, an EHR can be very insecure. Khaled is making a similar argument when it comes to de-identified health data.

Khaled did acknowledge that the risk will never be zero. However, if you de-identify healthcare data using proper techniques, the risks are small enough that they are similar to the risks we take every day with our healthcare data. I think this is an important point since the reality is that organizations are going to access and use healthcare data. That is not going to stop, and I really don’t think there’s any debate on this. Therefore, our focus should be on minimizing the risks associated with this healthcare data sharing. Plus, we should hold organizations accountable for the healthcare data sharing they’re doing.
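For readers curious what “proper techniques” can look like in practice, here’s a minimal sketch in the spirit of HIPAA’s Safe Harbor method. The field names are hypothetical, and real Safe Harbor compliance covers all 18 identifier categories plus additional rules (for example, suppressing three-digit ZIP prefixes for areas with fewer than 20,000 residents), so treat this as an illustration, not a compliance tool.

```python
from datetime import date

# Fields dropped entirely (hypothetical names for direct identifiers).
DIRECT_IDENTIFIERS = {"name", "ssn", "mrn", "phone", "email", "address"}

def deidentify(record: dict) -> dict:
    """Apply Safe Harbor-style generalization to a single record."""
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue  # direct identifiers are removed outright
        if field == "zip":
            out["zip3"] = str(value)[:3]  # keep only the 3-digit ZIP prefix
        elif field == "birth_date":
            out["birth_year"] = value.year  # generalize dates to the year
        elif field == "age":
            out["age"] = "90+" if value > 89 else value  # group ages over 89
        else:
            out[field] = value  # clinical fields pass through
    return out
```

Running this on a record like `{"name": "Jane Doe", "zip": "84101", "birth_date": date(1950, 5, 1), "age": 64, "diagnosis": "E11.9"}` keeps the diagnosis but reduces the ZIP to "841" and the birth date to 1950, which is exactly the kind of generalization that makes re-identification much harder.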

Khaled also suggested that one of the challenges the healthcare industry faces with de-identifying healthcare data is a shortage of skilled professionals who know how to do it properly. I’d suggest that many who are faced with de-identifying data have the right intent, but likely lack the skills needed to ensure that the de-identification is done properly. This isn’t a problem that will be solved easily, but it should improve as data security and privacy become more important.
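One concrete skill those professionals bring is measuring residual re-identification risk rather than guessing at it. A common yardstick is k-anonymity: every combination of quasi-identifiers (like ZIP prefix, birth year, and sex) in a released data set should be shared by at least k records. A toy check, with hypothetical field names:

```python
from collections import Counter

def smallest_group(records, quasi_identifiers):
    """Size of the rarest quasi-identifier combination in the data set."""
    counts = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(counts.values())

def satisfies_k_anonymity(records, quasi_identifiers, k=5):
    """True if every quasi-identifier combination appears at least k times."""
    return smallest_group(records, quasi_identifiers) >= k
```

If the rarest combination belongs to only one or two people, those records stand out and need further generalization or suppression before release, which is the kind of judgment call a skilled de-identification practitioner makes.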

What do you think of de-identification in healthcare? Is the way it’s being done a problem today? I see no end to the use of data in healthcare, and so we really need to make sure we’re de-identifying healthcare data properly.