
California’s Information Privacy for Connected Devices Law is a Good Start, But Doesn’t Apply to Healthcare

Posted on December 13, 2018 | Written By

The following is a guest blog post by Mike Nelson, Vice President of IoT Security, DigiCert.

As the nation’s most populous state, California often serves as an incubator for national legislative and regulatory policy, and it’s great to see them take a leadership position in IoT cybersecurity. The announcement of California’s ‘IoT Cybersecurity Law’ is a move in the right direction. The new law will require manufacturers of connected devices to produce them with “reasonable” security features.

However, this law specifically excludes healthcare IoT devices. It states that a covered entity, provider of healthcare, business associate, healthcare service plan, contractor, employer, or any other person subject to HIPAA or the Confidentiality of Medical Information Act shall not be subject to this title with respect to any activity regulated by those acts.

While HIPAA has made great strides to help protect the privacy of personal health information, it does very little to protect the many connected medical devices that are in use today. California lawmakers missed an opportunity to drive strong IoT security requirements that protect consumers and the data they want kept confidential.

Additionally, this law will not solve the majority of cybersecurity issues being found in IoT devices. For example, the law requires good password practices, including the elimination of hard-coded passwords. While this is a security best practice and is important for user authentication, it doesn't cover the many back-end connections that also need to be authenticated, such as over-the-air updates. Asking for "reasonable" security features simply isn't directional enough. It misses an opportunity to drive requirements around essential cybersecurity practices, like encryption of sensitive data, risk assessments, authenticating all connections to a device, and digitally signing code to ensure integrity.

A general rule of cybersecurity and connectivity is that whenever something becomes connected, it will eventually get hacked. The risks inherent in connected devices are real – especially in healthcare, where in many cases people rely on these devices to sustain life. The risks of connectivity are diverse, from interception and manipulation of sensitive data to embedded malware that causes a device to malfunction and harm a patient. These risks can impact not only patients but also the device manufacturers themselves.

St. Jude Medical, now Abbott Laboratories, learned this the hard way. A hacking organization publicized a vulnerability in a cardiac device after taking a short position in the company's stock. When the vulnerability was released, the company's stock dropped significantly, causing financial and reputational damage to St. Jude. Considering all these risks, and the many others I haven't mentioned, it becomes clear that simply putting good password protections in place isn't enough. More direction is needed. While it may sound like I'm advocating for stronger regulation, I'm not. I believe industries do much better when they come together and collaboratively develop best practices that are broadly adopted. Regulators can only do so much. Real solutions require in-depth knowledge of healthcare practices and of what the market can bear – something only companies and practitioners can tackle effectively – but the private sector needs to do more.

We need to begin looking at security more broadly than just hardcoded passwords. As a healthcare industry, we need to practice robust penetration testing and work to develop risk assessments on all connected medical devices. We need to make the encryption of sensitive data, both at rest and in transit, standard practice. No medical device should accept an unauthenticated message. No code or package should be executed on a device that is absent a digital signature verifying trust. Driving requirements around these types of best practices would have a much greater effect on the security of connected devices than the new California law currently does.
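To make these practices more concrete, here is a minimal sketch of the kind of code signing described above. It uses Python's `cryptography` package and an Ed25519 key pair; the update-handling function and the key management are simplified assumptions for illustration, not a description of any particular device vendor's process.

```python
# Illustrative sketch: verify an update package's signature before executing it.
# Assumes the `cryptography` package is installed; key handling is simplified.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Manufacturer side: sign the update bytes with a private key kept off-device.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

update_package = b"...firmware image bytes..."
signature = private_key.sign(update_package)

# Device side: refuse to execute any package whose signature does not verify.
def apply_update(package: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, package)
    except InvalidSignature:
        return False  # reject unsigned or tampered code
    # ...install the package only after verification succeeds...
    return True

assert apply_update(update_package, signature)
assert not apply_update(update_package + b"tampered", signature)
```

The same pattern – verify a signature or an authenticated token before acting – applies equally to over-the-air update channels and other back-end connections mentioned earlier.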

Though the IoT Cybersecurity Law is primitive in its protections and lacks the detail needed to require security measures that would truly move the needle, at least California is trying to do something – absent the development of industry standards by collaborative groups. As the first of its kind at the state level, the effort should be applauded: California is recognizing the need for manufacturers to address cybersecurity in the manufacturing process for connected devices. Time will tell whether manufacturers will take responsibility and the initiative for security themselves, before further regulation requires them to act.

5 Ways in which Big Data is Advancing Telemedicine

Posted on December 12, 2018 | Written By

The following is a guest blog post by Rahul Varshneya, Co-Founder and President of Arkenea and Benchpoint.

The healthcare industry is rapidly incorporating changes in technology. There is a gradual shift from the service-based model of healthcare, which focused primarily on curing ailments, to a more holistic, outcome-based approach that not only explores different treatment modalities but also aims to understand and eliminate the causative factors behind various ailments.

The volume of health data being generated, both structured and unstructured, has increased significantly. The complexity of this data requires big data analytics to process it and draw relevant, applicable inferences. Telemedicine is also harnessing the power of big data to improve existing healthcare services.

The telemedicine market is expected to reach $41.2 billion by 2021, and big data analytics is going to play a major role in this surge.

Here are the ways in which Big Data is contributing to the advancement of Telemedicine.

1. Patient Health Tracking and Predictive Analytics

The biggest benefit of applying big data in telemedicine is the identification of potential health problems before they develop into more serious conditions. This has become a reality with the advent of the Internet of Medical Things (IoMT) in the form of wearable fitness trackers and other wearable health monitors that collect patient data on a real-time basis.

Applying big data analytics techniques to this data ensures that patients' vitals and statistics are constantly monitored. Telemedicine facilitates regular interactions with healthcare professionals without having to visit the doctor's office. It also ensures that physicians are constantly updated about the patient's health status, resulting in early detection of any anomaly.

The historical data collected is used for predictive analytics of possible future outcomes. Creating risk scores from data gathered across multiple sources helps identify individuals at elevated risk of developing chronic ailments early in disease progression.
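To illustrate what a simple risk score might look like, here is a minimal, hypothetical sketch in Python. The vitals, thresholds, and weights are invented for illustration only and are not clinical guidance.

```python
# Hypothetical risk-score sketch: weights and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class VitalsSample:
    resting_heart_rate: float   # beats per minute, from a wearable
    systolic_bp: float          # mmHg, from a connected cuff
    nightly_sleep_hours: float

def risk_score(sample: VitalsSample) -> float:
    """Return a 0-1 score; higher suggests the patient may warrant earlier follow-up."""
    score = 0.0
    if sample.resting_heart_rate > 90:
        score += 0.4
    if sample.systolic_bp > 140:
        score += 0.4
    if sample.nightly_sleep_hours < 6:
        score += 0.2
    return min(score, 1.0)

readings = [VitalsSample(95, 150, 5.5), VitalsSample(62, 118, 7.5)]
flagged = [r for r in readings if risk_score(r) >= 0.5]
print(f"{len(flagged)} of {len(readings)} patients flagged for follow-up")
```

A production system would of course learn such weights from historical outcomes rather than hard-code them, but the flow – aggregate data from several sources, score it, flag elevated-risk patients – is the same.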

2. Remote Patient Monitoring and Post Discharge Prophylaxis

Post-discharge monitoring of patients and appointments with physicians via telemedicine saves unnecessary visits to the doctor's clinic. This is also a boon for elderly and debilitated patients who cannot make frequent trips to the hospital for regular checkups. Vital patient stats like blood pressure and heart rate are collected by health devices with advanced sensors.

The data collected is processed using analytics techniques to compute the effective dosage of medication to be administered and to help the physician decide on the course of treatment to prescribe.

Clinicians are able to use numerous healthcare apps to remotely monitor a patient's condition and be on the lookout for signs of disease progression. This helps keep patients out of hospitals, ensures that healthcare providers' efforts are focused on caring for patients in critical condition, and keeps the cost of healthcare relatively low by avoiding unnecessary hospitalization.

3. Accurate Diagnosis and Precision Medicine

Historically, the diagnostic process relied solely on patients relaying their symptoms to the doctor and the doctor noticing the clinical signs of disease. The tests ordered further confirmed the doctor's diagnosis, and a treatment plan was prescribed. Now, instead of subjective symptoms reported by the patient, doctors can base their diagnosis on the patient data collected regularly by wearable devices. Furthermore, the benefit of telemedicine is that the doctor and the patient don't even need to be in the same geographical location for the diagnosis to take place!

Application of Big data in Telemedicine not only results in a more accurate diagnosis, but it also is a giant leap from traditional generic medicine into the realm of precision medicine curated specifically for each individual. The data collected from patients’ wearable devices, healthcare based apps, patients’ electronic health records, and genomics data can be tapped into for developing a medication that caters to patients individually.

Precision medicine takes into account the variation in lifestyles, genetic makeup, and environmental conditions for each individual. Big data makes it possible to compute the relevant data collected from various sources and helps the healthcare professionals come up with a treatment plan specific to each individual.

4. Cloud Computing and Specialist Outreach

The sheer volume of health data generated has led to the storage of patients' EHRs and EMRs in the cloud. The benefit of telemedicine is that patient data can be accessed remotely and treatment can be prescribed irrespective of the geographical location of the patient and the healthcare provider. This is a great advantage in the case of a referral to a specialist who may be in a different location than the patient.

Secure access to the cloud ensures that physical location is no longer a variable in availing the best treatment possible. It also benefits healthcare providers, as it allows for better scheduling of the doctor's time, increasing the effectiveness of care. Cloud storage is a precursor to the emergence of big data and acts as its facilitator.

5. Predicting Infection Trends and Timely Interventions

Applying deep learning algorithms across healthcare-related data can be instrumental in predicting infectious diseases and studying the patterns and trends of infection spread. The importance of data-based infectious disease surveillance studies has been recognized by a number of researchers across the world. These studies are important for supplementing existing systems and designing newer models of disease progression.

Big data in the form of Internet search queries is also being utilized to understand disease trends and predict the spread of infectious diseases. Once the regions affected by an infection are identified, the benefits of telemedicine come to light. Physician interactions with the affected populations and deployment of treatment to infected patients through tools like teleconferencing result in timely intervention and prevent the further spread of infection.

Conclusion

Big data analytics gives physicians access to massive volumes of information, which increases diagnostic accuracy and makes healthcare delivery more efficient. Combining the power of telehealth with big data has the potential to transform the healthcare delivery system and is of immense benefit to both patients and healthcare providers.

Data security and privacy concerns are the biggest threats to these advancements. Appropriate security measures need to be enforced so that the vast reservoir of healthcare data can be harnessed to its full potential.

About Rahul Varshneya
Rahul Varshneya is the co-founder and President of Arkenea and Benchpoint. Rahul has been featured as a business technology thought leader in numerous media channels such as Bloomberg TV, Forbes, HuffPost, Inc, among others.

Battling the Barriers in EP/Cath Labs

Posted on December 5, 2018 | Written By

The following is a guest blog post by Tom Downes, CEO of Quail Digital.

Clear and unambiguous communication between team members is an essential component of any surgical environment. It’s particularly important – and indeed particularly challenging – in cath labs and electrophysiology (EP) labs where physicians and clinical staff in interventional cardiovascular and other minimally-invasive therapeutics are typically spread across multiple rooms and physically separated by lead-lined doors.

But as patient demand continues to grow rapidly, the inherent complexities of the surgical environment present significant communication challenges between surgeons, clinicians, and nurses. These constraints create great stress for the whole operating team as it strives to continue delivering proficient patient care.

Creating Clear Communication

Stress amongst hospital staff is not just a recognised problem, it’s an escalating one. A study evaluating burnout among surgeons has found that 80% of surgeons agree burnout and stress are issues they should be monitored for. In light of this, it’s clear that maintaining the well-being of healthcare professionals is a challenge, and one that needs to be addressed quickly.

Previous studies have revealed that a number of potential stressors can compromise performance in the OR, including team interaction and extreme noise. It is therefore clear that communication problems are one of the main barriers that need to be broken down in order to achieve a tranquil, organized environment that will alleviate pressure in the operating room. The chatter of workmates, the hum of the air conditioning, and the relentless drone of essential technology combine to create a high-stress clinical environment where the multidisciplinary team's need for serenity is commonly confounded by practical necessities it cannot change.

Implementing clear and immediate communication will be a positive step towards reducing the complexities of the clinical space. Failure to do this risks squandering the undoubted benefits of surgical innovation; the patient implications of an avoidable clinical error due to miscommunication could, in the worst extremes, be catastrophic. Fortunately, communication technology has evolved to present a simple, affordable solution.

Adopting a Wireless Approach

Traditionally, facilities have adopted primitive measures to deliver communication between the OR and the monitoring suite, including basic hand gestures and microphones in each room. But this approach comes with challenges: the likelihood of mishearing or misreading a fellow surgeon, or an instruction, naturally increases, which can delay the procedure and cause frustration. Another thing to consider is that the whole medical team will be wearing masks, making it difficult to hear and see dated visual and auditory cues.

By adopting wireless headset technology, physicians can transform the OR, EP and cath lab experience, as well as the working environment for the whole team involved in the procedure. The technology, which operates on high quality digital frequencies and is encrypted to avoid interference from other devices or emissions in the OR, enables multidisciplinary teams to collaborate and communicate – hands-free – in the interventional OR or hybrid suite, at monitoring stations, through adjacent control rooms and ancillary areas. A lack of clarity can create stress and blame amongst the operating team, but with the ability to hear instructions clearly in every clinical environment this ambiguity can be avoided. Additionally, the pressures placed upon surgeons will be drastically reduced as they have the confidence of knowing that every member of staff is able to perform their role in a more assured manner.

And as it has been suggested that high-quality teamwork among operating room professionals is key to efficient and safe practice, implementing a system that enables better communication between staff will be extremely beneficial to the clinical environment. Creating a more attentive, focused team will also be vital to reducing significant stress levels and enabling better workflow. Dr. Husam H. Balkhy, Associate Professor and Director of the Robotic and Minimally Invasive Cardiac Surgery program at the University of Chicago Medicine, has first-hand experience of using wireless headsets in a surgical setting. He comments, “My ability to communicate quickly and effectively with other members of the robotic team including the table-side first assistant, the anaesthesiologist, the perfusionist and nursing staff, has led to increased efficacy and patient safety in these complex procedures.”

Balkhy isn’t the only one to have benefitted from these tools. Dr. Ziv Tsafrir, a Fellow in Minimally Invasive Gynecology at Henry Ford, adds: “Using wireless headsets during robotic procedures certainly contributed to better patient outcomes by creating a calmer environment for clinicians and staff.”

The Next Steps

As patient demand grows, and the global use of EP and robotics surgery increases, wireless headset technology will be an essential companion to ensure optimal, efficacious and cost-effective communications. Combine this communication tool with the below practices and clinicians will be able to further enhance the surgical environment to not only create more effective workflows and treatment, but to increase positive patient outcomes.

  • Ensuring the surgical team have a focused team discussion prior to surgery to assign roles, establish expectations and anticipate outcomes, will enable each member of the team to be prepared for any scenario that may play out. This will be beneficial to the patient’s experience and will reduce the level of stress to a minimum.
  • Whilst a briefing before the operation is an extremely important part of the medical process, a debriefing post-op is just as vital. This discussion gives the whole team the opportunity to explore the problems that occurred during the procedure and how these can be overcome before the next operation.
  • Good communication is also vital outside the cath / EP lab and amongst the rest of the hospital staff. Lack of clarity about responsibility for care and decision-making is a major contributor to medical errors and could have an extremely negative impact on the operating room.
  • In a medical setting, the person who is supposed to act on information isn’t always clearly identified. Therefore, team members should communicate clearly, both at the beginning and throughout the operation, who this person is.

By working together and communicating clearly to one another before, during and after the procedure,  the stress levels of the entire surgical team and the patient can be significantly reduced.

About Tom Downes
Tom Downes founded Quail Digital in 1995 to design headset systems for ‘team’ communication. The philosophy being that the easier and more freely a team can speak with each other in the workplace, the better their outcomes, wellbeing and productivity. Quail Digital designs and manufactures systems for the healthcare, retail and hospitality sectors, and has offices in Dallas, TX and London UK. Quail Digital is the leading provider of communications systems in the OR, and a sponsor of Healthcare Scene.

Balancing Simplicity With the Exploding Challenges of Medical Device Security

Posted on December 3, 2018 | Written By

The following is a guest post by Gus Malezis, President and CEO of Imprivata.

The digitization of healthcare has allowed healthcare organizations to utilize robust technology such as network-connected medical devices to help improve both patient care and provider experience across the entire care continuum. Within this Internet of Medical Things (IoMT), medical devices can track and monitor patient stats, provide diagnostic information, help ensure lifesaving care delivery, and even make recommendations on treatment and clinical decision support – all while communicating directly with healthcare IT systems to ensure more complete and accurate patient medical records.

With these benefits of digitally connected medical devices, however, we now must consider and address a series of issues that are introduced with network connectivity and automated data integration; issues that relate to patient health and safety, cybersecurity, and compliance.

Simply put, advanced network-connected technology opens these devices to the risk of exploitation and compromised patient safety from both internal and external threats. Whether it’s an uninformed patient making changes to an unlocked infusion pump, someone stealing valuable protected health information (PHI) stored on an unattended device, or a cybercriminal using a network-connected medical device to gain backdoor access to a hospital’s entire network or disable the devices’ function in order to extort a ransom, medical devices are now a source of risk for both healthcare organizations and patients. Compounding this issue is the fact that medical devices frequently run outdated operating systems and applications, all of which are difficult, or even impossible, to patch or otherwise protect with other standard security measures.

By 2020, the number of IoT devices is expected to reach 20.4 billion, and the number of IoMT devices is expected to reach 161 million. These numbers of additional networked devices are truly staggering, and they proportionally increase the risks around hacking, compliance, and health and safety. Clearly, healthcare IT can no longer afford to manage medical devices under current security protocols.

How locking down affects provider workflow

To address this threat and mitigate the risk posed by IoMT devices, organizations naturally look to implement security systems and tools that will safeguard the devices, enable only authorized personnel to interact and adjust/calibrate the devices, and safeguard access to patient records, clinical applications, and other sensitive data. Before implementing such solutions, however, healthcare organizations should consider several factors – particularly those relating to workflow.

Unlike other industries, healthcare can’t simply lock down information by building multi-layer security. Additionally, the focus is always on patient care, so minutes…and even seconds…truly matter, and clinicians need fast, unimpeded access to patient information. Layering in cumbersome security protocols has the potential to introduce new workflows, or create barriers to care. It is therefore critical that healthcare systems designers and architects consider several key factors when evaluating security options.

For starters, think about workflow integration: Any security tool should allow for optimal workflow efficiency among users, which means clinical staff and providers should not need to be “trained” on something new or adopt a new workflow. Ideally, this means finding flexible and easy-to-use security tools that fit current workflows and preferences. Choosing easy-to-use options allows security to be transparent so providers can focus on patient care, not on technology. For example, clinicians are accustomed to tap-in and tap-out (TITO) technology as a means of accessing Windows-based HIT systems. This same workflow should be integrated and facilitated in anything new, thereby enabling secure and compliant access through a current, well-known, and widely adopted workflow. This is a win-win-win: the clinical staff win by using the same workflow, while IT, Cybersecurity, and Compliance teams also achieve their goals.

Another key factor is extensibility to other workflows: The need for security stretches across a number of different business and clinical workflows and applications. Healthcare organizations should look into a solution that provides the extensibility to meet all workflow needs, with the same consistent and transparent workflow model.

Addressing this challenge requires fast, efficient, and secure authentication for all devices that require security, including medical devices. For medical devices already requiring user authentication, appropriate security tools can improve efficiency by replacing the cumbersome manual entry of usernames and passwords with fast, automated authentication through the simple tap of a badge. Here we want to leverage the same consistent and transparent workflow model.
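As a rough illustration of how a badge tap can stand in for typed credentials while still producing an auditable trail, the sketch below maps a badge ID to a device session and records each authentication event. It is a simplified, hypothetical flow – not Imprivata's implementation or any specific vendor's API – and the badge IDs, user directory, and log structure are invented.

```python
# Hypothetical sketch of tap-in/tap-out (TITO) style authentication on a shared device.
import time

BADGE_DIRECTORY = {"04A1-BEEF": "dr.smith", "09C2-CAFE": "nurse.jones"}
audit_log = []          # in practice this would be a tamper-evident, centralized store
active_session = None   # at most one clinical session per workstation or device

def tap_badge(badge_id: str, device_id: str):
    """Tap in to open a session, or tap out to close the current one."""
    global active_session
    user = BADGE_DIRECTORY.get(badge_id)
    if user is None:
        audit_log.append((time.time(), device_id, badge_id, "DENIED"))
        return None
    if active_session and active_session["user"] == user:
        audit_log.append((time.time(), device_id, user, "TAP_OUT"))
        active_session = None
        return None
    active_session = {"user": user, "device": device_id}
    audit_log.append((time.time(), device_id, user, "TAP_IN"))
    return active_session

tap_badge("04A1-BEEF", "infusion-pump-12")   # clinician taps in
tap_badge("04A1-BEEF", "infusion-pump-12")   # same badge taps out
print(audit_log)
```

The point of the sketch is the workflow, not the mechanism: the clinician performs the same familiar tap, while the system quietly builds the auditable chain of trust described below.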

This way, organizations can optimize their use of interconnected medical devices to improve the delivery of care. They also maintain security and meet regulatory compliance requirements while ensuring efficiency for providers and giving them more time to focus on patient care.

Focusing on physical security and ID/Access control can enable the right balance — something that’s uniquely necessary in healthcare. A healthcare organization’s medical device access security plan should be part of a comprehensive identity and multifactor authentication platform for fast, secure authentication workflows across the healthcare enterprise. The medical device piece should combine security and convenience by enabling fast, secure authentication across enterprise workflows while creating a secure, auditable chain of trust wherever, whenever, and however users interact with patient records and other sensitive data.

As organizations are tuning in to the unique challenges of the IoMT era, it’s time to implement foundational security best practices with modalities that are tailored specifically to clinical workflows. Doing so achieves the balance necessary to ensure both security and flexibility.

About Gus Malezis
Gus Malezis is the President and Chief Executive Officer of Imprivata. Gus is widely recognized as a visionary leader in the information technology security industry where he brings more than 30 years of experience driving innovation and growth while building market leading organizations. Prior to joining Imprivata, Gus was most recently the President of Tripwire, a leading global provider of endpoint detection and response, security and compliance solutions. In his career, Gus has built a strong track record of delivering growth and innovation for leading technology and security companies such as Tripwire, McAfee, and 3Com.

Combatting Communication Problems in Community Healthcare Clinics

Posted on November 7, 2018 | Written By

The following is a guest blog post by Tom Downes, CEO of Quail Digital.

The notion of a community healthcare clinic is constantly evolving from the traditional model of a local clinic staffed by general practitioners and nurses, serving mainly rural populations. There is now a renewed interest in these organisations and their potential to deliver a more integrated care service within the community. However, in order to successfully make this transition, there is a need to better equip these clinics with the tools to ensure they’re able to cope with the extra demand and the ever-evolving medical treatments that are being practised.

With an estimated 33 million people visiting community healthcare clinics each year, these organisations are an essential part of the healthcare system. Whilst they are investing vital time into evolving their structure and delivering a focused range of medical services, without the right technology in place staff productivity will suffer, hindering their ability to make the most out of not only the current resources available, but any new, innovative resources they decide to invest in.

A collaborative approach

To foster a more productive, collaborative environment, communication should be implemented across the entire team. From diagnostics to preventive treatment, clinical procedures and rehabilitation, delivering a diverse set of services can create a stressful environment if the team, from receptionists to clinicians, is wasting valuable time trying, without success, to communicate. As services expand, enabling staff to speak easily with one another to get answers to questions, locate the right individual, and better manage the flow of patients through the appointments process has become even more important.

Community healthcare clinics traditionally rely on telephones to communicate internally, but calls can often go unanswered. In addition, a telephone typically connects only two people at a time, restricting the ability to send messages, updates, and instructions to the whole team. Naturally, therefore, the likelihood of missing key information or mishearing a colleague is increased, creating unnecessary stress and delays.

This dated communication tool also cannot accommodate the growing numbers of staff working in these clinics. As nearly 62 percent of all community healthcare clinics are in an urban setting, they provide services for extremely dense populations and therefore require more staff to accommodate demand. Add the intense competition these urban clinics face from the multiple clinics and medical centres serving the same geography, and the need for a better communication tool that helps them provide a positive experience becomes even more important.

Clear Communication

Providing clear, discrete communication to all members at reception and in the clinics will have an extremely positive impact on the running of the community healthcare clinic. Lightweight headset technology will help the team working in these clinics to reduce unwanted hold-ups, improve workflow and offer a much improved experience for each of those patients who walk through the door. And with the ability to coordinate easily with one another, the team can become more productive and efficient to ensure they’re prepared for the demands felt by this expanding healthcare system.

Critically, in this most challenging of jobs, adopting a headset system that operates on a single channel will ensure all members of staff are in permanent communication. This way, doctors, nurses or receptionists are able to approach their colleagues who are working in another part of the clinic with any urgent query or question they may have. This immediate and non-obtrusive communication method is particularly important during times of expansion and innovation, as every team member will be learning and adopting new methods and structures.

Conclusion

Community healthcare clinics are evolving and there is now a growing need to implement digital solutions to provide staff with the ability to hear everything clearly, at all times. There are also other daily practices that can help facilitate a more tranquil environment. Along with headset technology, eliminating unnecessary, frantic noise across the clinic will drastically reduce the distractions all doctors, nurses and receptionists have to face. Not only will this have a positive impact on stress-levels, but it will also make it a lot easier to communicate effectively amongst the team. Daily team meetings are also vital for every member of staff in a community healthcare clinic. With a better understanding of everyone’s workload for that day the team will have greater visibility of who is available to assist with other tasks and enquiries.

By implementing communication tools and ensuring greater visibility across the team, clinical operational efficiency will increase while staff stress levels are reduced and wellbeing improves.

About Tom Downes
Tom Downes founded Quail Digital in 1995 to design headset systems for ‘team’ communication. The philosophy being that the easier and more freely a team can speak with each other in the workplace, the better their outcomes, wellbeing and productivity. Quail Digital designs and manufactures systems for the healthcare, retail and hospitality sectors, and has offices in Dallas, TX and London UK. Quail Digital is the leading provider of communications systems in the OR, and a sponsor of Healthcare Scene.

Decommissioning Legacy EHRs

Posted on November 5, 2018 | Written By

The following is a guest blog post by Sudhakar Mohanraj, Founder and CEO, Triyam.

Every product has a lifecycle. The lifecycle of Electronic Health Record (EHR) software begins when it is implemented at your facility and ends when it’s no longer in use. When a facility decides to move to a new EHR, it’s natural to focus planning around the new software system.  However, not considering the legacy EHR can leave you wondering what should happen to all of the historical patient financial and medical data. You have many choices. This article will discuss some of the challenges and options that will influence your cost, legal compliance, and stakeholder satisfaction.

Three common mistakes to avoid when moving to a new EHR

  1. Hanging on to the legacy EHR

Some say: “we will worry about shutting down the old system later after the new EHR is up and going.” Taking that path is risky and expensive.

Consider the cost. Until you get all your historical data off the legacy system, you need to pay vendors licensing and support fees. You may infrequently be using the old system, which makes these fees particularly unwarranted.  In addition, you continue to pay your employees to operate and maintain the old system.

To learn more about retiring legacy EHRs, register for this free live webinar. Industry experts will share key lessons and best practices on data management strategies for EHR system replacements. You can also get answers to your questions about any specific requirements.

Some say, “I will stop paying my old vendor. I don’t need any more updates or support.” However, sooner or later, hardware and software will break or become incompatible with newer technology. Older systems are an easy target for hackers and thieves.

Over time, your employees will forget passwords and how to navigate the old system, or they will leave for other jobs. Then, when you, a patient, or your boss needs some report from the old system, you are caught short. Data retained on an old, unsupported, infrequently used system is at ever-greater risk of being lost, stolen, corrupted, or inaccessible to newer technology.

Bottom line: keeping an old, infrequently used system will needlessly eat up your time and money.

  2. Migrating all historical data from the legacy system to the new EHR

Some facilities are surprised to learn that the new EHR vendor will not convert all the historical data to the new computer system.

The new system is organized differently than the legacy system with different data elements and structures. There is never a one-to-one match on data mapping between the old and new systems.

It is difficult to validate the accuracy and completeness of data you want to import from the old system. The new EHR vendor doesn’t want to risk starting with an inaccurate database.

This is a golden opportunity to start with a clean slate. For example, you can take this time to reorganize, re-categorize, and re-word codes and tables. Now is the time to set up master files properly and to make the system more efficient.

The new EHR vendor will lobby for you to start with a clean slate and populate the new database with only current patients, current balances, and current information.

  3. Ignoring Legal Compliance Requirements

Federal and state laws require healthcare facilities to retain medical and financial reports for 5 to 15 years and make these reports available to patients and others upon request. Keeping these records will help to avoid penalties, fines, and loss of certifications. Consult your compliance office, accountant, and HIPAA director to know Federal, IRS, and state-specific requirements.

Use this Data retention tool to find the retention requirements for your state.

Why data archival is an excellent choice

What are the best practices to deal with historical data? Data from the old system needs to be organized in a safe, secure place so that the information can be found and made readily available to those who need it in a timely fashion. In other words, it needs to be archived.

An archive is a separate system from your new EHR. It contains all your historical data and reports. When users sign into the archive program, depending on their user rights, they may see all or some of the historical reports. The most common functions of the archive system include:

  • Search and query clinical and financial data for “Continuity of Care.”
  • Download, view, print, and share reports for “Release of Information.”

Archival is a new concept. KLAS Research is creating a new product category for it. Listen to this on-demand webinar from the head of EHR archive studies at KLAS Research.

In the archive, you can see all patients and their previous charts, medications, treatments, billings, insurance claims, payments, and more.  You will also see the historical vendor, employee, and accounting records.

What type of data goes to the archive? All sorts. You can retain discrete or non-discrete data, structured data (like SQL, XML, or CCDA), or unstructured data that is logically grouped and presented in a human-readable form, like PDF reports, Excel spreadsheets, CCD, JPEG, or MP3 files.
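One simple way to keep such heterogeneous legacy files findable, whatever their format, is to store them alongside a metadata manifest that the archive's search function can index. The sketch below builds a JSON manifest in Python; the field names, folder layout, and record types are assumptions for illustration, not a description of any particular archival product.

```python
# Illustrative sketch: index exported legacy files (PDF, CSV, CCDA XML, images, audio)
# into a JSON manifest so an archive viewer can search by patient and date.
import json
import hashlib
from pathlib import Path
from datetime import date

EXPORT_DIR = Path("legacy_export")        # hypothetical folder of exported files
MANIFEST = Path("archive_manifest.json")

def checksum(path: Path) -> str:
    """SHA-256 of the file, so later audits can show the record was not altered."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def build_manifest(records):
    """records: iterable of (patient_id, record_type, record_date, file path)."""
    entries = []
    for patient_id, record_type, record_date, path in records:
        entries.append({
            "patient_id": patient_id,
            "type": record_type,              # e.g. "lab_report_pdf", "ccda_xml"
            "date": record_date.isoformat(),
            "file": str(path),
            "sha256": checksum(path),
        })
    MANIFEST.write_text(json.dumps(entries, indent=2))

# Example usage with one exported file (assumes the file exists on disk):
# build_manifest([("MRN-1001", "lab_report_pdf", date(2015, 3, 2),
#                  EXPORT_DIR / "MRN-1001" / "labs_2015-03-02.pdf")])
```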

Mergers and data consolidation

Archival is essential even when there isn’t a transition to a new EHR. During a merger, the new entity frequently wants to consolidate patient financial and clinical data from multiple legacy systems into a common platform. Data archiving may be the best solution for dealing with multiple EMRs/EHRs. Archival is less expensive than complex conversion and transformation efforts. Besides lower costs, it allows users to run research on consolidated data using business intelligence and analytics tools on one common, unified database.

Outsourcing and vendor selection

Outsourcing has become an increasingly popular option for archival solutions for three reasons – cost, experience, and convenience. IT managers are already stretched to limits of time, resources, and budget.  Outside vendors can save the day by offering services for less cost.

When searching for an archival vendor, consider the following:

  • Experience in extracting data from your legacy systems which are no longer supported
  • Complete turnkey solutions – planning, pilot testing, data conversion, user acceptance, and decommissioning
  • Archival product features and ease of use
  • Great customer references
  • The cost of archiving should only be a fraction of the cost of retaining the legacy system

The number one failure when implementing a new EHR is procrastinating on archiving legacy data. Hopefully, you can use a few of these ideas to maximize the benefits of your historical data, minimize costs, and best serve your user constituents.

About Triyam
Triyam delivers expert solutions in EMR / EHR Data Conversion and Archival.

Triyam’s data conversion services help hospitals and clinics to freely migrate from one EHR vendor to another without losing any historical patient data. Triyam’s EHR archival product, Fovea is a vendor neutral, innovative and intuitive platform to store all your legacy data. Fovea includes a powerful search engine and extensive reporting for Business Intelligence and Analytics. Triyam is a proud sponsor of Healthcare Scene.

How to Build an Effective Rural Virtual Care and Telehealth Strategy

Posted on October 10, 2018 | Written By

The following is a guest blog post by Lee Horner, CEO of Synzi.

Rural healthcare organizations are increasingly interested in implementing virtual care and telehealth solutions in order to better meet the needs of their facilities, staff, and patient population. In danger of closing their doors, rural hospitals are struggling to survive and thrive in a healthcare environment with razor-thin margins.

iVantage’s 2017 Rural Relevance Study reports that 41 percent of rural hospitals operate at a negative margin. Poor financial performance is impacting these hospitals’ ability to keep their doors open and serve rural communities. In fact, the National Rural Health Association (NRHA) reported that the number of rural hospital closures has risen to 87 in the last 8 years.

A rural hospital closure has significant impact to its community. These facilities provide fundamental healthcare services to nearly 57 million people across the country and are often an integral part of the local economy, providing jobs and a tax base for the community. John Henderson, CEO of the Texas Organization of Rural and Community Hospitals (TORCH) stated that hospitals are a critical element of a town’s survival: “Hospitals, schools, churches. It’s the three-legged stool. If one of those falls down, you don’t have a town.”

Virtual care technology can be a viable delivery option for healthcare facilities and residents in rural communities. To best build an effective virtual care strategy, rural healthcare organizations should short-list solutions which solve for limited bandwidth in rural areas, patient preference for mobile devices and communications, an organization’s current infrastructure and workflow, and security concerns.

Addressing Bandwidth Issues: Rural healthcare organizations may initially think that limited Wi-Fi and broadband availability will restrict telehealth adoption by a facility, a medical practice and/or the patients themselves. However, rural healthcare organizations can identify and implement solutions which work across any level of connectivity (whether cellular or Wi-Fi) to ensure that the providers and the patients can use the solution without issues. Various entities are actively pushing for continued investment in our nation’s broadband infrastructure and rural communities are a priority for future build-out.

Reflecting Patient Preferences: Patients are already using many devices – including smartphones, tablets, and/or computers – which also provide them with more convenient access to healthcare without requiring significant travel time and costs. Moving forward, rural healthcare organizations should prioritize solutions which are device-agnostic and should also ensure their patient communications work across any type of modality. Providers and patients already own many of these devices; a flexible virtual care platform will help organizations and individuals reap more benefits out of the investments they have already made in technology.

Optimizing Current Workflows: Healthcare organizations have ongoing clinical workflows, and may be wary of technology’s role in automating these processes. However, rural healthcare organizations’ existing workflows can be optimized by using a virtual care platform which ensures that the virtual care protocols are consistent with in-person protocols in terms of engaging at-home patients and/or reaching offsite specialists for a needed consult. The ideal solution should be intuitive and easy to use; providers will then be able to quickly incorporate virtual care into their practices.

Addressing Security Concerns: When exploring new technology, most healthcare organizations will initially question if a net-new solution meets safety and privacy standards. Rural healthcare organizations should prioritize solutions which are HIPAA-compliant and HITRUST-certified to ensure security, privacy and compliance. Although rural health providers will immediately understand the need to adopt a virtual care platform, IT departments and champions will also need to realize that the adoption of this new technology will benefit providers, patients, and ultimately, the sustainability of the healthcare organization. Virtual care technology is essential to rural healthcare as it helps close the time and distance gap in terms of providing patients with the care they need, when they need it – regardless of where the patients or the providers are located.

The rural population has noted gaps in both access and quality. An estimated one in five Americans live and work in rural areas across the nation, yet, there are 2,157 Health Professional Shortage Areas in rural areas compared to 910 in urban areas. Moreover, the Rural Health Information Hub reports that 19.5 percent of rural adults describe their health status as fair/poor vs. 15.6 percent of their urban counterparts. Virtual care technology can help address the gap in care by providing access to additional physicians and needed specialists at the click of a button. By leveraging external and/or associated hospitals and physician groups, rural hospitals strengthen their care within the vast populations and geographies they support.

Top 5 Ways Healthcare Applications Slow Down and What To Do About It

Posted on October 4, 2018 | Written By

The following is a guest blog post by Jeff Garbus and  Alvin Chang from Soaring Eagle Consulting.

We spend a lot of our lives tuning applications that people complain are too slow. In no particular order, here are some of our findings.

Poor indexing #1 – Unused and missing indexes can cause problems

While I’ve said, “in no particular order,” I do have to say this one is usually first. When applications go through QA and stress testing, there is often a lot more horsepower than there is data. As a result, the memory and CPU combination masks the otherwise bad performance. Once the application hits production, larger volumes of data are not managed as effectively.

On the plus side, you can almost always add an index (or indexes) without causing other application side effects.

Warning: Do NOT automatically add indexes as recommended by a DBMS’ tuning advisor; they often miss opportunities, and also often significantly over index by recommending multiple similar indexes rather than one enveloping one.

Be wary of over-indexing, as too many indexes can also create overhead that will cause processes to slow.
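To see how a single well-chosen index changes a query plan, here is a small illustration using SQLite through Python's standard sqlite3 module. SQLite is only a stand-in – production DBMSs have their own plan tools (EXPLAIN, showplan output, and so on) – and the table and column names are invented.

```python
# Illustrative only: compare query plans before and after adding an index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE encounters (id INTEGER PRIMARY KEY, patient_id INTEGER, visit_date TEXT)"
)
conn.executemany(
    "INSERT INTO encounters (patient_id, visit_date) VALUES (?, ?)",
    [(i % 500, f"2018-01-{(i % 28) + 1:02d}") for i in range(10_000)],
)

query = "SELECT id, visit_date FROM encounters WHERE patient_id = 42"

def show_plan(label: str) -> None:
    plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    print(label, [row[-1] for row in plan])

show_plan("before index:")   # full table scan of encounters
conn.execute("CREATE INDEX ix_encounters_patient ON encounters (patient_id)")
show_plan("after index:")    # search using ix_encounters_patient
```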

Bad queries #2 – Too much data returned by a query

Sometimes you are simply bringing too much data back from the database to the front end. I saw a search recently that brought about a half million rows of data back to the end user. I asked, “What is the user going to do with that much data?” Answer: “They are going to look at the first few rows and refine the search.”

This unnecessarily stresses the disk, CPU, memory, and network.

Easiest solution: Bring back only the data the user is going to work with. Perhaps the first few hundred rows. Save time, disk resources, and network resources.
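A common way to bring back only the data the user will actually work with is to paginate: fetch one page of rows at a time instead of the full result set. The sketch below shows keyset pagination with SQLite; the table, column names, and page size are illustrative assumptions.

```python
# Illustrative pagination sketch: fetch one page of results rather than the full set.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, last_name TEXT)")
conn.executemany("INSERT INTO patients (last_name) VALUES (?)",
                 [(f"name{i:06d}",) for i in range(100_000)])

PAGE_SIZE = 200

def search_page(name_prefix: str, after_id: int = 0):
    """Keyset pagination: return the next PAGE_SIZE matches after `after_id`."""
    return conn.execute(
        "SELECT id, last_name FROM patients "
        "WHERE last_name LIKE ? AND id > ? "
        "ORDER BY id LIMIT ?",
        (name_prefix + "%", after_id, PAGE_SIZE),
    ).fetchall()

first_page = search_page("name0")
print(len(first_page), "rows returned instead of the full result set")
# The UI asks for the next page only if the user actually scrolls or refines:
next_page = search_page("name0", after_id=first_page[-1][0])
```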

Bad queries #3 – Overuse of temporary tables

Many applications use temporary tables incorrectly or are wasteful with them. For example, they are used

  • When the programmer wants to avoid joins (which the server is very good at!);
  • When they are filled with lots of data and then rows are deleted (why load them in the first place?);
  • When too many columns are selected (why select * when the columns aren’t being used?) – this increases network bandwidth and makes the table unnecessarily big;
  • When temp tables are joined to each other – without indexes, this is very costly.

Avoid temporary tables
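As a small illustration of letting the server do the join instead of staging data in temporary tables, the sketch below contrasts the two patterns. SQLite and the schema are stand-ins for whatever DBMS and tables you actually run.

```python
# Illustrative: prefer a single server-side join over staging data in temp tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE visits (id INTEGER PRIMARY KEY, patient_id INTEGER, visit_date TEXT);
    INSERT INTO patients VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO visits VALUES (10, 1, '2018-09-01'), (11, 1, '2018-10-03'), (12, 2, '2018-10-05');
""")

# Wasteful pattern: copy a wide slice of data into a temp table, then filter and join later.
conn.execute("CREATE TEMP TABLE tmp_visits AS SELECT * FROM visits")   # extra I/O, no indexes

# Preferred pattern: one join that returns only the rows and columns actually needed.
rows = conn.execute("""
    SELECT p.name, v.visit_date
    FROM patients AS p
    JOIN visits AS v ON v.patient_id = p.id
    WHERE v.visit_date >= '2018-10-01'
""").fetchall()
print(rows)
```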

Bad Queries #4 – Attempting to do it all in one Giant Query. 

Sometimes the opposite can also be true. When attempting to write a query for a process, developers can get stuck in the mindset that a single query must handle all possible conditions. This leads to large, complicated queries that, in addition to being difficult to decipher, can generate excessive numbers of worktables as the server places large subsets of data into them.

Large Reports #5 – Combining reporting and transactional activity

It is very common to allow reporting off highly transactional databases. The problem is that reporting creates shared locks on resources, and transactions cannot modify the data while the locks are held. In addition, reports are often ad hoc, so the load on the server is unpredictable.

Easy solution: replicate production data to a reporting server. If replication or other high availability is unavailable, use dump/load to keep day-old data for reporting purposes (this is often sufficient).

Allow direct downloads of data

Some companies allow “super users” (also sometimes called “analysts”) to download production data, real time, to applications like Microsoft Access. In addition to being a likely security violation, this also creates blocking issues for the online users.

Solution: Data replication, as above.

If you’d like to learn more about how to improve slow applications, sign up for our webinar “Are your Servers, Apps, and EHR systems ready for a spike in website traffic?”

About Jeff Garbus and Alvin Chang
Jeff Garbus founded Soaring Eagle Consulting 20 years ago, and Alvin has been his right hand for almost 30 years now. Together they have authored or coauthored 20 books and dozens of articles on database management. Soaring Eagle Consulting is an onshore, HIPAA and PCI compliant remote database management company that is available for projects and consulting work on architecture, performance and tuning, scalability, application development, migrations, and 24×7 full operational support. Do your DBAs need a best friend? Jeff, Alvin, and the onshore guru-level database team are here to help you!

Soaring Eagle is a proud sponsor of Healthcare Scene.

HIPAA Breach Investigations – What You Should Know

Posted on September 5, 2018 | Written By

The following is a guest blog post by Moazzam Adnan Raja, Vice President of Marketing at Atlantic.Net.

Correctly handling a HIPAA breach recovery will benefit from a well-prepared and systematic approach. Investigation is one of a few key elements to consider, alongside speed, notification, and risk assessment. The specific issue of time deserves closer examination, as does the incorporation of risk management and auditing processes.

4 pillars of HIPAA breach response

Here are four key elements or pillars of a strong HIPAA breach response, a framework provided by Brach Eichler healthcare attorney Lani M. Dornfeld, that can be helpful in guiding your own response, as well as setting expectations with your healthcare hosts and other business associates:

Speed – Moving rapidly in response to a breach is fundamental to limiting the damage. Put together an investigation and response team, which should include the HIPAA security officer and HIPAA privacy officer, along with an attorney as necessary. You may want to include your attorneys as a standard practice, along with members of a HIPAA compliance committee, if your organization is larger and requires more sophisticated oversight. The board of trustees and board of directors could also be included.

Investigation – The way that an investigation is conducted will depend heavily on the nature and scope of the breach. There is, of course, the issue of responsibility to patients but also liability to the organization. For the latter, Dornfeld noted, “If cloaking the investigation in the attorney-client privilege will be to your strategic advantage, then you will need to be counseled about how to manage the flow of information to maintain the privilege.” Breaches often occur because of internal errors by your staff, such as disclosure without proper authorization (e.g., telling a friend confidential patient information) or accidental disclosure to the incorrect party (e.g., sending a letter to the wrong address). Incredibly, insiders are responsible for more than half (58%) of healthcare breaches impacting electronic protected health information (ePHI), per a study released in March by Verizon. When breaches occur due to the insider threat, at the minimum, you want to conduct private interviews with relevant parties, with another person there to assist in asking questions and determining perceived honesty. Beyond what you are able to glean from interviews, it will also help to get any supporting evidence – which may include copies of social media posts, letters, or emails, as well as information from the data system. (Related to investigation, see the discussion of time below.)

Notification – Letting all pertinent parties know about healthcare data breaches is critical. Notification should occur quickly and always within 60 days of breach discovery (unless advised by law enforcement that notification would problematize its own investigations), per the Breach Notification Rule. When you notify patients or others that ePHI has been exposed, your communications should be clearly worded. They should mention the specific data involved (such as lab results or Social Security numbers) and the steps the company is taking toward investigation and mitigation. It should also let the patient know what protective steps they can personally take, along with how to get further details or ask questions.

Risk assessment – After the investigation is finished, you and the legal team can use the insight from it, along with whatever you have already done toward mitigation, to conduct a HIPAA-compliant risk assessment. The risk analysis parameters from the HHS explain that a full assessment should be conducted related to any threats to the availability, integrity, and confidentiality of health data. The HHS notes that the risk analysis is an important basis of information since it can be used to guide what is considered a “reasonable and appropriate” step (the determining factor for a HIPAA-compliant approach). While HIPAA is flexible on many parameters, it mandates that risk assessments be performed routinely (related to all ePHI systems), when contracting with new business associates (related to that specific information), and when security incidents occur (related to that specific information). Any access to or use of ePHI that is not permitted by the Privacy Rule’s subpart E is presumed to be a breach – unless your risk assessment can show that there is, in fact, a low probability that the data was compromised. (Related to risk assessment, see the section on risk management and audits below.)

The specific issue of time

Time should be central to investigations, as indicated by Mayer Brown healthcare attorney Laura Hammargren. There is disagreement over whether the moment of discovery of a breach should be considered the moment when you identify a potential breach or the moment when you have finished assessing the situation and understand what occurred.

While there may still be some debate related to discovery, the law is clear at least on the parameter of 24 hours. Discovery of a breach of ePHI occurs “as of the first day on which the breach is known to the organization, or, if exercising reasonable diligence would have been known to the organization,” noted Dornfeld.

Security events are common in which it is unclear whether data was compromised or not. It can take a significant amount of time to confirm whether a breach occurred, and exactly how it might have occurred. Some means of attack are incredibly complex. Attackers may make it extraordinarily challenging to track their moves – in which case it can be a painstaking task to find out what data they may have accessed and removed.

Another concern of a HIPAA breach investigation is figuring out the length of time the intruder had access, which can have a huge influence on the breadth of the breach.

Risk management & audits

The risk assessment is part of the larger picture of risk management. When you are approaching a healthcare data breach investigation, you will benefit from comprehensive risk management and auditing processes. Through these safeguards, you will be much better prepared to send out notifications promptly, as well as to give clear information to police and other law enforcement officials.

Risk management is simplified when you have strong business associate agreements (BAAs), through which your standards can extend to third parties. By working with an established, next-generation, HIPAA-compliant cloud storage provider, you will have peace of mind that risks are properly controlled, backed by third-party certifications and audits.

Atlantic.Net is a proud sponsor of EMRandHIPAA.com. Atlantic.Net provides HIPAA compliant hosting, backed by a 100% uptime guarantee.

About Moazzam Adnan Raja
Moazzam Adnan Raja has been the Vice President of Marketing at Atlantic.Net for 14 years. During Raja’s tenure, the Orlando-based, privately held hosting company has grown from a primarily regional presence to national and international recognition. In collaboration with a skilled and dedicated team, Raja has successfully led a full spectrum of marketing campaigns, handled PR work with major news outlets, and formed key strategic alliances.

A Caregiver’s Perspective on Patient Engagement

Posted on August 20, 2018 I Written By

The following is a guest blog post by Michael Archuleta, Founder and CEO of ArcSYS, where he shares his experience as a caregiver for his father trying to navigate the healthcare system.

My dad is 99 years old. After moving him to Utah and into a retirement home 6 months ago, our first step was to get an appointment with a new primary care physician. I brought along a list of his medications and watched the nurse tediously look up and enter each one into the EHR. Dad and the doctor got along great on that first visit. She assured us that she could help manage his medications. Realistically, there was nothing that could be done to really improve his quality of life. When you’re 99, you’re stuck.

Around the middle of March Dad noticed blood and clots in his urine. Off to the primary care provider we went. They took a sample of urine, tested it, and there was no sign of an infection. Maybe we should look up a specialist in urology. A referral was given and a few days later the urology practice contacted us to make an appointment. Dad declined.

He didn’t want to see another doctor. Period. But day by day, the blood was always present in the urine. He started to worry and finally relented to going to the urologist. Off to the new doctor. Oh, yes, I brought along the list of medications and watched another nurse go through the process of keying them in.

The next day, I got an email via Updox saying there was a message from Dad’s doctor. Updox?? Really?? That was pretty cool. Having been on the front end, where our EMR system (Red Planet) uploads everything, it was interesting to see how another EMR system was employing Updox. Sure enough, there was the urologist’s note, completed 3 hours after the appointment. But as I read it, I couldn’t help feeling a little disappointed. A boilerplate. Since I had been in the room, I knew what was asked. Some questions were never asked and were obviously inferred. Maybe a minor point, but I knew it. Anyway, the recommendation was to get an ultrasound. Off to another provider!

Within one day another message alert came from Updox. On logging into the Updox account, there was the report from radiology. Good news, nothing out of the ordinary.

A week passed and it was back to the urologist for a cystoscopy. I was in the room with Dad while the doctor performed the procedure. “Want to see this tumor?” the doctor asked me. “Sure,” I replied. Through the scope I could see a dark mass on the wall of the bladder. The recommendation was to perform surgery to remove the mass and biopsy it.

Another alert came through within a day via Updox. Still the same boilerplate style with default answers. Oh well, if nothing else it was timely.

On May 21 the procedure was done at an outpatient surgical facility. This time I was lucky: no one had to enter the list of medications. From here, unfortunately, things started to go downhill. Dad was left with a catheter and a bag, which became his (our) buddy for 10 days. The unfortunate part was that he was confined to his room. He could (would) not walk to the dining room at the retirement facility for his meals. So the meals were brought to him each day in a white clam-shell styrofoam container. One piece of good news was delivered via Updox: the biopsy was benign.

Once the catheter was removed, he could be mobile, but he was too weak to walk. He languished in his room. I coaxed him to try walking. No result. Others in my family encouraged him, with the same non-result. I finally took him back to the primary care doctor. One look at him, and she noticed that the spark of life had been extinguished. She took me aside and asked if she needed to play hardball with him. “You bet” was my response. In a firm way she told Dad that if he didn’t start walking he was going to be dead in 3 months.

That was the trick. Dad was furious that a doctor would be so “unprofessional” as to say anything like that. As soon as we arrived at the retirement home he pushed his walker halfway down the hallway just to prove he could walk just fine, thank you. (Mission accomplished.)

But when you’re 99, the body just doesn’t really get better. There were still blood and clots, but we were told that was to be expected. A couple of weeks later he called me to say he was in excruciating pain and couldn’t pee. By the time I arrived the pain was so bad I needed to call the paramedics. They showed up in 5 minutes and whisked him to the ER.

Fortunately, the ER had his list of medications, so I was spared having to go through that process. The doctor on call briefly examined him and turned control over to the nurse. A few hours later we had our “friends” the catheter and bag back and headed home. At least he was committed to walking to the dining room.

A couple of weeks passed and I received a phone call from the paramedics, who informed me that Dad had had a fall on his way to breakfast. They were transporting him to the ER. He was diagnosed at the ER with a bladder infection, and they were concerned about his cardiac function. Lab results also indicated E. coli and sepsis. Since they didn’t have an on-site cardiologist, he was transferred to another hospital and admitted. And, yes, we had to go through the whole list of medications there because they didn’t have access to that information. Go figure.

He hated the hospital. There was no rest. Every hour someone was taking vitals, getting him up, doing this, doing that. He was desperate for sleep and rest. At discharge, the cardiologist gave me explicit verbal instructions to take him off his Furosemide. She also gave orders for home nursing and physical therapy.

Whew. He was back home but again too weak to walk to the dining room. The Updox report came through, and the written instructions from the cardiologist told him to continue all meds, including Furosemide. Really? Did she forget what she told me? Did she not take her own notes? The nurse showed up at his apartment, took lots of notes, asked lots of questions, and examined him. Hmm. She was concerned about the swelling in his feet and ankles. It was bad. We conferred and decided the Furosemide needed to be restarted. The nurse reached out to the PCP, who concurred.

Over the next 3 weeks the swelling slowly receded. The nurse and physical therapist helped him but the improvement was ever so slow.

What I experienced was a medical world of silos. Each healthcare provider focused on just what they do. The urologist was pleased with the surgery and how well it turned out. But he didn’t have to deal with 3 months of bags, styrofoam meals, ER visits, depression, and hospitalizations. None of the doctors conferred with each other about the best treatment. The number of times I filled out past medical histories was finger-numbing. The written documentation didn’t accurately match what took place or what was verbally instructed. The cardiologist was adamant about the meds that would be best for his heart. Within each silo the people were very kind, compassionate, caring, and professional. But the EHR systems just seemed to get in the way of real care. Yes, INDIVIDUALLY, everything was working, but PEOPLE and their SYSTEMS were not interacting to solve the problem.

On the upside, thanks to his Medicare Advantage plan, not one out-of-pocket penny was spent. Insurance and billing performed flawlessly. A little over $65,000 was billed and $12,000 was paid.

Clearly, providing health care is not easy. Maybe things should have been done differently. This was a relatively simple issue, but there was no clear direction. Will any healthcare administrator ever be aware of this situation? Probably not. Will any insurance company ever study this case? Doubtful. In hindsight, it would have been just as easy for me to pass out copies of medications and histories and have people tape them to the wall. A few phone calls between providers would certainly have come up with a better solution. But here we are down the road and Dad is not a happy camper.

Is anybody listening?