
A HIPAA Life Sentence… and SO Many Lessons

Posted on November 15, 2018 | Written By

Mike Semel is a noted thought leader, speaker, blogger, and best-selling author of HOW TO AVOID HIPAA HEADACHES . He is the President and Chief Security Officer of Semel Consulting, focused on HIPAA and other compliance requirements; cyber security; and Business Continuity planning. Mike is a Certified Business Continuity Professional through the Disaster Recovery Institute, a Certified HIPAA Professional, Certified Security Compliance Specialist, and Certified Health IT Specialist. He has owned or managed technology companies for over 30 years; served as Chief Information Officer (CIO) for a hospital and a K-12 school district; and managed operations at an online backup company.

In 2012 Accretive Health Care was banned from doing business in Minnesota for 2 – 6 years for a HIPAA violation.

In 2018 New York State suspended a nurse’s license for a year for a HIPAA violation.

But, a life sentence?

The New Jersey Attorney General announced a $200,000 HIPAA and consumer fraud penalty against an out-of-business Georgia medical transcription company. In 2016, ATA Consulting LLC d/b/a Best Medical Transcription breached the records of over 1,650 people treated by three New Jersey healthcare providers by publicly exposing their medical records to the Internet. And its customer, Virtua Health, paid a $418,000 settlement for violations of both HIPAA and the New Jersey Consumer Fraud Act.

Tushar Mathur, owner of Best Medical Transcription, agreed to a permanent ban on managing or owning a business in New Jersey.

Wow.

A life sentence for a HIPAA violation.

And the medical clinic paying a $418,000 penalty for the actions of its vendor.

By a state, not the federal government.

What can you learn from this?

1. It’s shocking how many servers have been misconfigured, or how much protected data has been stored on web servers, exposing patient records to the Internet. Several HIPAA penalties have been issued for exactly this kind of exposure.

LESSONS –

  • Have your servers installed by a certified professional using a detailed checklist to ensure that no data is exposed to the Internet.
  • Make sure your organization has enough data breach insurance to cover millions of dollars in penalties; that you live up to all the requirements of your policy; and that you consistently implement the security controls you said you have in place on your insurance application.
  • Make sure your outsourced IT provider has enough Errors & Omissions insurance to cover your penalties.
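
A first-pass exposure check can even be scripted. Here is a minimal sketch (an illustration, not a substitute for a professional audit or a certified installer's checklist) that probes a few ports that should never answer from the public Internet; the host name is a placeholder for one of your own public-facing servers.

```python
import socket

# Services that commonly expose data when left open to the Internet:
RISKY_PORTS = {3389: "RDP", 445: "SMB", 1433: "SQL Server",
               3306: "MySQL", 27017: "MongoDB"}

def publicly_open_ports(host, timeout=2.0):
    """Return the risky ports on `host` that accept a TCP connection."""
    open_ports = []
    for port, name in RISKY_PORTS.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append((port, name))
        except OSError:
            pass  # closed, filtered, or unreachable -- all fine here
    return open_ports

if __name__ == "__main__":
    # Run this from OUTSIDE your network against your own public hosts
    # only; "127.0.0.1" here just demonstrates the call shape.
    for port, name in publicly_open_ports("127.0.0.1", timeout=0.5):
        print(f"WARNING: {name} (port {port}) is reachable")
```

Anything this prints deserves an immediate conversation with whoever installed the server.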

2. Many doctors and business owners tell me that “the federal government will never get them” or that they are “too small to be of interest” to federal regulators.

LESSONS –

  • Regulators go after small businesses, which doesn’t always make headlines. The Federal Trade Commission forced a 20-employee medical lab to go out of business. The business owner fought the FTC and ultimately won in court, but his business was gone.
  • Don’t ignore the risk that your state Attorney General (who probably wants to be governor) will seek headlines by protecting consumers. The HITECH Act (2009) gave state Attorneys General the authority to enforce HIPAA. Violations can also be tied to consumer protection laws, not just HIPAA.
  • Lawyers are representing patients whose information was released without authorization. Patients have successfully sued doctors for HIPAA violations.
  • Doctors shouldn’t laugh off HIPAA or just complain (INCORRECTLY) that it interferes with patient care. A doctor went to jail for a HIPAA violation.

3. HIPAA is only one regulation with which you must comply.

LESSONS –

  • Don’t think that a ‘We Make HIPAA Easy’ web-based solution is enough to protect your assets from all your regulatory challenges.
  • Don’t think that a self-conducted Security Risk Analysis is a substitute for a professionally-designed HIPAA compliance program that will meet all the federal and state requirements you must follow.
  • Don’t think that an IT Security company doing a vulnerability or penetration test is a substitute for a HIPAA Security Risk Analysis or a robust compliance program.
  • Every state now has data breach laws the state Attorneys General love to enforce. These consumer protection laws protect Personally Identifiable Information (PII) held by medical practices. State laws have different requirements than HIPAA. For example, HIPAA requires that patients be notified no later than 60 days after a data breach. California requires just 15 days.
  • Because of the opioid crisis, many types of medical practices are now offering substance abuse treatment, which requires additional confidentiality measures. So do HIV, mental health, and STD treatments. You need to address all the regulations that apply to you.
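
Because the strictest applicable deadline controls, it helps to compute it explicitly. This sketch uses only the two windows cited above (HIPAA's 60 days and the 15 days this post attributes to California); verify current statutes with counsel before relying on any of these numbers.

```python
from datetime import date, timedelta

# Notification windows in days, as cited in this post -- confirm the
# current statutory deadlines for your jurisdictions with counsel.
NOTIFICATION_WINDOWS = {"HIPAA (federal)": 60, "California": 15}

def earliest_deadline(breach_discovered: date, jurisdictions):
    """Return (rule, deadline) for the strictest applicable rule."""
    applicable = {name: NOTIFICATION_WINDOWS[name] for name in jurisdictions}
    rule = min(applicable, key=applicable.get)
    return rule, breach_discovered + timedelta(days=applicable[rule])

rule, deadline = earliest_deadline(date(2018, 11, 1),
                                   ["HIPAA (federal)", "California"])
print(rule, deadline)  # California 2018-11-16
```

The point is not the arithmetic but the habit: for every breach, the clock that matters is the shortest one, not HIPAA's.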

4. Don’t blindly trust your vendors.

LESSONS –

  • Signing a Business Associate Agreement (BAA) isn’t evidence that your vendor really complies with HIPAA. According to the NJ Attorney General, Best Medical Transcription signed a BAA with Virtua Health but:
  • Failed to conduct an accurate and thorough risk assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of ePHI it held;
  • Failed to implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level to comply with the Security Rule;
  • Failed to implement policies and procedures to protect ePHI from improper alteration or destruction;
  • Failed to notify VMG of the breach of unsecured PHI; and
  • Improperly used and/or disclosed ePHI in contravention of its obligations under its Business Associate Agreement with VMG.

Make sure your vendors understand their HIPAA obligations. Even after five years, my experience is that many Business Associates have failed to keep up with the changes required by the 2013 HIPAA Omnibus Final Rule. Many talk about HIPAA in their sales and marketing but do not comply.

Remember that you are responsible for the actions of your vendors.

WHEN YOU ARE LYING AWAKE TONIGHT, ASK YOURSELF:

  • Are you really sure you can survive an investigation by your state attorney general?
  • Are you really sure your Business Associate vendors have conducted a HIPAA risk analysis; have implemented HIPAA security measures; have implemented HIPAA policies and procedures, are really protecting your PHI, and will notify you if there is a breach?
  • Are you willing to bet $418,000 (what Virtua paid) on it?
  • If you are a Business Associate, what do you think it will feel like if you are banned for life from doing business?

Doctors send patients to specialists all the time. Whether you are a medical provider or a vendor, do you have the trained and certified specialists you need to help with all your regulatory challenges? Does your team need expert help to validate what you and your vendors are doing and to address any gaps?

Don’t risk your assets. Don’t risk a life sentence.


Regulatory Heat: Is Your BAA House in Order?

Posted on August 9, 2018 | Written By

The following is a guest blog post by Greg Waldstreicher, Founder and CEO of PHIflow.

Actions by the Office for Civil Rights (OCR) have clearly demonstrated stricter enforcement of HIPAA rules in recent years, specifically upping the ante on compliance with business associate agreements (BAAs). Much of this activity can be attributed to a grim outlook on security risks: globally, 70% of healthcare organizations have suffered a data breach, and a recent Ponemon Institute report found that the vast majority have experienced multiple security incidents involving protected health information (PHI).

BAAs play an important role in security as the framework by which an organization ensures that any vendor creating, receiving, maintaining or transmitting PHI complies with HIPAA. In recent years, these contracts have come under increased scrutiny amid high-level audits launched by OCR. Mismanagement of BAAs has thus far resulted in penalties ranging from $31,000 for simply not having a BAA in place to upwards of $5.5 million for more serious offenses.

While the stakes are high, healthcare organizations often lack effective oversight strategies for these important patient protection tools. In fact, it’s not uncommon for even the most basic information to elude the executive suite, such as:

  • the number of BAAs that exist across an enterprise
  • where BAAs are located
  • the terms of each BAA
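
Even a minimal inventory answers those three questions. The sketch below is illustrative only (the vendors, fields, and file paths are invented, not drawn from any real BAA program), but it shows how little structure is needed to know how many agreements exist, where they live, and what each one says.

```python
from dataclasses import dataclass, field

@dataclass
class BAA:
    vendor: str
    department: str      # who owns the relationship
    location: str        # file path, contract-system ID, etc.
    key_terms: dict = field(default_factory=dict)  # e.g. breach-notice window

# Entries are hypothetical examples.
inventory = [
    BAA("Acme Transcription", "HIM", "contracts/acme-2016.pdf",
        {"breach_notice_days": 30}),
    BAA("Cloud Backup Co.", "IT", "contracts/backup-2017.pdf",
        {"breach_notice_days": 10}),
]

print(len(inventory))                              # how many BAAs exist
print({b.vendor: b.location for b in inventory})   # where each one is
```

At real-world scale (thousands of agreements across departments), the same structure is what an automated system populates for you.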

In an industry that has witnessed a significant uptick in security incidents and breaches in recent years, this current state of affairs is less than optimal. In truth, the reach of recent audit activity is still an unknown as the healthcare industry awaits full disclosure and recommendations from OCR. One of the latest OCR settlements—$3.5 million levied against Fresenius Medical Care North America—resulted from multiple incidents that occurred in 2012, underscoring the lengthy timeframe associated with finalizing investigations and legal processes.

All told, current trends point to the need for better oversight and management of BAAs. While penalty activity subsided some in recent months as OCR went through internal transitions, industry legal experts expect that investigative momentum will continue to increase in proportion to heightened security risks across the healthcare landscape.

Unfortunately, healthcare organizations face notable roadblocks to getting their BAA house in order. Amid competing priorities, many simply lack the resources for tracking these agreements. Health systems are increasingly multi-faceted, and current trends associated with mergers, acquisitions and consolidations only exacerbate the challenge. The reality is that some large organizations have as many as 10,000 BAAs across the enterprise. Because these agreements are typically spread across multiple departments and facilities and have a multitude of different owners, managing them in a strategic way via manual processes is nearly impossible.

In tandem with the internal resource challenge, the language contained in BAAs has become significantly more complicated due to not only a fluid and evolving regulatory environment, but also the vital role they play in an overall security strategy. While a simple, cookie-cutter approach to these agreements was fitting a decade ago, BAAs are now intensely negotiated between covered entities and business associates and between business associates and sub-business associates, often involving HIPAA attorneys and resulting in requirements that go beyond HIPAA and HITECH. Subsequently, the terms of each BAA across an organization may vary, making efficient and effective management extremely difficult.

The good news is that there is a relatively simple solution—automated management of BAAs. The right technological framework can lay the foundation for timely access to all contracts across an enterprise, improving compliance and ensuring readiness for audits or breach response. Once consolidated, artificial intelligence can then be applied to BAAs to draw actionable insights in near real-time, informing key personnel of the key terms across all agreements.
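
Production systems apply trained NLP models to this problem, but the goal can be sketched with simple pattern matching. The clause patterns below are illustrative assumptions, not how PHIflow or any real product works:

```python
import re

# Illustrative clause patterns -- real extraction uses NLP, not regexes.
NOTICE_RE = re.compile(r"notify .{0,40}?within (\d+) days", re.I)
SUBCONTRACTOR_RE = re.compile(r"subcontractor", re.I)
ENCRYPTION_RE = re.compile(r"encrypt", re.I)

def extract_key_terms(baa_text: str) -> dict:
    """Surface a few key terms from one BAA's text."""
    terms = {}
    m = NOTICE_RE.search(baa_text)
    if m:
        terms["breach_notice_days"] = int(m.group(1))
    terms["mentions_subcontractors"] = bool(SUBCONTRACTOR_RE.search(baa_text))
    terms["mentions_encryption"] = bool(ENCRYPTION_RE.search(baa_text))
    return terms

sample = ("Business Associate shall notify Covered Entity within 30 days "
          "of discovering a breach, and shall encrypt ePHI at rest.")
print(extract_key_terms(sample))
# {'breach_notice_days': 30, 'mentions_subcontractors': False,
#  'mentions_encryption': True}
```

Run across a consolidated repository, even this crude pass tells key personnel which agreements carry a 10-day notice clause versus a 60-day one.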

The healthcare industry at large has drawn heavily on the promise of automation and data analytics in recent years to power more efficient and effective processes. Management of BAAs is no different and is an area ripe for improvement. Today’s healthcare executives need to consider the high stakes associated with ineffective management of BAAs and take action to shore up strategies amid greater security risks and a challenging regulatory environment.

About Greg Waldstreicher
Greg Waldstreicher is the founder and CEO of PHIflow, and the cofounder and former CEO of DoseSpot, where he worked at the forefront of the electronic prescribing (e-Prescribing) market for nine years. Under Greg’s leadership, DoseSpot licensed its SaaS e-Prescribing solutions to 175 healthcare software companies across the medical, dental, hospice and digital health markets. Greg received a B.S. from the University of Maryland College Park in Accounting and an M.S. from Northeastern University in Technological Entrepreneurship.

How the Young Unity Health Score Company Handles The Dilemmas of Health IT Adoption

Posted on June 25, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

I have been talking to a young company called Unity Health Score with big plans for improving the collection and sharing of data on patients. Their 55-page business plan covers the recruitment of individuals to share health data, the storage of that data, and services to researchers, clinicians, and insurers. Along the way, Unity Health Score tussles with many problems presented by patient data.
The goals articulated for this company by founder Austin Jones include getting better data to researchers and insurers so they can reduce costs and find cures, improving communications and thus care coordination among clinicians and patients, and putting patients in control of their health data so they can decide where it goes. The multi-faceted business plan covers:

  • Getting permission from patients to store data in a cloud service maintained by Unity Health Score
  • Running data by the patients’ doctors to ensure accuracy
  • Giving patients control over what researchers or other data users receive their data, in exchange for monetary rewards
  • Earning revenue for the company and the patients by selling data to researchers and insurers
  • Helping insurers adjust their plans based on analysis of incoming data

The data collected is not limited to payment data or even clinical data, but could include a grab-bag of personal data, such as financial and lifestyle information. All this might yield health benefits through analytics–after all, the strategy of applying powerful modern deep learning is being pursued by many other health care entities. At the same time, Jones plans to ensure much higher quality data than traditional data brokers such as Acxiom provide.

Now let’s see what Unity Health Score has to overcome to meet its goals. These challenges are by no means unique to these energetic entrepreneurs–they define the barriers faced by institutions throughout health care, from the smallest start-up to the Centers for Medicare & Medicaid Services.

Outreach to achieve a critical mass of patients
We can talk for weeks about quality of care and modernizing cures, but everybody who works in medicine agrees that the key problem we face is indifference. Most people don’t want to think too much about their health, are apathetic when presented with options, and stubbornly resist the simplest interventions–even taking their prescribed medication. So explaining the long-term benefits of uploading data and approving its use will be an uphill journey.

Many app developers seek adoption by major institutions, such as large insurers, hospital conglomerates, and HMOs like Kaiser. This is the smoothest path toward adoption by large numbers of consumers, and Unity Health Score includes a similar plan in its business model. According to Jones, they will require the insurance company to reduce premiums based on each patient’s health score. In return, insurers should be able to use the data collected to save money.

Protecting patient data
Health data is probably the most sensitive information most of us produce over our lifetimes. Financial information is important to keep safe, but you can change your bank account or credit card if your financial information is leaked–you can’t change your medical history. Security and privacy guarantees are therefore crucial for patient records. Indeed, the Unity Health Score business plan cites fears of privacy as a key risk.

Although some researchers have tried distributed patient records, stored in some repository chosen by each individual, Unity Health Score opts for central storage, like most current personal health records. This not only requires great care to secure, but places on them the burden of persuading patients that the data really will be used only for purposes chosen by the patients. Too many apps and institutions play three-card Monte with privacy policies, slipping in unauthorized uses (just think back to the recent Facebook/Cambridge Analytica scandal), so Internet users have become hypervigilant.

Unity Health Score also has to sign up physicians to check data for accuracy. This, of course, should be the priority for any data entered into any medical record. Because doctors’ time is going more and more toward the frustrating task of data entry, the company offers an enticing trade-off: the patient takes the time to enter their data, and the doctor merely verifies its accuracy. Furthermore, a consolidated medical record online can be used to speed check-in times on visits and to make data sharing on mobile devices easier.

Making the data useful
Once the patients and clinicians join Unity Health Score, the company has to follow through on its promise. This is a challenge with multiple stages.

First, much of the data will be in unstructured doctors’ notes. Jones plans to use OCR (optical character recognition), like many other health data aggregators, to extract useful information from the notes. OCR and natural language processing may indeed be more accurate than relying on doctors to meticulously fill out dozens of structured fields in a database. But there is always room for missed diagnoses or allergies, and even for misinterpretations.

Next, data sources must be harmonized. They are likely to use different units and different lexicons. Although many parts of the medical industry are trying to standardize their codings, progress is incomplete.

The notion of a single number defining one’s health is appealing, but it might be too crude for many uses. Whether you’re making actuarial predictions (when will the individual die, or have to stop working?), estimating future health care costs, or guessing where to allocate public health resources, details about conditions may be more important than an all-encompassing number. However, many purchasers of the Unity Health Score information may still find the simplicity of a single integer useful.

Making the service attractive to data purchasers
The business plan points out that most research depends on large data sets. During the company’s ramp-up phase–which could take years–they just won’t have enough patients suffering from a particular condition to interest many researchers, such as pharma companies looking for subjects. However, the company can start by selling data to academic researchers, who often can accomplish a lot with a relatively small sample. Biotech, pharma, and agencies can sign up later.

Clinicians may warm to the service much more quickly. They will appreciate having easy access to patient data for emergency room visits and care coordination in general. However, this is a very common use case for patient data, and one where many competing services are vying for a business niche.

Aligning goals of stakeholders
In some ways I have saved the hardest dilemma for last. Unity Health Score is trying to tie together many sets of stakeholders–patients, doctors, marketers, researchers, insurers–and between many of these stakeholders there are irreconcilable conflicts.

For instance, insurers will want the health score to adjust their clients’ payments, charging more for sick people. This will be feared and resented by people with pre-existing conditions, who will therefore withhold their information. In some cases, such insurer practices will worsen existing disparities for the poor and underprivileged. The Unity Health Score business plan rejects redlining, but there may be subtler practices that many observers would consider unethical. Sometimes, incentives can also be counterproductive.

Also, as the business plan points out, many companies that currently purchase health data have goals that run counter to good health: they want to sell doctors or patients products that don’t actually help, and that run up health care costs. Some purchasers are even data thieves. Unity Health Score has a superior business model here to other data brokers, because it lets the patients approve each distribution of their data. But doing so greatly narrows the range of purchasers. Hopefully, there will be enough ethical health data users to support Unity Health Score!

This is an intriguing company with a sophisticated strategy–but one with obstacles to overcome. We can all learn from the challenges they face, because many others who want to succeed in the field of health care reform will come up against those challenges.

MD Anderson Fined $4.3 Million For HIPAA Violations

Posted on June 21, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

An administrative law judge has ruled that MD Anderson Cancer Center must pay $4.3 million to the HHS Office for Civil Rights due to multiple HIPAA violations. This is the fourth-largest penalty ever awarded to OCR.

OCR kicked off an investigation of MD Anderson in the wake of three separate data breach reports in 2012 and 2013. One of the breaches sprung from the theft of an unencrypted laptop from the home of an MD Anderson employee. The other two involved the loss of unencrypted USB thumb drives which held protected health information on over 33,500 patients.

Maybe — just maybe — MD Anderson could’ve gotten away with this or paid a much smaller fine. But given the circumstances, it was not going to get away that easily.

OCR found that while the organization had written encryption policies going back to 2006, it wasn’t following them that closely. What’s more, MD Anderson’s own risk analyses had found that a lack of device-level encryption could threaten the security of ePHI.

Adding insult to injury, MD Anderson didn’t begin to adopt enterprise-wide security technology until 2011. Also, it didn’t take action to encrypt data on its devices containing ePHI during the period between March 2011 and January 2013.
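
The missing control was mundane: encrypt the data before it ever touches a portable device. As a hedged sketch (using the third-party `cryptography` package; this is not MD Anderson's actual tooling, and real key management is the hard part), field-level encryption looks like this:

```python
from cryptography.fernet import Fernet

# Generating the key inline is purely for illustration -- storing the key
# on the same device as the data would defeat the purpose.
key = Fernet.generate_key()
f = Fernet(key)

record = b"MRN 12345: diagnosis and treatment notes"
token = f.encrypt(record)          # this is what goes on the thumb drive

assert f.decrypt(token) == record  # only the key holder recovers the data
```

With a scheme like this in place, a lost laptop or USB drive carries ciphertext, not ePHI, and the breach analysis changes completely.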

In defending itself, the organization argued that it was not obligated to encrypt data on its devices. It also claimed that the ePHI which was breached was for research, which meant that it was not subject to HIPAA penalties. In addition, its attorneys argued that the penalties accrued to OCR were unreasonable.

The administrative law judge wasn’t buying it. In fact, the judge took an axe to its arguments, saying that MD Anderson’s “dilatory conduct is shocking given the high risk to its patients resulting from the unauthorized disclosure of ePHI,” a risk its leaders “not only recognized, but [also] restated many times.” That’s strong language, the like of which I’ve never seen in HIPAA cases before.

You won’t be surprised to learn that the administrative law judge agreed to OCR’s sanctions, which included penalties for each day of MD Anderson’s lack of HIPAA compliance and for each record of individuals breached.

All I can say is wow. Could the Cancer Center’s leaders possibly have more chutzpah? It’s bad enough to have patient data breached three times. Defending yourself by essentially saying it was no big deal is even worse. If I were the judge I would’ve thrown the book at them too.

Why You Shouldn’t Take Calculated Risks with Security

Posted on May 9, 2018 | Written By

The following is a guest blog post by Erin Gilmer (@GilmerHealthLaw).

Calculated risks are often lauded in innovation.  However, with increasing security breaches in the tech industry, it is time to reassess the calculated risks companies take in healthcare.

Time and again, I have advised technology companies and medical practices to invest in security, and yet I am often met with resistance; a culture of calculated risk prevails.  To these companies and practices, the risk may make sense in the short term. Resources are often limited, and so they often believe that they needn’t spend the time and money on security.  However, the notion that a company or a practice can take this chance is ill advised.

As a recent study conducted by HIMSS (and reviewed by Anne Zieger here) warns, “significant security incidents are projected to continue to grow in number, complexity and impact.” Thus, in taking the calculated risk not to invest in security, companies and practices are creating greater risk in the long run, one that comes with severe consequences.

As we have seen outside of healthcare, even “simple” breaches of user names and passwords, like the one that hit Under Armour’s MyFitnessPal app, become cautionary examples of the impact a security breach can have. While healthcare companies typically think of this in terms of HIPAA compliance and oversight by the Office for Civil Rights (OCR), the consequences reach far wider.  Beyond the fines or even jail time that the OCR can impose, what these breaches show us is how easy it is for the public to lose trust in an entity.  For a technology company, this means losing valuation, which could signal a death knell for a startup. For a practice, this may mean losing patients.  For any entity, it will likely result in substantial legal fees.

Why take the risk not to invest in security? A company may think they are saving time and money up front and the likelihood of a breach or security incident is low. But in the long run, the risk is too great – no company wants to end up with their name splashed across the headlines, spending more money on legal fees, scrambling to notify those whose information has been breached, and rebuilding lost trust.  The short term gain of saving resources is not worth this risk.

The best thing a company or practice can do to get started is to run a detailed risk assessment. This is already required under HIPAA but is not always made a priority.  As the HIMSS report also discussed, there is no one standard for risk assessment, and the OCR is often flexible, knowing entities may be of different sizes and have different resources. While encryption standards and network security should remain a high priority with constant monitoring, there are a few standard aspects of risk assessment, including:

  • Identifying information (in either physical or electronic format) that may be at risk including where it is and whether the entity created, received, and/or is storing it;
  • Categorizing the risk of each type of information in terms of high, medium, or low risk and the impact a breach would have on this information;
  • Identifying who has access to the information;
  • Developing backup systems in case information is lost, unavailable, or stolen; and
  • Assessing incident response plans.
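
The steps above amount to building a risk register. This is a minimal sketch under stated assumptions: the fields mirror the list above, while the scoring scheme and sample assets are invented for illustration.

```python
from dataclasses import dataclass

IMPACT = {"low": 1, "medium": 2, "high": 3}

@dataclass
class Asset:
    name: str
    location: str         # where the information lives
    origin: str           # created, received, or stored by the entity
    risk_level: str       # "low" / "medium" / "high"
    who_has_access: list
    backed_up: bool
    in_incident_plan: bool

def review_order(assets):
    """Highest-impact assets first; missing backups break ties upward."""
    return sorted(assets,
                  key=lambda a: (IMPACT[a.risk_level], not a.backed_up),
                  reverse=True)

assets = [
    Asset("Billing spreadsheet", "file server", "created", "medium",
          ["billing staff"], True, True),
    Asset("EHR database", "cloud host", "stored", "high",
          ["clinicians", "IT"], True, True),
    Asset("Scanned intake forms", "front-desk PC", "received", "high",
          ["front desk"], False, False),
]

for a in review_order(assets):
    print(a.name, a.risk_level)
```

Even this crude ordering surfaces the asset that needs attention first: high-risk information with no backup and no incident plan.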

Additionally, it is important to ensure proper training of all staff members on HIPAA policies and procedures including roles and responsibilities, which should be detailed and kept up to date in the office.

This is merely a start and should not be the end of the security measures companies and practices take to ensure they do not become the next use case. When discussing a recent $3.5 million settlement, OCR Director Roger Severino recently emphasized that, “there is no substitute for an enterprise-wide risk analysis for a covered entity.” Further, he stressed that “Covered entities must take a thorough look at their internal policies and procedures to ensure they are protecting their patients’ health information in accordance with the law.”

Though this may seem rudimentary, healthcare companies and medical practices are still not following simple steps to address security and are taking the calculated risk not to – which will likely be at their own peril.

About Erin Gilmer
Erin Gilmer is a health law and policy attorney and patient advocate. She writes about a range of issues on different forums including technology, disability, social justice, law, and social determinants of health. She can be found on twitter @GilmerHealthLaw or on her blog at www.healthasahumanright.wordpress.com.

Should Apps with Personal Health Information Be Subject to HIPAA?

Posted on April 10, 2018 | Written By

The following is a guest blog post by Erin Gilmer (@GilmerHealthLaw).

With news of Grindr’s sharing of users’ HIV status and location data, many wonder how such sensitive information could be so easily disclosed, and the answer is quite simply a lack of strong privacy and security standards for apps.  The question then becomes whether apps that store personal health information should be subject to HIPAA. Should apps like Grindr have to comply with the Privacy and Security Rules as doctors, insurance companies, and other covered entities already do?

A lot of people already think this information is protected by HIPAA, as they do not realize that HIPAA only applies to “covered entities” (health care providers, health plans, and health care clearinghouses) and “business associates” (companies that contract with covered entities).  Grindr is neither of these. Nor are most apps that address health issues – everything from apps with mental health tools to diet and exercise trackers. These apps can store all manner of information, ranging from simply a name and birthdate to sensitive information including diagnoses and treatments.

Grindr is particularly striking because under HIPAA, there are extra protections for information including AIDS/HIV status, mental health diagnoses, genetics, and substance abuse history.  Normally, this information is highly protected and rightly so given the potential for discrimination. The privacy laws surrounding this information were hard fought by patients and advocates who often experienced discrimination themselves.

However, there is another reason this is particularly important in Grindr’s case, and that’s the issue of public health.  Just a few days before it was revealed that the HIV status of users had been exposed, Grindr announced that it would push notifications through the app to remind users to get tested.  This was lauded as a positive move and added to the culture of openness created on this app. Already users disclose their HIV status, which is a benefit for public health and for reducing the spread of the disease. However, if users think that this information will be shared without explicit consent, they may be less likely to disclose their status. Thus, not having privacy and security standards for apps with sensitive personal health information means these companies can easily share this information and break the users’ trust, at the expense of public health.

Trust is one of the reasons HIPAA itself exists.  When implemented correctly, the Privacy and Security Rules create an environment of safety where individuals can disclose information that they may not want others to know.  This in turn allows for discussion of mental health issues, sexually transmitted diseases, substance use issues, and other difficult topics. Those conversations affect both the treatment plan for the individual and the health of the greater population.

It would be sensible to apply a framework like HIPAA to apps to ensure the privacy and security of user data, but certainly some would challenge the idea.  Some may make the excuse that is often already used in healthcare: that HIPAA stifles innovation and places an undue burden on their industry and on technology in general.  While untrue, this rhetoric holds sway with the government entities that may oversee these companies.

To that end, there is the question of who would regulate such a framework. Would it fall to the Office for Civil Rights (OCR), where HIPAA enforcement is already overseen? The OCR itself is overburdened, taking months to assess even the smallest of HIPAA complaints.  Would the FDA regulate compliance, as it looks to regulate more mobile apps that are tied to medical devices?  Would the FCC have a role?  The question of who would regulate apps would be a fight in itself.

And finally, would this really increase privacy and security? HIPAA has been in effect for over two decades, and yet many covered entities still fail to implement proper privacy and security protocols.  This does not necessarily mean there shouldn’t be attempts to address these serious issues, but some might question whether the HIPAA framework would be the best model.  Perhaps a new model, with new standards and consequences for noncompliance, should be considered.

Regardless, it is time to start really addressing the privacy and security of personal health information in apps. Last year, both Aetna and CVS Caremark violated patient privacy by sending mail to patients in which their HIV status could be seen through the envelope window. At present these cases appear to be under review with the OCR. But the OCR has been tough on these disclosures. In fact, in May 2017, St. Luke’s Roosevelt Hospital Center Inc. paid the OCR $387,200 in a settlement for a breach of private information including the HIV status of a patient. So the question is, if as a society we recognize the serious nature of such disclosures, should we not look to prevent them in all settings – whether the information comes from a healthcare entity or an app?

With intense scrutiny of privacy and security in the media for all aspects of technology, increased regulation may be around the corner and the framework HIPAA creates may be worth applying to apps that contain personal health information.

About Erin Gilmer
Erin Gilmer is a health law and policy attorney and patient advocate. She writes about a range of issues on different forums including technology, disability, social justice, law, and social determinants of health. She can be found on twitter @GilmerHealthLaw or on her blog at www.healthasahumanright.wordpress.com.

Hands-On Guidance for Data Integration in Health: The CancerLinQ Story

Posted on June 15, 2017 I Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

Institutions throughout the health care field are talking about data sharing and integration. Everyone knows that improved care, cost controls, and expanded research require that the institutions holding patient data share it safely. The American Society of Clinical Oncology’s CancerLinQ, one of the leading projects applying data analysis to find new cures, has tackled data sharing with a large number of health providers and discovered just how labor-intensive it is.

CancerLinQ fosters deep relationships and collaborations with the clinicians from whom it takes data. The platform quickly turns around results from analyzing the data, giving the clinicians insights they can put to immediate use to improve the care of cancer patients. Issues in collecting, storing, and transmitting data intertwine with other discussion items around cancer care. Currently, CancerLinQ isolates the data from each institution and de-identifies patient information in order to let it be shared among participating clinicians. CancerLinQ LLC is a wholly-owned nonprofit subsidiary of ASCO, which has registered CancerLinQ as a trademark.


Help from Jitterbit

In 2015, CancerLinQ began collaborating with Jitterbit, a company devoted to integrating data from different sources. According to Michele Hazard, Director of Healthcare Solutions, and George Gallegos, CEO, their company can recognize data from 300 different sources, including electronic health records. At the beginning, the diversity and incompatibility of EHRs was a real barrier. It took them several months to figure out each of the first EHRs they tackled, but now they can integrate a new one quickly. Oncology care, the key data needed by CancerLinQ, is a Jitterbit specialty.


One of the barriers raised by EHRs is licensing. The vendor has to “bless” direct access to the EHR and to data imported from external sources. HIPAA and licensing agreements also make tight security a priority.

Another challenge to processing data is to find records in different institutions and accurately match data for the correct patient.

Although the health care industry is moving toward the FHIR standard, and a few EHRs already expose data through FHIR, others have idiosyncratic formats and support older HL7 standards in different ways. Many don’t even have an API yet. In some cases, Jitterbit has to export the EHR data to a file, transfer it, and unpack it to discover the patient data.
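To make the contrast concrete, here is a toy sketch of normalizing a diagnosis pulled from a pipe-delimited HL7 v2 segment versus a FHIR resource. The segment content, JSON, and field positions are simplified illustrations, not taken from any real feed or from Jitterbit’s actual implementation:

```python
import json

def parse_hl7_dg1(segment):
    """Parse a simplified, pipe-delimited HL7 v2 DG1 (diagnosis) segment."""
    fields = segment.split("|")
    # In this toy layout, field 3 holds code^description^coding-system.
    code, description, system = fields[3].split("^")
    return {"code": code, "description": description, "system": system}

def parse_fhir_condition(resource_json):
    """Extract the same facts from a FHIR Condition resource in JSON."""
    resource = json.loads(resource_json)
    coding = resource["code"]["coding"][0]
    return {"code": coding["code"],
            "description": coding.get("display", ""),
            "system": coding["system"]}

hl7 = "DG1|1||C50.9^Malignant neoplasm of breast^ICD-10"
fhir = json.dumps({"resourceType": "Condition",
                   "code": {"coding": [{"system": "ICD-10",
                                        "code": "C50.9",
                                        "display": "Malignant neoplasm of breast"}]}})

# Both sources normalize to the same record, which is the whole point
# of an integration layer.
print(parse_hl7_dg1(hl7)["code"])          # C50.9
print(parse_fhir_condition(fhir)["code"])  # C50.9
```

The same clinical fact arrives in two very different shapes; multiply that by hundreds of EHR variants and the months-per-EHR integration effort described above becomes easy to believe.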

Lack of structure

Jitterbit had become accustomed to looking in different databases to find patient information, even when EHRs claimed to support the same standard. One doctor may put key information under “diagnosis” while another enters it under “patient problems,” and doctors in the same practice may choose different locations.

Worse still, doctors often ignore the structured fields that were meant to hold important patient details and just dictate or type the information into a free-text note. CancerLinQ anticipated this, unpacking the free text through optical character recognition (OCR) and natural language processing (NLP), a branch of artificial intelligence.
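As a toy illustration of the kind of pattern-based extraction such a pipeline might begin with (the note text and pattern are invented, and production clinical NLP must handle negation, abbreviations, and context far beyond this):

```python
import re

# A deliberately simple pattern for TNM cancer staging (e.g. "T2 N1 M0")
# buried in free-text notes. Real systems use trained NLP models.
STAGE_PATTERN = re.compile(r"\b(T[0-4][a-c]?\s*N[0-3]\s*M[0-1])\b", re.IGNORECASE)

def extract_stage(note):
    """Return the first TNM staging string found in a note, or None."""
    match = STAGE_PATTERN.search(note)
    return match.group(1) if match else None

note = "Pt with invasive ductal carcinoma, staged T2 N1 M0, to start chemo."
print(extract_stage(note))  # T2 N1 M0
```

Even this crude sketch shows why free text is recoverable at all, and why doing it well for every note style across dozens of practices is so labor-intensive.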

It’s understandable that a doctor would avoid using structured fields. Just think of the position she is in, trying to keep a complex cancer case in mind while half a dozen other patients sit in the waiting room for their turn. In order to use the structured field dedicated to each item of information, she would have to first remember which field to use–and if she has privileges at several different institutions, that means keeping the different fields for each hospital in mind.

Then she has to get access to the right field, which may take several clicks and require movement through several screens. The exact information she wants to enter may or may not be available through a drop-down menu. The exact abbreviation or wording may differ from EHR to EHR as well. And to carry through a commitment to using structured fields, she would have to go through this thought process many times per patient. (CancerLinQ itself looks at 18 Quality eMeasures today, with the plan to release additional measures each year.)

Finally, what is the point of all this? Up until recently, the information would never come back in a useful form. To retrieve it, she would have to retrace the same steps she used to enter the structured data in the first place. Simpler to dump what she knows into a free-text note and move on.

It’s worth mentioning that this Babel of health care information imposes negative impacts on the billing and reimbursement process, even though EHRs were designed to support those very processes from the start. Insurers have to deal with the same unstructured data that CancerLinQ and Jitterbit have learned to read. The intensive manual process of extracting information adds to the cost of insurance, and ultimately to the cost of the entire health care system. The recent eClinicalWorks scandal, which resembles Volkswagen’s cheating on auto emissions and will probably spill over to other EHR vendors as well, highlights the failings of health data.

Making data useful

The key to unblocking this information logjam is deriving insights from data that clinicians can immediately see will improve their interventions with patients. This is what the CancerLinQ team has been doing. They run analytics that suggest what works for different categories of patients, then return the information to oncologists. The CancerLinQ platform also explains which items of data were inputs to these insights, and urges the doctors to be more disciplined about collecting and storing the data. This is a human-centered, labor-intensive process that can take six to twelve months to set up for each institution. Richard Ross, Chief Operating Officer of CancerLinQ, calls the process “trench warfare,” not because it’s contentious but because it is slow and requires determination.

Of the 18 measures currently requested by CancerLinQ, one of the most critical data elements, driving the calculation of multiple measures, is staging information: where the cancerous tumors are and how far they have progressed. Family history, treatment plan, and treatment recommendations are other examples of measures gathered.

The data collection process has to start by determining how each practice defines a cancer patient. The CancerLinQ team builds this definition into its request for data. Sometimes they submit “pull” requests at regular intervals to the hospital or clinic, whereas other times the health care provider submits the data to them at a time of its choosing.

Some institutions enforce workflows more rigorously than others. So in some hospitals, CancerLinQ can persuade the doctors to record important information at a certain point during the patient’s visit. In other hospitals, doctors may enter data at times of their own choosing. But if they understand the value that comes from this data, they are more likely to make sure it gets entered, and that it conforms to standards. Many EHRs provide templates that make it easier to use structured fields properly.

When accepting information from each provider, the team goes through a series of steps and does a check-in with the provider at each step. The team evaluates the data in a different stage for each criterion: completeness, accuracy of coding, the number of patients reported, and so on. By providing quick feedback, they can help the practice improve its reporting.
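A minimal sketch of one such stage, a completeness check, might look like the following. The field names, sample records, and pass criterion are invented for illustration; CancerLinQ’s actual criteria are not published here:

```python
def check_completeness(records, required=("patient_id", "diagnosis", "stage")):
    """Return the fraction of records with every required field populated.

    A data-quality pipeline would report this back to the practice so it
    can fix gaps before the next submission.
    """
    if not records:
        return 0.0
    complete = sum(1 for r in records if all(r.get(f) for f in required))
    return complete / len(records)

batch = [
    {"patient_id": "A1", "diagnosis": "C50.9", "stage": "T2 N1 M0"},
    {"patient_id": "A2", "diagnosis": "C50.9", "stage": ""},  # missing stage
]
print(check_completeness(batch))  # 0.5
```

Quick, quantified feedback of this kind is what lets the team tell a practice exactly where its reporting falls short at each step.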

The CancerLinQ/Jitterbit story reveals how difficult it is to apply analytics to health care data. Few organizations can afford the expertise they apply to extracting and curating patient data. On the other hand, CancerLinQ and Jitterbit show that effective data analysis can be done, even in the current messy conditions of electronic data storage. As the next wave of technology standards, such as FHIR, fall into place, more institutions should be able to carry out analytics that save lives.

Don’t Blame HIPAA: It Didn’t Require Orlando Regional Medical Center To Call the President

Posted on June 13, 2016 I Written By

The following is a guest blog post by Mike Semel, President of Semel Consulting. As a Healthcare Scene community, our hearts go out to all the victims of this tragedy.

Orlando Mayor Buddy Dyer said the influx of patients to the hospitals created problems due to confidentiality regulations, which he worked to have waived for victims’ families.

“The CEO of the hospital came to me and said they had an issue related to the families who came to the emergency room. Because of HIPAA regulations, they could not give them any information,” Dyer said. “So I reached out to the White House to see if we could get the HIPAA regulations waived. The White House went through the appropriate channels to waive those so the hospital could communicate with the families who were there.”    Source: WBTV.com

I applaud the Orlando Regional Medical Center for its efforts to help the shooting victims. As the region’s trauma center, I think it could have done a lot better by not letting HIPAA get in the way of communicating with the patients’ families and friends.

In the wake of the horrific nightclub shooting, the hospital made things worse for the victims’ families and friends. And it wasn’t necessary, because built into HIPAA is a hospital’s ability to share information without calling the President of the United States. There are other exemptions for communicating with law enforcement.

The Orlando hospital made this situation worse for the families when its Mass Casualty Incident (MCI) plan should have anticipated the situation. A trauma center should have been better prepared than to ask the mayor for help.

As usual, HIPAA got the blame for someone’s lack of understanding about HIPAA. Based on my experience, many executives think they are too busy, or think themselves too important, to learn about HIPAA’s fundamental civil rights for patients. Civil Rights? HIPAA is enforced by the US Department of Health & Human Services’ Office for Civil Rights.

HIPAA compliance and data security are both executive level responsibilities, although many executives think it is something that should get tasked out to a subordinate. Having to call the White House because the hospital didn’t understand that HIPAA already gave it the right to talk to the families is shameful. It added unnecessary delays and more stress to the distraught families.

Doctors are often just as guilty as hospital executives of not taking HIPAA training and then giving HIPAA a bad rap. (I can imagine the medical practice managers and compliance officers silently nodding their heads.)

“HIPAA interferes with patient care” is something I hear often from doctors. When I ask how, I am told by the doctors that they can’t communicate with specialists, call for a consult, or talk to their patients’ families. These are ALL WRONG.

I ask those doctors two questions that are usually met with a silent stare:

  1. When was the last time you received HIPAA training?
  2. If you did get trained, did it take more than 5 minutes or was it just to get the requirement out of the way?

HIPAA allows doctors to share patient information with other doctors, hospitals, pharmacies, and Business Associates as long as it is for a patient’s Treatment, Payment, or healthcare Operations (TPO). This is communicated to patients through a Notice of Privacy Practices.

HIPAA allows doctors to use their judgment to determine what to say to friends and families of patients who are incapacitated or incompetent. The Orlando hospital could have communicated with family members and friends.

From Frequently Asked Questions at the HHS website:

Does the HIPAA Privacy Rule permit a hospital to inform callers or visitors of a patient’s location and general condition in the emergency room, even if the patient’s information would not normally be included in the main hospital directory of admitted patients?

Answer: Yes.

If a patient’s family member, friend, or other person involved in the patient’s care or payment for care calls a health care provider to ask about the patient’s condition, does HIPAA require the health care provider to obtain proof of who the person is before speaking with them?

Answer: No.  If the caller states that he or she is a family member or friend of the patient, or is involved in the patient’s care or payment for care, then HIPAA doesn’t require proof of identity in this case.  However, a health care provider may establish his or her own rules for verifying who is on the phone.  In addition, when someone other than a friend or family member is involved, the health care provider must be reasonably sure that the patient asked the person to be involved in his or her care or payment for care.

Can the fact that a patient has been “treated and released,” or that a patient has died, be released as part of the facility directory?

Answer: Yes.

Does the HIPAA Privacy Rule permit a doctor to discuss a patient’s health status, treatment, or payment arrangements with the patient’s family and friends?

Answer: Yes. The HIPAA Privacy Rule at 45 CFR 164.510(b) specifically permits covered entities to share information that is directly relevant to the involvement of a spouse, family members, friends, or other persons identified by a patient, in the patient’s care or payment for health care. If the patient is present, or is otherwise available prior to the disclosure, and has the capacity to make health care decisions, the covered entity may discuss this information with the family and these other persons if the patient agrees or, when given the opportunity, does not object. The covered entity may also share relevant information with the family and these other persons if it can reasonably infer, based on professional judgment, that the patient does not object. Under these circumstances, for example:

  • A doctor may give information about a patient’s mobility limitations to a friend driving the patient home from the hospital.
  • A hospital may discuss a patient’s payment options with her adult daughter.
  • A doctor may instruct a patient’s roommate about proper medicine dosage when she comes to pick up her friend from the hospital.
  • A physician may discuss a patient’s treatment with the patient in the presence of a friend when the patient brings the friend to a medical appointment and asks if the friend can come into the treatment room.

Even when the patient is not present or it is impracticable because of emergency circumstances or the patient’s incapacity for the covered entity to ask the patient about discussing her care or payment with a family member or other person, a covered entity may share this information with the person when, in exercising professional judgment, it determines that doing so would be in the best interest of the patient. See 45 CFR 164.510(b).

Thus, for example:

  • A surgeon may, if consistent with such professional judgment, inform a patient’s spouse, who accompanied her husband to the emergency room, that the patient has suffered a heart attack and provide periodic updates on the patient’s progress and prognosis.
  • A doctor may, if consistent with such professional judgment, discuss an incapacitated patient’s condition with a family member over the phone.
  • In addition, the Privacy Rule expressly permits a covered entity to use professional judgment and experience with common practice to make reasonable inferences about the patient’s best interests in allowing another person to act on behalf of the patient to pick up a filled prescription, medical supplies, X-rays, or other similar forms of protected health information. For example, when a person comes to a pharmacy requesting to pick up a prescription on behalf of an individual he identifies by name, a pharmacist, based on professional judgment and experience with common practice, may allow the person to do so.

Other examples of hospital executives’ lack of HIPAA knowledge include:

  • Shasta Regional Medical Center, where the CEO and Chief Medical Officer took a patient’s chart to the local newspaper and shared details of her treatment without her permission.
  • NY Presbyterian Hospital, which allowed the film crew from ABC’s ‘NY Med’ TV show to film dying and incapacitated patients.

To healthcare executives and doctors, many of your imagined challenges caused by HIPAA can be eliminated by learning more about the rules. You need to be prepared for the 3 a.m. phone call. And you don’t have to call the White House for help.

About Mike Semel
Mike Semel, President of Semel Consulting,  is a certified HIPAA expert with over 12 years’ HIPAA experience and 30 years in IT. He has been the CIO for a hospital and a K-12 school district; owned and managed IT companies; ran operations at an online backup provider; and is a recognized HIPAA expert and speaker. He can be reached at mike@semelconsulting.com or 888-997-3635 x 101.

Phase 2 HIPAA Audits Kick Off With Random Surveys

Posted on June 9, 2015 I Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Ideally, the only reason you would know about the following is due to scribes such as myself — but for the record, the HHS Office for Civil Rights has sent out a bunch of pre-audit screening surveys to covered entities. Once it gets responses, it will do a Phase 2 audit not only of covered entities but also business associates, so things should get heated.

While these take the form of Meaningful Use audits, covering incentives paid from January 1, 2011 through June 30, 2014, they are really more about checking how well you protect ePHI.

This effort is a drive to be sure that providers and BAs are complying with the HIPAA privacy, security and breach notification requirements. Apparently OCR found, during Phase 1 pilot audits in 2011 and 2012, that there was “pervasive non-compliance” with regs designed to safeguard protected health information, the National Law Review reports.

However, these audits aren’t targeting the “bad guys.” Selection for the audits is random, according to the HHS Office of Inspector General.

So if you get one of the dreaded pre-screening letters, how should you respond? According to a thoughtful blog post by Maryanne Lambert for CureMD, auditors will be focused on the following areas:

  • Risk Assessment audits and reports
  • EHR security plan
  • Organizational chart
  • Network diagram
  • EHR web sites and patient portals
  • Policies and procedures
  • System inventory
  • Tools to perform vulnerability scans
  • Central log and event reports
  • EHR system users list
  • Contractors supporting the EHR and network perimeter devices.

According to Lambert, the feds will want to talk to the person primarily responsible for each of these areas, a process which could quickly devolve into a disaster if those people aren’t prepared. She recommends that if you’re selected for an audit, you run through a mock audit ahead of time to make sure these staff members can answer questions about how well policies and processes are followed.

Not that anyone would take the presence of HHS on their premises lightly, but it’s worth bearing in mind that a stumble in one corner of your operation could have widespread consequences. Lambert notes that in addition to defending your security precautions, you have to make sure that all parts of your organization are in line:

Be mindful while planning for this audit as deficiencies identified for one physician in a physician group or one hospital within a multi-hospital system, may apply to the other physicians and hospitals using the same EHR system and/or implementing meaningful use in the same way.  Thus, the incentive payments at risk in this audit may be greater than the payments to the particular provider being audited.

But as she points out, there is one possible benefit to being audited. If you prepare well, it might save you not only trouble with HHS but possibly lawsuits for breaches of information. Hey, everything has some kind of silver lining, right?

Beware: Don’t Buy In to Myths about Data Security and HIPAA Compliance

Posted on January 22, 2015 I Written By

The following is a guest blog post by Mark Fulford, Partner in LBMC’s Security & Risk Services practice group.
Myths abound when it comes to data security and compliance. This is not surprising—HIPAA covers a lot of ground and many organizations are left to decide on their own how to best implement a compliant data security solution. A critical first step in putting a compliant data security solution in place is separating fact from fiction.  Here are four common misconceptions you’ll want to be aware of:

Myth #1: If we’ve never had a data security incident before, we must be doing OK on compliance with the HIPAA Security Rule.

It’s easy to fall into this trap. Not having had an incident is a good start, but HIPAA requires you to take a more proactive stance. Too often, no one is dedicated to monitoring electronic protected health information (ePHI) as prescribed by HIPAA. Data must be monitored—that is, someone must be actively reviewing data records and security logs to be on the lookout for suspicious activity.

Your current IT framework most likely includes a firewall and antivirus/antimalware software, and all systems have event logs. These tools collect data that too often go unchecked. Simply assigning someone to review the data you already have will greatly improve your compliance with HIPAA monitoring requirements, and more importantly, you may discover events and incidents that require your attention.
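As a hypothetical sketch of the kind of automated first pass that makes such a review tractable (the log format, event names, and threshold below are invented, not from any particular product):

```python
from collections import Counter

def flag_failed_logins(log_lines, threshold=3):
    """Flag accounts with repeated failed logins -- one common signal a
    reviewer checking event logs for suspicious activity would want
    surfaced automatically rather than hunted for by hand."""
    failures = Counter()
    for line in log_lines:
        if "LOGIN_FAILED" in line:
            user = line.split()[-1]  # invented format: "<timestamp> LOGIN_FAILED <user>"
            failures[user] += 1
    return [user for user, count in failures.items() if count >= threshold]

log = [
    "2015-01-10 02:01 LOGIN_FAILED jsmith",
    "2015-01-10 02:02 LOGIN_FAILED jsmith",
    "2015-01-10 02:03 LOGIN_FAILED jsmith",
    "2015-01-10 09:00 LOGIN_OK nurse1",
]
print(flag_failed_logins(log))  # ['jsmith']
```

A simple filter like this does not replace a human reviewer, but it turns a pile of unchecked logs into a short list worth someone’s attention.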

Going beyond your technology infrastructure, your facility security, hardcopy processing, workstation locations, portable media, mobile device usage and business associate agreements all need to be assessed to make sure they are compliant with HIPAA privacy and security regulations. And don’t forget about your employees. HIPAA dictates that your staff is trained (with regularly scheduled reminders) on how to handle PHI appropriately.

Myth #2: Implementing a HIPAA security compliance solution will involve a big technology spend.

This is not necessarily the case.  An organization’s investment in data security solutions can vary widely, depending on its size, budget, and the nature of its transactions. The Office for Civil Rights (OCR) takes these variables into account—certainly, a private practice will have fewer resources to divert to security compliance than a major corporation. As long as you’ve justified each decision you’ve made about your own approach to compliance with each of the standards, the OCR will take your position into account if you are audited.

Most likely, you already have a number of appropriate technical security tools in place necessary to meet compliance. The added expense will more likely be associated with administering your data security compliance strategy.

Myth #3: We’ve read the HIPAA guidelines and we’ve put a compliance strategy in place. We must be OK on compliance.

Perhaps your organization is following the letter of the law. Policies and procedures are in place, and your staff is well-trained on how to handle patient data appropriately. By all appearances, you are making a good faith effort to be compliant.

But a large part of HIPAA compliance addresses how the confidentiality, integrity, and availability of ePHI is monitored in the IT department. If no one on the team has been assigned to monitor transactions and flag anomalies, all of your hard work at the front of the office could be for naught.

While a ‘check the box’ approach to HIPAA compliance might help if you get audited, unless it includes the ongoing monitoring of your system, your patient data may actually be exposed.

Myth #4: The OCR won’t waste their time auditing the ‘little guys.’ After all, doesn’t the agency have bigger fish to fry?

This is simply not true. Healthcare organizations of all sizes are eligible for an audit. Consider this cautionary tale: as a result of a reported incident, a dermatologist in Massachusetts was slapped with a $150,000 fine when an employee’s thumb drive was stolen from a car.

Fines for non-compliance can be steep, regardless of an organization’s size. If you haven’t done so already, now might be a good time to conduct a risk assessment and make appropriate adjustments. The OCR won’t grant you concessions just because you’re small, but they will take into consideration a good faith effort to comply.

Data Security and HIPAA Compliance: Make No Assumptions

As a provider, you are probably aware that the audits are starting soon, but perhaps you aren’t quite sure what that means for you. Arm yourself with facts, and consult with outside sources if necessary, because the OCR is setting the bar higher for healthcare organizations of all sizes. Your business—and your patients—are counting on it.

About Mark Fulford
Mark Fulford is a Partner in LBMC’s Security & Risk Services practice group.  He has over 20 years of experience in information systems management, IT auditing, and security.  Mark focuses on risk assessments and information systems auditing engagements, including SOC reporting in the healthcare sector.  He is a Certified Information Systems Auditor (CISA) and Certified Information Systems Security Professional (CISSP).  LBMC is a top 50 Accounting & Consulting firm based in Brentwood, Tennessee.