
Amazing Live Visualization of Internet Attacks

Posted on October 22, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 6000 articles with John having written over 3000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 13 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I recently heard Elliot Lewis, Dell’s Chief Security Architect, comment that “The average new viruses per day is about 5-10k appearing new each day.” To be honest, I wasn’t quite sure how to process that volume of viruses. It felt pretty unbelievable to me, even though I figured he was right.

Today, I came across this amazing internet attack map by Norse which illustrates a small portion of the attacks that are happening on the internet in real time. I captured a screenshot of the map below, but you really need to check out the live map to get a feel for how many internet attacks are happening. It’s astounding to watch.

Norse - Internet Attack Map

For those tech nerds out there, here’s the technical description of what’s happening on the map:

Every second, Norse collects and analyzes live threat intelligence from darknets in hundreds of locations in over 40 countries. The attacks shown are based on a small subset of live flows against the Norse honeypot infrastructure, representing actual worldwide cyber attacks by bad actors. At a glance, one can see which countries are aggressors or targets at the moment, using which type of attacks (services-ports).

It’s worth noting that these are the attacks that are happening. Just because something is getting attacked doesn’t mean that the attack was successful. A large majority of the attacks aren’t successful. However, when the volume of attacks is so large (and that map only shows a small portion of them), only a small number of them need to be successful to wreak a lot of havoc.

If this type of visualization doesn’t make you stop and worry just a little bit, then you’re not human. There’s a lot of crazy stuff going on out there. It’s actually quite amazing that with all the crazy stuff that’s happening, the internet works as well as it does.

Hopefully this visualization will wake up a few healthcare organizations to be just a little more serious about their IT security.

How Secure Are Wearables?

Posted on October 1, 2014 | Written By John Lynn

JaneenB asks a really fantastic question in this tweet. Making sure that wearables are secure is going to be a really hot topic. Yesterday, I was talking with Mac McMillan from Cynergistek and he suggested that the FDA was ready to make medical device security a priority. I’ll be interested to see what the FDA does to try to regulate security in medical devices, but you can see why this is important. Mac also commented that while it’s incredibly damaging for someone to hack a pacemaker like the one Vice President Cheney had (has?), the bigger threat is the 300 pumps installed in a hospital. If one of them can be hacked, they all can be hacked, and the process for updating them is not simple.

Of course, Mac was talking about medical device security from more of an enterprise perspective. Now, let’s think about this across millions of wearable devices that are used by consumers. Plus, many of these consumer wearable devices don’t require FDA clearance and so the FDA won’t be able to impose more security restrictions on them.

I’m not really sure of the answer to this problem of wearable security, but I think two steps in the right direction would help. First, health wearable companies should build a culture of security into their company and their product. This will add a little bit of expense on the front end, but it will more than pay off on the back end when they avoid security issues that could literally leave the company in financial ruin. Second, we could use some organization to take on the effort of reporting on the security (or lack thereof) of these devices. I’m not sure if this is a consumer reports type organization or a media company. However, I think the idea of someone holding organizations accountable is important.

We’re definitely heading towards a world of many connected devices. I don’t think we have a clear picture of what this means from a security perspective.

Complete Health IT Security is a Myth, But That Doesn’t Mean We Shouldn’t Try

Posted on August 11, 2014 | Written By John Lynn

As I mentioned, last week I had the opportunity to attend the Black Hat conference in Las Vegas. There were over 9000 attendees and 180+ speakers sharing the latest and greatest IT security and privacy topics. Black Hat is more appropriately called a hackers’ conference (although Defcon is more hardcore hacker than Black Hat, which had plenty of corporate presence) for good reason. You turn off your devices and are careful about what you do. There’s a certain paranoia that comes when one of the vendor handouts is a foil credit card cover that prevents someone from stealing your credit card number. I didn’t quite have my tin foil hat on, but you could start to understand the sentiment.

One of the most interesting things about Black Hat is getting an idea of the mentality of the hacker. Their creative process is fascinating. Their ability to work around obstacles is something we should all learn to incorporate into our lives. I think for most of these hackers, there’s never a mentality that something can’t be done. It’s just a question of figuring out a way to work around whatever obstacles are in their way. We could use a little more of this mentality in dealing with the challenges of healthcare.

The biggest thing I was reminded of at the event was that complete security and privacy is a myth. If someone wants to get into something badly enough, they’ll find a way. As one security expert I met told me, the only secure system is one that’s turned off, not connected to anything, and buried underground. If a computer or device is turned on, then it’s vulnerable.

The reality is that complete security shouldn’t be our goal. Our goal should be to make our systems secure enough that it’s not worth someone’s time or effort to break through the security. I can assure you that most of healthcare is far from this level of security. What a tremendous opportunity this presents.

The first place to start in any organization is to create a culture of security and privacy. The one-off efforts that most organizations apply after a breach or an audit aren’t going to get us there. Instead, you have to incorporate a thoughtful approach to security into everything you do. This starts with the RFP, continues through the procurement process, extends into the implementation, and carries on through the maintenance of the product.

Security and privacy efforts in an organization are hard to justify since they don’t increase the bottom line. This is another reason why the efforts need to be integrated into everything that’s done and not just tied to a specific budget line item. As a budget line item, it’s too easy to cut it out when budgets get tight. The good news is that a little effort throughout the process can avoid a lot of heartache later on. Ask an organization that’s had a breach or failed an audit.

Unfinished Business: More HIPAA Guidelines to Come

Posted on August 4, 2014 | Written By

The following is a guest blog post by Rita Bowen, Sr. Vice President of HIM and Privacy Officer at HealthPort.

After all of the hullabaloo since the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) released the HIPAA Omnibus Rule, it’s humbling to realize that the work is not complete. While the Omnibus Rule covered a lot of territory in providing new guidelines for the privacy and security of electronic health records, the Final Rule failed to address three key pieces of legislation that are of great relevance to healthcare providers.

The three areas include the “minimum necessary” standard; whistleblower compensation; and revised parameters for electronic health information (EHI) access logs. No specific timetable has been provided for the release of revised legislation.

Minimum Necessary

The minimum necessary standard requires providers to “take reasonable steps to limit the use or disclosure of, and requests for, protected health information to the minimum necessary to accomplish the intended purpose.”

This requires that the intent of the request and the review of the health information be matched, to assure that only the minimum information covered by the authorized release is provided. To date, HHS has conducted a variety of evaluations and is in the process of assessing that data.
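To make the standard concrete, here is a minimal sketch of how a release-of-information workflow might enforce a minimum necessary filter. The request purposes, field names, and mapping are hypothetical, invented purely for illustration rather than drawn from any regulation or product.

```python
# Hypothetical "minimum necessary" filter: only the fields mapped to the
# stated purpose of a request are released.

# Assumed (illustrative) mapping of request purpose -> fields permitted for release.
MINIMUM_NECESSARY = {
    "billing": {"patient_id", "dates_of_service", "procedure_codes", "insurance_id"},
    "treatment": {"patient_id", "allergies", "medications", "problem_list"},
}

def release(record: dict, purpose: str) -> dict:
    """Return only the fields justified by the request's stated purpose."""
    allowed = MINIMUM_NECESSARY.get(purpose)
    if allowed is None:
        raise ValueError(f"No minimum-necessary policy defined for purpose: {purpose}")
    return {field: value for field, value in record.items() if field in allowed}

record = {
    "patient_id": "12345",
    "dates_of_service": ["2014-05-02"],
    "procedure_codes": ["99213"],
    "insurance_id": "ABC-987",
    "allergies": ["penicillin"],
    "medications": ["lisinopril"],
    "problem_list": ["hypertension"],
}

print(release(record, "billing"))  # billing fields only, no clinical detail
```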

Whistleblower Compensation

The second bit of unfinished legislation is a proposed rule being considered by HHS that would dramatically increase the payment to Medicare fraud whistleblowers. If adopted, the program, called the Medicare Incentive Reward Program (IRP), would raise payments from a current maximum of $1,000 to nearly $10 million.

I believe that the added incentive will create heightened sensitivity to fraud and that more individuals will be motivated to act. People are cognizant of fraudulent situations but they have lacked the incentive to report, unless they are deeply disgruntled.

Per the proposed plan, reports of fraud can be made simply by making a phone call to the correct reporting agency, which should facilitate whistleblowing.

Access Logs

The third, and most contentious, area of concern is with EHI access logs. The proposed legislation calls for a single log to be created and provided to the patient, that would contain all instances of access to the patient’s EHI, no matter the system or situation.

From a patient perspective, the log would be unwieldy, cumbersome, and extremely difficult to decipher. An even more worrisome aspect is the privacy of healthcare workers.

Employees sense that their own privacy would be invaded if regulations require that their information, including their names and other personal identifiers, be shared as part of the accessed record. Many healthcare workers have raised concerns regarding their own safety if this information is openly made available. This topic has received a tremendous amount of attention.

Alternate plans under discussion would negotiate the content of access logs, tailoring them to contain appropriate data about the person the patient is asking about, while still satisfying patients and protecting the privacy of providers.

The Value of Data Governance

Most of my conversations circle back to the value of information (or data) governance. This situation of unfinished EHI design and management is no different. Once released, the new legislation for the “minimum necessary” standard, whistleblower compensation, and revised parameters for medical access logs must be woven into your existing information governance plan.

Information governance is authority and control—the planning, monitoring and enforcement—of your data assets, which could be compromised if all of the dots are not connected. Organizations should be using this time to build the appropriate foundation for their EHI.

About the Author:
Rita Bowen, MA, RHIA, CHPS, SSGB

Ms. Bowen is a distinguished professional with 20+ years of experience in the health information management industry. She serves as the Sr. Vice President of HIM and Privacy Officer of HealthPort, where she is responsible for acting as an internal customer advocate. Most recently, Ms. Bowen served as the Enterprise Director of HIM Services for Erlanger Health System for 13 years, where she received commendation from the hospital county authority for outstanding leadership. Ms. Bowen is the recipient of the Mentor FORE Triumph Award and a Distinguished Member of AHIMA’s Quality Management Section. She served as AHIMA President and Board Chair in 2010, as a member of AHIMA’s Board of Directors (2006-2011) and the Council on Certification (2003-2005), and on various task groups, including the CHP exam and as AHIMA’s liaison to HIMSS for the CHS exam construction (2002).

Ms. Bowen is an established speaker on diverse HIM topics and an active author on privacy and legal health records.  She served on the CCHIT security and reliability workgroup and as Chair of Regional Committees East-Tennessee HIMSS and co-chair of Tennessee’s e-HIM group.  She is an adjunct faculty member of the Chattanooga State HIM program and UT Memphis HIM Master’s program.  She also serves on the advisory board for Care Communications based in Chicago, Illinois.

Criminals Have Their Eyes on Your Patients’ Records

Posted on June 26, 2014 | Written By John Lynn

The following is a guest blog post by Art Gross, Founder of HIPAA Secure Now!
It’s one thing to have a laptop stolen with 8,000 patient records or for a disgruntled doctor to grab his patients’ records and start his own practice.  It’s another when the Cosa Nostra steals that information, siphons money from the patient’s bank account and turns it into a patient trafficking crime ring.  Welcome to organized crime in the age of big data.

Organized crime syndicates and gangs targeting medical practices and stealing patient information are on the rise. They’re grabbing patient names, addresses, insurance details, social security numbers, birth dates, etc., and using it to steal patients’ identities and their assets.

It’s not uncommon for the girlfriend of a gang member to infiltrate a medical practice or hospital, gain access to electronic health records, download patient information and hand it over to the offender, who uses it to file false tax returns. In fact, gang members often rent a hotel room and file the returns together, netting $40,000-$50,000 in one night!

Florida is a hotbed for this activity and it’s spreading across the country. In California, narcotics investigators took down a methamphetamine ring and confiscated information on 4,500 patients. Investigators believe the stolen information was being used to obtain prescription drugs to make the illicit drug.

Value of patient records

Stolen patient information comes with a high price tag, especially if the medical practice is fined for HIPAA violations. One lost or stolen patient record is estimated to be worth $50, compared to a credit card record, which fetches about a dollar. Patient records are highly lucrative. The table below shows the value of the patient information that might be sitting in an EHR system:

Number of Patient Records    Value of Patient Records
1,000                        $50,000
5,000                        $250,000
10,000                       $500,000
100,000                      $5,000,000
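The figures are simply the $50-per-record estimate multiplied out, as this quick check shows:

```python
# Quick check of the table above: the $50-per-record estimate multiplied out.
PER_RECORD_VALUE = 50
for n_records in (1_000, 5_000, 10_000, 100_000):
    print(f"{n_records:>7,} records -> ${n_records * PER_RECORD_VALUE:,}")
```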

 
Protect your practice

Medical practices need to realize they are vulnerable to patient record theft and should take steps to reduce their risk by implementing additional security.  Here are seven steps that organizations can take to protect electronic patient information:

  1. Perform a security risk assessment – a security risk assessment is not only required for HIPAA compliance and EHR meaningful use, but it can also identify security risks that may allow criminals to steal patient information.
  2. Screen job applicants – all job applicants should be properly screened prior to hiring and providing access to patient information. Look for criminal records, frequent job switches or anything else that might be a warning sign.
  3. Limit access to patient information – employees should have minimal access necessary to perform their jobs rather than full access to electronic health records.
  4. Audit access to patient information – every employee should use their own user ID and password; login information should not be shared. And access to patient information should be recorded, including who accessed, when, and which records they accessed.
  5. Review audit logs – organizations must keep an eye on audit logs. Criminal activity can be happening during a normal business day. Reviewing audit logs can uncover strange or unexpected activity. Let’s say an employee accesses, on average, 10 patient records per day, and on one particular day they retrieve 50 to 100 records. Or records are being accessed after business hours. Both activities could be a sign of criminal activity. The key is to review audit logs regularly and look for unusual access (a minimal sketch of this kind of check follows this list).
  6. Security training – all employees should receive security training on how to protect patient information, and make sure they know any patient information activity is being logged and reviewed.  Knowing that employee actions are being observed should dissuade them from using patient information illegally.
  7. Limit the use of USB drives – in the past it would take a truck to steal 10,000 patient charts. Now they can easily be copied onto a small thumb/USB drive and slipped into a  doctor’s lab coat.  Organizations should limit the use of USB drives to prevent illegal activity.
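To make step 5 concrete, here is a minimal sketch of the kind of audit log review described above. The CSV layout (user_id, date, record_id columns), the file name, and the spike threshold are assumptions for illustration, not a prescribed format or product feature.

```python
# Sketch: flag unusual spikes in per-user record access from an audit log export.
# Assumes a hypothetical CSV with columns: user_id, date, record_id.
import csv
from collections import defaultdict

def daily_counts(path):
    """Count records accessed per (user, day) from the audit log export."""
    counts = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["user_id"], row["date"])] += 1
    return counts

def flag_spikes(counts, multiplier=5):
    """Flag any day where a user accessed far more records than their own average day."""
    per_user = defaultdict(list)
    for (user, _), n in counts.items():
        per_user[user].append(n)
    flags = []
    for (user, date), n in counts.items():
        avg = sum(per_user[user]) / len(per_user[user])
        if n > multiplier * avg:
            flags.append((user, date, n, round(avg, 1)))
    return flags

for user, date, n, avg in flag_spikes(daily_counts("access_log.csv")):
    print(f"Review {user} on {date}: {n} records accessed (average {avg})")
```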

The high resale value of patient information and the ability to use it to file false tax returns or acquire illegal prescriptions make it a prime target for criminals. Medical practices need to recognize the risk and put proper IT security measures in place to keep their patient information from “securing” hefty tax refunds.

About Art Gross

Art Gross co-founded Entegration, Inc. in 2000 and serves as President and CEO. As Entegration’s medical clients adopted EHR technology Gross recognized the need to help them protect patient data and comply with complex HIPAA security regulations. Leveraging his experience supporting medical practices, in-depth knowledge of HIPAA compliance and security, and IT technology, Gross started his second company HIPAA Secure Now! to focus on the unique IT requirements of medical practices.  Email Art at artg@hipaasecurenow.com.

Another View of Privacy by Dr. Deborah C. Peel, MD

Posted on June 25, 2014 | Written By John Lynn

I thought the following TEDx video from Deborah C. Peel, MD, Founder and Chair of Patient Privacy Rights, would be an interesting contrast with some of the things that Andy Oram wrote in yesterday’s post titled “Not So Open: Redefining Goals for Sharing Health Data in Research.” Dr. Peel is incredibly passionate about protecting patients’ privacy and is working hard on that goal.

Dr. Peel is also trying to kick off a hashtag called #MyHealthDataIsMine. What do you think of the “hidden privacy and data breaches” that Dr. Peel talks about in the video? I look forward to hearing your thoughts on it.

Not So Open: Redefining Goals for Sharing Health Data in Research

Posted on June 24, 2014 | Written By

The following is a guest blog post by Andy Oram, writer and editor at O’Reilly Media.

One couldn’t come away with more enthusiasm for open data than at this month’s Health Datapalooza, the largest conference focused on using data in health care. The whole 2000-strong conference unfolds from the simple concept that releasing data publicly can lead to wonderful things, like discovering new cancer drugs or intervening with patients before they have to go to the emergency room.

But look more closely at the health care field, and open data is far from the norm. The demonstrated benefits of open data sets in other fields–they permit innovation from any corner and are easy to combine or “mash up” to uncover new relationships–may turn into risks in health care. There may be better ways to share data.

Let’s momentarily leave the heady atmosphere of the Datapalooza and take a subway a few stops downtown to the Health Privacy Summit, where fine points of patient consent, deidentification, and the data map of health information exchange were discussed the following day. Participants here agree that highly sensitive information is traveling far and wide for marketing purposes, and perhaps even for more nefarious uses to uncover patient secrets and discriminate against them.

In addition to outright breaches–which seem to be reported at least once a week now, and can involve thousands of patients in one fell swoop–data is shared in many ways that arguably should be up to patients to decide. It flows from hospitals, doctors, and pharmacies to health information exchanges, researchers in both academia and business, marketers, and others.

Debate has raged for years between those who trust deidentification and those who claim that reidentification is too easy. This is not an arcane technicality–the whole industry of analytics represented at the Datapalooza rests on the result. Those who defend deidentification tend to be researchers in health care and the institutions who use their results. In contrast, many computer scientists outside the health care field cite instances where people have been reidentified, usually by combining data from various public sources.

Latanya Sweeney of Harvard and MIT, who won a privacy award this year at the summit, can be credited both with a historic reidentification of the records of Massachusetts Governor William Weld in 1997 and a more recent exposé of state practices. The first research led to the current HIPAA regime for deidentification, while the second showed that states had not learned the lessons of anonymization. No successful reidentifications have been reported against data sets that use recommended deidentification techniques.
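To illustrate the linkage idea behind such reidentifications, here is a toy sketch with invented data. It is not Sweeney’s actual method or any real dataset; it simply joins a “deidentified” clinical file to a public roster on the classic quasi-identifiers of ZIP code, birth date, and sex.

```python
# Toy linkage attack: join a "deidentified" dataset to a public roster on
# quasi-identifiers (ZIP, birth date, sex). All data here is invented.

deidentified_visits = [
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1970-01-15", "sex": "F", "diagnosis": "asthma"},
]

public_roster = [
    {"name": "W. Weld", "zip": "02138", "birth_date": "1945-07-31", "sex": "M"},
    {"name": "J. Doe", "zip": "02139", "birth_date": "1982-03-02", "sex": "F"},
]

def quasi_id(row):
    """The combination of fields that can single a person out."""
    return (row["zip"], row["birth_date"], row["sex"])

roster_index = {quasi_id(person): person["name"] for person in public_roster}

for visit in deidentified_visits:
    name = roster_index.get(quasi_id(visit))
    if name:
        print(f"Possible reidentification: {name} -> {visit['diagnosis']}")
```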

I am somewhat perplexed by the disagreement, but have concluded that it cannot be resolved on technical grounds. Those who look at the current state of reidentification are satisfied that health data can be secured. Those who look toward an unspecified future with improved algorithms find reasons to worry. In a summit lunchtime keynote, Adam Tanner reported his own efforts as a non-expert to identify people online–a fascinating and sometimes amusing tale he has written up in a new book, What Stays in Vegas. So deidentification is like encryption–we all use encryption even though we expect that future computers will be able to break current techniques.

But another approach has flown up from the ashes of the “privacy is dead” nay-sayers: regulating the use of data instead of its collection and dissemination. This has been around for years, most recently in a federal PCAST report on big data privacy. One of the authors of that report, Craig Mundie of Microsoft, also published an article with that argument in the March/April issue of Foreign Affairs.

A simple application of this doctrine in health care is the Genetic Information Nondiscrimination Act of 2008. A more nuanced interpretation of the doctrine could let each individual determine who gets to use his or her data, and for what purpose.

Several proposals have been aired to make it easier for patients to grant blanket permission for certain data uses, one proposal being “patient privacy bundles” in a recent report commissioned by AHRQ. Many people look forward to economies of data, where patients can make money by selling data (how much is my blood pressure reading worth to you?).

Medyear treats personal health data like Twitter feeds, letting you control the dissemination of individual data fields through hash tags. You could choose to share certain data with your family, some with your professional care team, and some with members of your patient advocacy network. This offers an alternative to using services such as PatientsLikeMe, which use participants’ data behind the scenes.
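As a rough sketch of what field-level sharing might look like in code, here is a minimal example. The field names, audience tags, and structure are assumptions for illustration, not Medyear’s actual data model.

```python
# Sketch of field-level sharing: each data field carries its own audience tags,
# and a viewer only sees the fields tagged for a group they belong to.
# The structure and tag names are illustrative, not any vendor's actual model.

health_record = {
    "blood_pressure": {"value": "128/82", "share_with": {"family", "care_team"}},
    "mood_log":       {"value": "stable", "share_with": {"care_team"}},
    "step_count":     {"value": 9400,     "share_with": {"family", "advocacy_network"}},
}

def view_for(record, group):
    """Return only the fields the owner has tagged for this audience."""
    return {field: entry["value"]
            for field, entry in record.items()
            if group in entry["share_with"]}

print(view_for(health_record, "family"))     # blood_pressure, step_count
print(view_for(health_record, "care_team"))  # blood_pressure, mood_log
```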

Open data can be simulated by semi-open data sets that researchers can use under license, as with the Genetic Association Information Network that controls the Database of Genotypes and Phenotypes (dbGaP). Many CMS data sets are actually not totally open, but require a license to use.

And many data owners create relationships with third-party developers that allow them access to data. Thus, the More Disruption Please program run by athenahealth allows third-party developers to write apps accessing patient data through an API, once the developers sign a nondisclosure agreement and a Code of Conduct promising to use the data for legitimate purposes and respect privacy. These apps can then be offered to athenahealth’s clinician clients to extend the system’s capabilities.

Some speakers went even farther at the Datapalooza, asking whether raw data needs to be shared at all. Adriana Lukas of London Quantified Self and Stephen Friend of Sage Bionetworks suggested that patients hold on to all their data and share just “meanings” or “methods” they’ve found useful. The future of health analytics, it seems to me, will use relatively few open data sets, and lots of data obtained through patient consent or under license.

HIPAA Security and Audits with Mac McMillan

Posted on May 20, 2014 | Written By John Lynn

If you missed the recent HIPAA Privacy and Security hangout I did with Mac McMillan, CEO of Cynergistek, you’re missing out. I think this HIPAA interview is an extension of what we started in our post “6 Reality Checks of HIPAA Compliance.” There’s a real awakening that’s needed when it comes to HIPAA. I love in this hangout when Mac says that the patience in Washington for those who aren’t HIPAA compliant is running low. An example of that is another topic we discuss: HIPAA audits. The first round of HIPAA audits was more of a barometer of what was happening. The next round will likely be much more damaging.

Watch the entire HIPAA interview with Mac McMillan to learn even more:

Healthcare Risks, Privacy Risks, and Blowing Up MU

Posted on May 18, 2014 | Written By John Lynn


All of healthcare has risks. The key is getting a good grasp of all the risks. Are we doing that really well in healthcare IT and EHR?


I repeatedly find that most people are happy to give up some privacy risk for the potential for better health. This increases even more when someone is seriously sick. Privacy becomes even less important to them.


I always love to see someone I’ve never met or heard of tweeting out my articles. Tim did a good job summarizing my post about blowing up meaningful use. The post has gotten some good traction and a great discussion. I’m sure they won’t take my exact approach, but I hope it will help push ONC to move MU in a direction of extreme simplification.

Where Are the Big Business Associate HIPAA Breaches?

Posted on April 29, 2014 | Written By John Lynn

It seems like I have HIPAA and security on my mind lately. It started with me writing about the 6 HIPAA Compliance Reality Checks whitepaper and then carried over with my piece looking at whether cloud adoption addresses security and privacy concerns. In the latter post, there’s been a really rich discussion around whether an enterprise organization can secure its systems better than most healthcare organizations.

As part of that discussion I started thinking about the HHS HIPAA Wall of Shame. Offhand, I couldn’t think of any incidents where a business associate (i.e., a healthcare cloud provider) was ever posted on the wall, or any reports of major HIPAA breaches by a large business associate. Do you know of some that I’ve just missed?

When I looked at the HIPAA Wall of Shame, there wasn’t even a covered entity type for business associates. I guess they’re not technically covered entities, even though they act like one now thanks to HIPAA Omnibus. Maybe that’s why we haven’t heard of any and we don’t see any listed? However, there is a filter on the HIPAA breach disclosure page that says “Business Associate Present?” If you use that filter, 277 of the breaches had a “business associate present.” Compare that with the 982 breaches they have posted since they started in late 2009.
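For context, the arithmetic on those counts is straightforward:

```python
# Share of posted breaches where a business associate was present,
# using the counts cited above (277 of 982).
ba_present = 277
total_breaches = 982
print(f"{ba_present / total_breaches:.1%} of posted breaches involved a business associate")
# -> 28.2%
```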

I took a minute to dig into some of the other numbers. Since they started in 2009, they’ve reported breaches that affected 31,319,872 lives. My rough estimate for 2013 (which doesn’t include some breaches that occurred over a period of time) is 7.25 million lives affected. So far in 2014 they’ve posted HIPAA breaches with 478,603 lives affected.

Certainly, HIPAA Omnibus only went into effect late last year. However, I wonder if HHS plans to expand the HIPAA Wall of Shame to include breaches by business associates. You know that they’re already happening or that they’re going to happen. Although perhaps not as often, if you believe my previous piece on them being more secure.

As I considered why we don’t know of other HIPAA business associate breaches, I wondered why else we might not have heard more. I think it’s naive to think that none of them have had issues. Statistics alone tell us otherwise. I do wonder if there is simply not a culture of following HIPAA guidelines, so we don’t hear about them.

Many healthcare business associates don’t do much more than pay lip service to HIPAA. Many don’t realize that under the new HIPAA Omnibus they’re going to be held accountable much like a covered entity. If they don’t know those basic things, can we expect them to disclose when there’s been a HIPAA breach? Healthcare organizations now have that culture of disclosure. I’m not sure the same can be said for business associates.

Then again, maybe I’m wrong and business associates are just so much better at HIPAA compliance, security and privacy, that there haven’t been any major breaches to disclose. If that’s the case, it won’t last forever.