
There’s More to HIPAA Compliance Than Encryption

Posted on March 24, 2015 | Written By

The following is a guest blog post by Asaf Cidon, CEO and Co-Founder of Sookasa.
The news that home care provider Amedisys had a HIPAA breach involving more than 100 lost laptops—even though they contained encrypted PHI—might have served as a wake-up call to many healthcare providers.  Most know by now that they need to encrypt their files to comply with HIPAA and prevent a breach. While it’s heartening to see increased focus on encryption, it’s not enough to simply encrypt data. To ensure compliance and real security, it’s critical to also manage and monitor access to protected health information.

Here’s what you should look for from any cloud-based solution to help you remain compliant.

  1. Centralized, administrative dashboard: The underlying goal of HIPAA compliance is to ensure that organizations have meaningful control over their sensitive information. In that sense, a centralized dashboard is essential: it gives the practice a lens into the activities of the entire organization. HIPAA also stipulates that providers be able to get emergency access to necessary electronic protected health information in urgent situations, and a centralized, administrative dashboard that’s available on the web can provide just that.
  2. Audit trails: A healthcare organization should be able to track every encrypted file across the entire organization. That means logging every modification, copy, access, or share operation made to encrypted files—and associating each with a particular user. (A rough sketch of how audit trails, device blocking, and access revocation might fit together follows this list.)
  3. Integrity control: HIPAA rules mandate that providers be able to ensure that ePHI security hasn’t been compromised. Often, that’s an element of the audit trails. But it also means that providers should be able to preserve a complete history of confidential files to help track and recover any changes made to those files over time. This is where encryption can play a helpful role too: it can render files impossible to modify without access to the private encryption keys.
  4. Device loss / theft protection: The Amedisys situation illustrates the real risk posed by lost and stolen devices. Amedisys took the important first step of encrypting sensitive files. But it isn’t the only one to take. When a device is lost or stolen, it might seem like there’s little to be done. But steps can and should be taken to decrease the impact of a breach in progress. Certain cloud security solutions provide a device block feature, which administrators can use to remotely wipe the keys associated with certain devices and users so that the sensitive information can no longer be accessed. Automatic logoff also helps, because terminating a session after a period of inactivity can help prevent unauthorized access.
  5. Employee termination help: Procedures should be implemented to prevent terminated employees from accessing ePHI. But the ability to physically block a user from accessing information takes it a step further. Technical tools such as a button that revokes or changes access permissions in real time can make a big impact.
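
To make items 2, 4, and 5 more concrete, here is a minimal sketch in Python of how per-user audit trails and remote key revocation might fit together. The names here (AuditLog, KeyServer, record_event, revoke_device_keys) are hypothetical and not any particular vendor’s API; treat this as an illustration of the pattern under those assumptions, not a definitive implementation.

```python
# Minimal sketch (hypothetical API): per-user audit trails plus remote key revocation.
import datetime
from dataclasses import dataclass
from typing import List


@dataclass
class AuditEvent:
    user_id: str
    device_id: str
    file_id: str
    action: str                      # e.g. "access", "modify", "copy", "share"
    timestamp: datetime.datetime


class AuditLog:
    def __init__(self):
        self.events: List[AuditEvent] = []

    def record_event(self, user_id, device_id, file_id, action):
        """Log every operation on an encrypted file and tie it to a specific user."""
        self.events.append(
            AuditEvent(user_id, device_id, file_id, action, datetime.datetime.utcnow())
        )

    def history(self, file_id):
        """Integrity control: reconstruct the complete history of a single file."""
        return [e for e in self.events if e.file_id == file_id]


class KeyServer:
    """Holds per-device encryption keys so they can be revoked remotely."""

    def __init__(self):
        self.device_keys = {}        # device_id -> key material
        self.device_owner = {}       # device_id -> user_id

    def register(self, device_id, user_id, key):
        self.device_keys[device_id] = key
        self.device_owner[device_id] = user_id

    def revoke_device_keys(self, device_id):
        """Device block: wipe the keys for a lost or stolen device."""
        self.device_keys.pop(device_id, None)

    def revoke_user(self, user_id):
        """Employee termination: revoke every device registered to a user."""
        for device_id, owner in list(self.device_owner.items()):
            if owner == user_id:
                self.revoke_device_keys(device_id)
```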

Of course encryption is still fundamental to HIPAA compliance. In fact, it should be at the center of any sound security policy—but it’s not the only step to be taken. The right solution for your practice will integrate each of these security measures to help ensure HIPAA compliance—and overall cyber security.

About Asaf Cidon
Asaf Cidon is CEO and co-founder of cloud security company Sookasa, which encrypts, audits and controls access to files on Dropbox and connected devices, and complies with HIPAA and other regulations. Cidon holds a Ph.D. from Stanford University, where he specialized in mobile and cloud computing.

The Future Of…Healthcare Security

Posted on March 13, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This post is part of the #HIMSS15 Blog Carnival which explores “The Future of…” across 5 different healthcare IT topics.

Security is top of mind for most healthcare boards. I think the instruction from these boards to CIOs is simple: Keep Us Out of the News!

That’s an order that’s much easier said than done. If Google and Anthem can’t stay out of the news because of a breach, then a hospital or doctor’s office is fighting an uphill battle. Still don’t believe me? Check out this visualization of internet attacks. It’s pretty scary stuff.

The reality is that you don’t really win a security battle. You can only defend against attacks as well as possible with the limited resources you have available. What is clear is that healthcare, while still resource-constrained, will be investing more in security and privacy than it ever has before.

The future of effective security in healthcare belongs to organizations that bake security into everything they do. Instead of just hiring a chief security officer who worries about and advocates for security, we need a culture of security in healthcare organizations. This starts at the top, where the leader is always asking how we’re addressing security. That leadership will then trickle down into the culture of the company.

Let’s also be clear that security doesn’t have to be at odds with innovation and technology. In fact, technology can take our approach to security and privacy to the next level. Tell me: how did you know who read a chart in the paper chart world? Oh yes, that sign-out sheet that people always forgot to sign. Oh wait, the fingerprints on the chart were checked. It’s almost ludicrous to think about. Let’s be real. In the paper chart world we put in processes to try to keep the wrong people from getting their hands on the chart, but we really had no idea who saw it. The opposite is true in an EHR world. We know exactly who saw what, who changed what, and when and where. (Note: Some EHRs are better than others at this, but a few lawsuits will get them all up to par.)

The reality is that technology can take security and privacy to a level we could never have dreamed of. We can implement granular access controls that are hard and fast, monitored, and audited. That’s a powerful part of the future of security and privacy in healthcare. Remember that many healthcare breaches come from people who have a username and password, not from some outside hacker.
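
As a rough illustration of what “granular, monitored, and audited” access control can look like, here is a minimal sketch in Python. The treatment-relationship rule and all the names (care_team, open_chart, ACCESS_LOG) are assumptions made up for this example rather than any specific EHR’s mechanism; the point is simply that every attempt, allowed or denied, leaves an audit record.

```python
# Minimal sketch of a granular, audited access check (hypothetical rule and names).
import datetime

ACCESS_LOG = []   # in a real EHR this would be a durable, tamper-evident store


def can_access(user, patient, care_team):
    """Allow access only when the user is on the patient's care team."""
    return user in care_team.get(patient, set())


def open_chart(user, patient, care_team):
    """Enforce the rule and record every attempt, allowed or denied."""
    allowed = can_access(user, patient, care_team)
    ACCESS_LOG.append({
        "user": user,
        "patient": patient,
        "allowed": allowed,
        "when": datetime.datetime.utcnow().isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{user} is not authorized for {patient}")
    return f"chart:{patient}"   # stand-in for the real chart payload


# Example: a denied attempt is still logged, so it can be reviewed later.
care_team = {"patient-001": {"dr-smith", "nurse-jones"}}
open_chart("dr-smith", "patient-001", care_team)
try:
    open_chart("billing-temp", "patient-001", care_team)
except PermissionError:
    pass
print(len(ACCESS_LOG), "attempts audited")
```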

A culture of security and privacy embraces the ability to track when and what happens to every piece of PHI in the organization. Plus, this culture has to be built into the procurement process, the implementation process, the training process, etc. Gone are the days of the chief security officer scapegoat. Technology is going to show very clearly who is responsible.

While I’ve described a rosy future built around a culture of privacy and security, I’m not naive. The future of healthcare security also includes a large number of organizations that continue to live a security life of “ignorance is bliss.” These organizations will pay lip service to privacy and security, but won’t actually make the culture change that privacy and security require. They’ll continue the “Just Enough Culture of HIPAA Compliance.”

In the future we’ll have to be careful not to let one organization’s ignorance stand in for a broad description of healthcare in general. The Sutter Health breach offers a great example of this. In that incident, Sutter Health CPMC found the breach during a proactive audit of their EHR. Here’s the lesson learned from that breach:

The other lesson we need to take from this HIPAA breach notification is that we shouldn’t be so quick to judge an organization that proactively discovers a breach. If we’re too punitive with healthcare organizations that find and effectively address a breach like this, then organizations will stop finding and reporting these issues. We should want healthcare organizations that have a culture of privacy and security. Part of that culture is that they’re sometimes going to catch bad actors whom they need to correct.

Healthcare IT software like EHRs has a great ability to track everything that’s done, and it’s only going to get better at doing it. That’s a good thing, and healthcare information security and privacy will benefit from it. We should encourage rather than ridicule organizations like CPMC for their proactive efforts to protect the privacy of their patients’ information. I hope we see more organizations like Sutter Health take a proactive approach to the security and privacy of healthcare information.

In fact the title of the blog post linked above is a warning for the future of healthcare IT: “Will Hospitals Be At Risk for HIPAA Audits If They Don’t Have HIPAA Violations?”

Security and privacy will be part of the fabric of everything we do in healthcare IT. We can’t ignore them. In order for patients to trust these healthcare apps, security will have to be a feature. Those in healthcare IT that don’t include security as a feature will be on shaky ground.

Were Anthem, CHS Cyber Security Breaches Due to Negligence?

Posted on February 19, 2015 | Written By

Katherine Rourke is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Not long ago, health insurance giant Anthem suffered a security breach of historic proportions, one which exposed personal data on as many as 80 million current and former customers. While Anthem is taking steps to repair the public relations damage, it’s beginning to look like even its $100 million cyber security insurance policy is ludicrously inadequate to address what could be an $8B to $16B problem. (That’s assuming, as many cyber security pros do, that it costs $100 to $200 per customer exposed to restore normalcy.)

But the full extent of the healthcare industry hack may be even greater than that. As information begins to filter out about what happened, a Forbes report suggests that the cyber security intrusion at Anthem may be linked to another security breach — exposing 4.5 million records — that took place less than six months ago at Community Health Systems:

Analysis of open source information on the cybercriminal infrastructure likely used to siphon 80 million Social Security numbers and other sensitive data from health insurance giant Anthem suggests the attackers may have first gained a foothold in April 2014, nine months before the company says it discovered the intrusion.
– Brian Krebs, “Anthem Breach May Have Started in April, 2014”

Class action suits against CHS were filed last August, alleging negligence by the hospital giant. Anthem also faces class action suits alleging security negligence in Indiana, California, Alabama and Georgia. But the damage to both companies’ image has already been done, damage that can’t be repaired by even the most favorable legal outcome. (In fact, the longer these cases linger in court, the more time the public has to permanently brand the defendants as having been irresponsible.)

What makes these exploits particularly unfortunate is that they may have been quite preventable. Security experts say Anthem, along with CHS, may well have been hit by a well-known and frequently leveraged vulnerability in the OpenSSL cryptographic software library known as the Heartbleed Bug. A fix for Heartbleed, a bug introduced in 2011, has been available since April of last year. Though outside experts haven’t drawn final conclusions, many have surmised that neither Anthem nor CHS applied the fix that would have protected them against Heartbleed.

Both companies have released defensive statements contending that these security breaches were due to tremendously sophisticated attacks — something they’d have to do even if a third-grade script kiddie hacked their infrastructure. But the truth is, note security analysts, the attacks almost certainly succeeded because of a serious lack of internal controls.

By gaining admin credentials to the database there was nothing ‒ including encryption ‒ to stop the attack. The only thing that did stop it was a lucky administrator who happened to be paying attention at the right time.
– Ken Westin, Senior Security Analyst at Tripwire

As much as these companies would like to convince us that the cyber security breaches weren’t really their fault — that they were victims of exotic hacker gods with otherworldly skills — the bottom line is that this doesn’t seem to be true.

If Anthem and CHS are going to point fingers rather than stiffen up their cyber security protocols, I’d advise that they a) buy a lot more security breach insurance and b) hire a new PR firm. What they’re doing obviously isn’t working.

HIPAA Compliance and Windows Server 2003

Posted on February 12, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

Last year, Microsoft stopped updating Windows XP, and so we wrote about how Windows XP would no longer be HIPAA compliant. If you’re still using Windows XP to access PHI, you’re a braver person than I. That’s just asking for a HIPAA violation.

It turns out that Windows Server 2003 is five months away from the end of Microsoft support as well. This could be an issue for the many practices that have a local EHR installed on Windows Server 2003. I’d be surprised if an EHR vendor or practice management vendor were still running a SaaS EHR on Windows Server 2003, but I guess it’s possible.

Microsoft also just recently announced another critical vulnerability in Windows Server 2003 involving Active Directory. Here are the details:

Microsoft just patched a 15-year-old bug that in some cases allows attackers to take complete control of PCs running all supported versions of Windows. The critical vulnerability will remain unpatched in Windows Server 2003, leaving that version wide open for the remaining five months Microsoft pledged to continue supporting it.

There are a lot more technical details at the link above. However, I find it really interesting that Microsoft has chosen not to fix this issue in Windows Server 2003. The article above says “This Windows vulnerability isn’t as simple as most to fix because it affects the design of core Windows functions rather than implementations of that design.” I assume this is why they’re not planning to do an update.

This lack of an update to a critical vulnerability has me asking whether Windows Server 2003 can still be considered HIPAA compliant. I think the answer is no: unsupported systems, or systems with known vulnerabilities, are a problem under HIPAA as I understand it. It’s hard to say how many healthcare organizations are still using Windows Server 2003, but this vulnerability should give them a good reason to upgrade ASAP.

Amazing Live Visualization of Internet Attacks

Posted on October 22, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

I recently heard Elliot Lewis, Dell’s Chief Security Architect, comment that “The average new viruses per day is about 5-10k appearing new each day.” To be honest, I wasn’t quite sure how to process that volume of viruses. It felt pretty unbelievable to me, even though I figured he was right.

Today, I came across this amazing internet attack map by Norse which illustrates a small portion of the attacks that are happening on the internet in real time. I captured a screenshot of the map below, but you really need to check out the live map to get a feel for how many internet attacks are happening. It’s astounding to watch.

Norse - Internet Attack Map

For those tech nerds out there, here’s the technical description of what’s happening on the map:

Every second, Norse collects and analyzes live threat intelligence from darknets in hundreds of locations in over 40 countries. The attacks shown are based on a small subset of live flows against the Norse honeypot infrastructure, representing actual worldwide cyber attacks by bad actors. At a glance, one can see which countries are aggressors or targets at the moment, using which type of attacks (services-ports).

It’s worth noting that these are attacks that are happening, not necessarily attacks that are succeeding. Just because something is getting attacked doesn’t mean the attack was successful; the large majority of attacks aren’t. However, when the volume of attacks is this large (and that map shows only a small portion of them), only a small number need to succeed to wreak a lot of havoc.

If this type of visualization doesn’t make you stop and worry just a little bit, then you’re not human. There’s a lot of crazy stuff going on out there. It’s actually quite amazing that with all the crazy stuff that’s happening, the internet works as well as it does.

Hopefully this visualization will wake up a few healthcare organizations to be just a little more serious about their IT security.

How Secure Are Wearables?

Posted on October 1, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

JaneenB asks a really fantastic question in this tweet. Making sure that wearables are secure is going to be a really hot topic. Yesterday, I was talking with Mac McMillan from Cynergistek and he suggested that the FDA was ready to make medical device security a priority. I’ll be interested to see what the FDA does to try and regulate security in medical devices, but you can see why this is an important thing. Mac also commented that while it’s incredibly damaging for someone to hack a pacemaker like the one Vice President Cheney had (has?), the bigger threat is the 300 pumps that are installed in a hospital. If one of them can be hacked, they all can be hacked and the process for updating them is not simple.

Of course, Mac was talking about medical device security from more of an enterprise perspective. Now, let’s think about this across millions of wearable devices that are used by consumers. Plus, many of these consumer wearable devices don’t require FDA clearance and so the FDA won’t be able to impose more security restrictions on them.

I’m not really sure of the answer to this problem of wearable security, although I think two steps in the right direction would help. First, health wearable companies should build a culture of security into their company and their product. This will add a little expense on the front end, but it will more than pay off on the back end when they avoid security issues that could literally leave the company in financial ruin. Second, we could use some organization to take on the effort of reporting on the security (or lack thereof) of these devices. I’m not sure if this is a consumer-reports-type organization or a media company. However, I think the idea of someone holding organizations accountable is important.

We’re definitely heading towards a world of many connected devices. I don’t think we have a clear picture of what this means from a security perspective.

Complete Health IT Security is a Myth, But That Doesn’t Mean We Shouldn’t Try

Posted on August 11, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

As I mentioned, last week I had the opportunity to attend the Black Hat conference in Las Vegas. There were over 9,000 attendees and 180+ speakers sharing the latest and greatest IT security and privacy topics. Black Hat is more appropriately called a hackers’ conference (although Defcon is more hardcore hacker than Black Hat, which had plenty of corporate presence) for good reason. You turn off your devices and are careful what you do. There’s a certain paranoia that comes when one of the vendor handouts is a foil credit card cover that prevents someone from stealing your credit card number. I didn’t quite have my tin foil hat on, but you could start to understand the sentiment.

One of the most interesting things about Black Hat is getting an idea of the mentality of the hacker. Their creative process is fascinating. Their ability to work around obstacles is something we should all learn to incorporate into our lives. For most of these hackers, there’s never a sense that something can’t be done; it’s just a question of figuring out a way around whatever obstacles are in their way. We could use a little more of this mentality in dealing with the challenges of healthcare.

The biggest thing I was reminded of at the event was that complete security and privacy is a myth. If someone wants to get into something badly enough, they’ll find a way. As one security expert I met told me, the only secure system is one that’s turned off, not connected to anything, and buried underground. If a computer or device is turned on, then it’s vulnerable.

The reality is that complete security shouldn’t be our goal. Our goal should be to make our systems secure enough that it’s not worth someone’s time or effort to break through. I can assure you that most of healthcare is far from this level of security. What a tremendous opportunity this presents.

The first place to start in any organization is to create a culture of security and privacy. The one-off efforts that most organizations apply after a breach or an audit aren’t going to get us there. Instead, you have to incorporate a thoughtful approach to security into everything you do. This starts with the RFP, continues through the procurement process, extends into the implementation, and carries on through the maintenance of the product.

Security and privacy efforts in an organization are hard to justify since they don’t increase the bottom line. This is another reason why the efforts need to be integrated into everything that’s done and not just tied to a specific budget line item. As a budget line item, it’s too easy to cut it out when budgets get tight. The good news is that a little effort throughout the process can avoid a lot of heartache later on. Ask an organization that’s had a breach or failed an audit.

Unfinished Business: More HIPAA Guidelines to Come

Posted on August 4, 2014 | Written By

The following is a guest blog post by Rita Bowen, Sr. Vice President of HIM and Privacy Officer at HealthPort.

After all of the hullabaloo since the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) released the HIPAA Omnibus Rule, it’s humbling to realize that the work is not complete. While the Omnibus Rule covered a lot of territory in providing new guidelines for the privacy and security of electronic health records, the Final Rule failed to address three key pieces of legislation that are of great relevance to healthcare providers.

The three areas include the “minimum necessary” standard; whistleblower compensation; and revised parameters for electronic health information (EHI) access logs. No specific timetable has been provided for the release of revised legislation.

Minimum Necessary

The minimum necessary standard requires providers to “take reasonable steps to limit the use or disclosure of, and requests for, protected health information to the minimum necessary to accomplish the intended purpose.”

This requires matching the intent of the request with the review of the health information to ensure that only the minimum information intended for the authorized release is provided. To date, HHS has conducted a variety of evaluations and is in the process of assessing that data.

Whistleblower Compensation

The second bit of unfinished legislation is a proposed rule being considered by HHS that would dramatically increase the payment to Medicare fraud whistleblowers. If adopted, the program, called the Medicare Incentive Reward Program (IRP), will raise payments from a current maximum of $1,000 to nearly $10 million.

I believe that the added incentive will create heightened sensitivity to fraud and that more individuals will be motivated to act. People are cognizant of fraudulent situations but they have lacked the incentive to report, unless they are deeply disgruntled.

Per the proposed plan, reports of fraud can be made simply by making a phone call to the correct reporting agency, which should facilitate whistleblowing.

Access Logs

The third, and most contentious, area of concern is EHI access logs. The proposed legislation calls for a single log to be created and provided to the patient that would contain all instances of access to the patient’s EHI, no matter the system or situation.

From a patient perspective, the log would be unwieldy, cumbersome and extremely difficult to decipher. An even more worrisome aspect is the privacy of healthcare workers.

Employees sense that their own privacy would be invaded if regulations required that their information, including their names and other personal identifiers, be shared as part of the access log. Many healthcare workers have raised concerns about their own safety if this information is made openly available. This topic has received a tremendous amount of attention.

Alternate plans under discussion would negotiate the content of access logs, tailoring them to contain appropriate data about the individuals a patient asks about, while still satisfying patients and protecting the privacy of providers.

The Value of Data Governance

Most of my conversations circle back to the value of information (or data) governance. This situation of unfinished EHI design and management is no different. Once released, the new legislation for the “minimum necessary” standard, whistleblower compensation and revised parameters for access logs must be woven into your existing information governance plan.

Information governance is authority and control—the planning, monitoring and enforcement—of your data assets, which could be compromised if all of the dots are not connected. Organizations should be using this time to build the appropriate foundation to their EHI.

About the Author:
Rita Bowen, MA, RHIA, CHPS, SSGB

Ms. Bowen is a distinguished professional with 20+ years of experience in the health information management industry.  She serves as the Sr. Vice President of HIM and Privacy Officer of HealthPort where she is responsible for acting as an internal customer advocate.  Most recently, Ms. Bowen served as the Enterprise Director of HIM Services for Erlanger Health System for 13 years, where she received commendation from the hospital county authority for outstanding leadership.  Ms. Bowen is the recipient of Mentor FORE Triumph Award and Distinguished Member of AHIMA’s Quality Management Section.  She has served as the AHIMA President and Board Chair in 2010, a member of AHIMA’s Board of Directors (2006-2011), the Council on Certification (2003-2005) and various task groups including CHP exam and AHIMA’s liaison to HIMSS for the CHS exam construction (2002).

Ms. Bowen is an established speaker on diverse HIM topics and an active author on privacy and legal health records.  She served on the CCHIT security and reliability workgroup and as Chair of Regional Committees East-Tennessee HIMSS and co-chair of Tennessee’s e-HIM group.  She is an adjunct faculty member of the Chattanooga State HIM program and UT Memphis HIM Master’s program.  She also serves on the advisory board for Care Communications based in Chicago, Illinois.

Criminals Have Their Eyes on Your Patients’ Records

Posted on June 26, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

The following is a guest blog post by Art Gross, Founder of HIPAA Secure Now!
It’s one thing to have a laptop stolen with 8,000 patient records or for a disgruntled doctor to grab his patients’ records and start his own practice.  It’s another when the Cosa Nostra steals that information, siphons money from the patient’s bank account and turns it into a patient trafficking crime ring.  Welcome to organized crime in the age of big data.

Organized crime syndicates and gangs targeting medical practices and stealing patient information are on the rise. They’re grabbing patient names, addresses, insurance details, social security numbers, birth dates, etc., and using it to steal patients’ identities and their assets.

It’s not uncommon for the girlfriend of a gang member to infiltrate a medical practice or hospital, gain access to electronic health records, download patient information and hand it over to the offender who uses it to file false tax returns. In fact gang members often rent a hotel room and file the returns together, netting $40,000-$50,000 in one night!

Florida is a hotbed for this activity, and it’s spreading across the country. In California, narcotics investigators took down a methamphetamine ring and confiscated information on 4,500 patients. Investigators believe the stolen information was being used to obtain prescription drugs to make the illicit drug.

Value of patient records

Stolen patient information comes with a high price tag if the medical practice is fined under HIPAA. One lost or stolen patient record is estimated at $50, compared to the price of a credit card record, which fetches a dollar. Patient records are highly lucrative. The table below shows the value of patient information that might be sitting in an EHR system:

Amount of Patient Records | Value of Patient Records
1,000                     | $50,000
5,000                     | $250,000
10,000                    | $500,000
100,000                   | $5,000,000

Protect your practice

Medical practices need to realize they are vulnerable to patient record theft and should take steps to reduce their risk by implementing additional security.  Here are seven steps that organizations can take to protect electronic patient information:

  1. Perform a security risk assessment – a security risk assessment is not only required for HIPAA Compliance and EHR Meaningful Use but it can identify security risks that may allow criminals to steal patient information.
  2. Screen job applicants – all job applicants should be properly screened prior to hiring and providing access to patient information. Look for criminal records, frequent job switches or anything else that might be a warning sign.
  3. Limit access to patient information – employees should have minimal access necessary to perform their jobs rather than full access to electronic health records.
  4. Audit access to patient information – every employee should use their own user ID and password; login information should not be shared. And access to patient information should be recorded, including who accessed, when, and which records they accessed.
  5. Review audit logs – organizations must keep an eye on audit logs. Criminal activity can happen during a normal business day, and reviewing audit logs can uncover strange or unexpected activity. Let’s say an employee accesses, on average, 10 patient records per day, and on one particular day they retrieve 50 to 100 records. Or records are being accessed after business hours. Both activities could be a sign of criminal activity. The key is to review audit logs regularly and look for unusual access (a simple automated check along these lines is sketched after this list).
  6. Security training – all employees should receive security training on how to protect patient information, and make sure they know any patient information activity is being logged and reviewed.  Knowing that employee actions are being observed should dissuade them from using patient information illegally.
  7. Limit the use of USB drives – in the past it would take a truck to steal 10,000 patient charts. Now they can easily be copied onto a small thumb/USB drive and slipped into a  doctor’s lab coat.  Organizations should limit the use of USB drives to prevent illegal activity.
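
To illustrate what a regular audit-log review might automate, here is a minimal Python sketch. The log format, the business-hours window, and the thresholds are all assumptions made up for this example and not part of any EHR product; the point is just to show how a spike in a user’s daily access volume or an after-hours access could be flagged for a human to review.

```python
# Minimal sketch of an automated audit-log review (hypothetical log format and thresholds):
# flag users whose daily access volume spikes well above their own average, and any
# access that happens outside business hours.
from collections import defaultdict
from datetime import datetime

BUSINESS_HOURS = range(7, 19)          # assumption: 7am-7pm counts as normal


def review_audit_log(entries, spike_factor=3.0, minimum_count=25):
    """entries: list of dicts like {"user": ..., "record": ..., "time": datetime}."""
    daily_counts = defaultdict(lambda: defaultdict(int))
    flags = []

    for e in entries:
        daily_counts[e["user"]][e["time"].date()] += 1
        if e["time"].hour not in BUSINESS_HOURS:
            flags.append(f'{e["user"]} accessed {e["record"]} after hours at {e["time"]}')

    for user, counts in daily_counts.items():
        average = sum(counts.values()) / len(counts)
        for day, count in counts.items():
            if count > spike_factor * average and count > minimum_count:
                flags.append(f"{user} pulled {count} records on {day} (average {average:.0f}/day)")

    return flags


# Example: an employee who averages ~10 records a day suddenly pulls 60, plus one
# late-night access. Both show up in the review.
log = []
for day in range(2, 7):                # five normal days
    log += [{"user": "jdoe", "record": f"r{i}", "time": datetime(2014, 6, day, 10)} for i in range(10)]
log += [{"user": "jdoe", "record": f"x{i}", "time": datetime(2014, 6, 9, 14)} for i in range(60)]
log.append({"user": "jdoe", "record": "r99", "time": datetime(2014, 6, 9, 23)})

for warning in review_audit_log(log):
    print(warning)
```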

The high resale value of patient information and the ability to use it to file false tax returns or acquire illegal prescriptions make it a prime target for criminals. Medical practices need to recognize the risk and put proper IT security measures in place to keep their patient information from “securing” hefty tax refunds.

About Art Gross

Art Gross co-founded Entegration, Inc. in 2000 and serves as President and CEO. As Entegration’s medical clients adopted EHR technology Gross recognized the need to help them protect patient data and comply with complex HIPAA security regulations. Leveraging his experience supporting medical practices, in-depth knowledge of HIPAA compliance and security, and IT technology, Gross started his second company HIPAA Secure Now! to focus on the unique IT requirements of medical practices.  Email Art at artg@hipaasecurenow.com.

Another View of Privacy by Dr. Deborah C. Peel, MD

Posted on June 25, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network (full bio above).

I thought the following TEDx video from Deborah C. Peel, MD, Founder and Chair of Patient Privacy Rights, would be an interesting contrast with some of the things that Andy Oram wrote in yesterday’s post titled “Not So Open: Redefining Goals for Sharing Health Data in Research”. Dr. Peel is incredibly passionate about protecting patients’ privacy and is working hard on that goal.

Dr. Peel is also trying to kick off a hashtag called #MyHealthDataIsMine. What do you think of the “hidden privacy and data breaches” that Dr. Peel talks about in the video? I look forward to hearing your thoughts on it.