Big Brother Or Best Friend?

The premise of clinical decision support (CDS) is simple and powerful: humans can’t remember everything, so enter data into a computer and let the computer render judgment. So long as the data is accurate and the rules in the computer are valid, the computer will be correct the vast majority of the time.

CDS is commonly implemented in computerized provider order entry (CPOE) systems across most order types – labs, drugs, radiology, and more. A simple example: most pediatric drugs require weight-based dosing. When physicians order drugs for pediatric patients using CPOE, the computer should validate the dose of the drug against the patient’s weight to ensure the dose is in the acceptable range. Given that the computer has all of the information necessary to calculate acceptable dose ranges, and the fact that it’s easy to accidentally enter the wrong dose into the computer, CDS at the point of ordering delivers clear benefits.
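To make the weight-based check concrete, here’s a minimal sketch in Python. The drug name and mg/kg limits are invented for illustration (a real CPOE system would pull them from a maintained formulary), and nothing here is clinical guidance:

```python
# Hypothetical weight-based dose range check at the point of ordering.
# The mg/kg limits below are illustrative placeholders, not clinical values.
DOSE_LIMITS_MG_PER_KG = {
    "exampledrug": (20.0, 90.0),  # (minimum, maximum) mg per kg
}

def check_pediatric_dose(drug, dose_mg, weight_kg):
    """Validate an ordered dose against the patient's weight.

    Returns (ok, message); in a CPOE system, a False result would
    trigger an alert before the order is signed.
    """
    low, high = DOSE_LIMITS_MG_PER_KG[drug]
    per_kg = dose_mg / weight_kg
    if per_kg < low:
        return False, f"{per_kg:.1f} mg/kg is below the {low} mg/kg minimum"
    if per_kg > high:
        return False, f"{per_kg:.1f} mg/kg exceeds the {high} mg/kg maximum"
    return True, f"{per_kg:.1f} mg/kg is within the accepted range"

ok, message = check_pediatric_dose("exampledrug", dose_mg=400, weight_kg=10)
```

A 400 mg order for a 10 kg patient works out to 40 mg/kg, inside the assumed range; an accidental extra zero (4,000 mg) would fall outside it and fire an alert.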

The general notion of CDS – checking to make sure things are being done correctly – is the same fundamental principle behind checklists. In The Checklist Manifesto, Dr. Atul Gawande successfully argues that the challenge in medicine today is not in ignorance, but in execution. Checklists (whether paper or digital) and CDS are realizations of that reality.

CDS in CPOE works because physicians need to enter orders to do their job. But checklists aren’t as fundamentally necessary for any given procedure or action. The checklist can be skipped, and the provider can perform the procedure at hand. Thus, the fundamental problem with checklists is that they insert a layer of friction into workflows: running through the checklist. If checklists could be implemented seamlessly, without introducing any additional workflow friction, they would be more widely adopted and adhered to. The basic problem is that people don’t want to go back to the same repetitive formula for tasks they feel comfortable performing. Given the tradeoff between patient safety and efficiency, checklists have only been seriously discussed in high acuity, high risk settings such as surgery and ICUs. It’s simply not practical to implement checklists for low risk procedures. But even in high acuity environments, many organizations continue to struggle implementing checklists.

So…. what if we could make checklists seamless? How could that even be done?

Looking at CPOE CDS as a foundation, there are two fundamental challenges: collecting data, and checking against rules.

Computers can already access EMRs to retrieve all sorts of information about the patient. But computers don’t yet have any ability to collect data about what providers are and aren’t physically doing at the point of care. Without knowing what’s physically happening, computers can’t present alerts based on skipped or incorrect steps of the checklist. The solution would likely be based on a Kinect-like system that can detect movements and actions. Once the computer knows what’s going on, it can cross-reference what’s happening against what’s supposed to happen given the context of care delivery and issue alerts accordingly.
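The sensing is the hard part; the rules side is comparatively simple. As a rough sketch, suppose the computer receives a stream of recognized actions and compares it against the expected checklist. The step names here are hypothetical:

```python
# Cross-reference observed provider actions (e.g., from a Kinect-like
# sensor) against the checklist expected for the current procedure.
EXPECTED_STEPS = ["hand_hygiene", "confirm_patient_id", "mark_surgical_site"]

def audit_checklist(observed_actions):
    """Return alerts for expected checklist steps not observed in order."""
    alerts = []
    search_from = 0
    for step in EXPECTED_STEPS:
        try:
            # Each step must appear after the previously matched step.
            found_at = observed_actions.index(step, search_from)
            search_from = found_at + 1
        except ValueError:
            alerts.append(f"Checklist step skipped: {step}")
    return alerts

alerts = audit_checklist(["hand_hygiene", "mark_surgical_site"])
```

Here the missing patient-ID confirmation produces an alert the computer could surface in context, before the procedure continues.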

What’s described above is an extremely ambitious technical undertaking. It will take many years to get there. There are already a number of companies trying to address this in primitive forms: SwipeSense detects if providers clean their hands before seeing patients, and the CHARM system uses Kinect to detect hand movements and ensure surgeries are performed correctly.

These early examples are a harbinger of what’s to come. If preventable mistakes are the biggest killer within hospitals, hospitals need to implement systems to identify and prevent errors before they happen.

Let’s assume that the technology evolves into an omniscient, benevolent computer that detects errors and issues warnings. Although this is clearly desirable for patients, what does this mean for providers? Will they become slaves to the computer? Providers already face challenges with CPOE alert fatigue. Just imagine do-anything alert fatigue.

There is an art to telling people that they’re wrong. In order to successfully prevent errors, computers will need to learn that art. Additionally, there must be a cultural shift to support the fact that when the computer speaks up, providers should listen. Many hospitals still struggle today with implementing checklists because of cultural issues. There will need to be a similar cultural shift to enable passive omniscient computers to identify errors and warn providers.

I’m not aware of any omniscient computers that watch people all day and warn them that they’re about to make a mistake. There could be such software for workers in nuclear power plants or other critical jobs in which the cost of being wrong is devastating. If you know of any such software, please leave a comment.

April 9, 2014 | Written By

Kyle is Founder and CEO of Pristine, a company in Austin, TX that develops telehealth communication tools optimized for Google Glass in healthcare environments. Prior to founding Pristine, Kyle spent years developing, selling, and implementing electronic medical records (EMRs) into hospitals. He also writes for EMR and HIPAA, TechZulu, and Svbtle about the intersections of healthcare, technology, and business. All of his writing is reproduced at kylesamani.com

Healthcare Data Centers and Cloud Solutions

As a former system administrator who worked in a number of data centers, it’s been really interesting for me to watch the evolution of healthcare data centers and the concept of healthcare cloud solutions. I think we’re seeing a definite switch by many hospital CIOs towards the cloud and away from the hassle and expense of trying to run their own data centers. Plus, this is facilitated greatly by the increased reliability, speed, and quality of the bandwidth that’s available today. Sure, the largest institutions will still have their own data centers, but even those organizations are working with an outside data center as well.

I had a chance to sit down for a video interview with Jason Mendenhall, Executive Vice President, Cloud at Switch Supernap to discuss the changing healthcare data center and cloud environment. We cover a lot of ground in the interview including when someone should use cloud infrastructure and when they shouldn’t. We talk about the security and reliability of a locally hosted data center versus an outside data center. We also talk a little about why Las Vegas is a great place for them to have their data center.

If you’re a healthcare organization that needs a data center (Translation: All of you) or if you’re a healthcare IT company that needs to host your application (Translation: All of you), then you’ll learn a lot from this interview with Jason Mendenhall:

As a side note, the Switch Supernap’s Innevation Center is the location for the Health IT Marketing and PR Conference I’m organizing April 7-8, 2014 in Las Vegas. If you’re attending the conference, we can also set you up for a tour of the Switch Supernap while you’re in Vegas. The tour is a bit like visiting a tech person’s Disneyland. They’ve created something really amazing when it comes to data centers. I know since a secure text message company I advise, docBeat, is hosted there with one of their cloud partners Itrica.

March 4, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 5000 articles with John having written over 2000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 9.3 million times. John also recently launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

iPad Lifecycle Versus Other Tablets

Every once in a while I like to put my old IT hat back on (I am @techguy on Twitter after all) and look at some of the more physical IT aspects of EMR and healthcare IT. I still get really excited about EMR Technology products and the evolution of these products.

I’ve long argued that most IT administrators would much rather have a set of Windows 8 tablets in their environment than a bunch of iPads or Android tablets. The biggest reasons for this are the security and management of these devices. Most hospital and healthcare IT administrators are comfortable securing a Windows-based device, and they aren’t as comfortable with newer tablets like the iPad or Android tablets. Plus, the tools for managing and imaging Windows-based tablets are much more developed than those for the iPad or Android (although I think both of these are catching up pretty quickly).

While I think both of these arguments are reasonable, I heard two new arguments for why an organization might want to stick with Windows 8 tablets instead of moving to iPads and Androids.

The first reason is that the lifecycle of a Windows 8 machine is much longer than that of an iPad or Android tablet. A Windows tablet you bought 5 years ago could still easily be supported by an IT shop and will work with your various software systems. A 3-year-old iPad could very well not work with your EHR software, and Apple has already stopped supporting OS upgrades on the original iPads, which poses similar HIPAA compliance issues to Windows XP.

The whole release cycle with iPad and Android tablets is intent on replacing the previous versions. They don’t quite make them obsolete, but they’ve been releasing new versions every year with the intent for you to buy a new one every year. This stands in stark contrast to the Windows tablet approach.

Another reason many IT admins will likely lean towards Windows 8 tablets over iPads and Androids is that they’re just generally more rugged. Sure, you can make iPads and Androids more rugged with certain cases, but then you lose the look and feel of having an iPad in your hand and fitting it nicely in your pocket.

This point is accentuated even more when you look at devices like the new Toughpad tablets from Panasonic. They’ve finally got the processing power in these machines to match that of a desktop so they can run any software you want. Plus, they are crazy durable. I saw them at CES last month and a journalist from India was slamming it on the ground and stepping on it and the thing kept ticking without a problem. I don’t need to explain to any of you why durability matters in healthcare where you’re always carrying around multiple items and drops are common.

Of course, the reality is that it’s “sexy” to carry around an iPad while you work. Software vendors are going to continue developing for the iPad and doctors are going to want to be carrying an iPad around with them. IT staff are likely going to have to support iPads and other tablets in their environment. However, when it’s left to the IT staff, you can be sure that the majority of them will be pushing for the more rugged, easier to secure, easier to manage, and longer lifecycle Windows 8 tablets. Unless of course, they’re ordering an iPad for their own “test” environment.

February 13, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 5000 articles with John having written over 2000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 9.3 million times. John also recently launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

Barcodes, Integrating Bedside TVs, EHR, and Nurse Messaging Into Pain Management Workflow

As I look across the healthcare IT landscape, I believe we’re just at the beginning of real integrated solutions that leverage everything technology can offer. However, I see it starting to happen. A good example of this was this case study I found on “Automated Workflow for Pain Management.”

The case study goes into the details of the time savings and other benefits of proper pain management in the hospital. However, I was really intrigued by their integration of bedside TVs together with barcodes, EHR software, and nurse messaging (sadly they used a pager, but that could have easily been replaced with secure messaging). What a beautiful integration and workflow across so many different technologies from different companies, and this is just the start.

One major challenge to these workflows is making these external applications work with the EHR software. Hopefully things like the blog post I wrote yesterday will help solve that problem. Case studies like the one above illustrate really well the value of outside software applications being able to integrate with EHR software.

What I also loved about the above solution is that it doesn’t cause any more work for the hospital staff. In fact, in many ways it can save them time. The nurse can have much higher quality data about who needs them and when.

This implementation is also a preview of what Kyle Samani talked about in his post “Unlocking the Power of Data Science in Healthcare.” While Kyle wrote about it from the perspective of patients and getting them the right information in the right context, the same applies to healthcare providers. The case study above is an example of this shift. No doubt there will be some resistance to these technologies in healthcare, but once they get refined we’ll wonder how we lived without them.

February 7, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 5000 articles with John having written over 2000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 9.3 million times. John also recently launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

Why Will Medical Professionals Use Laptops?

Steve Jobs famously said that “laptops are like trucks. They’re going to be used by fewer and fewer people. This transition is going to make people uneasy.”

Are medical professionals truck drivers or bike riders?

We have witnessed truck drivers turn into bike riders in almost every computing context:

Big businesses used to buy mainframes. Then they replaced mainframes with mini computers. Then they replaced minicomputers with desktops and servers. Small businesses began adopting technology in meaningful ways once they could deploy a local server and clients at reasonable cost inside their businesses. As web technologies exploded and mobile devices became increasingly prevalent, large numbers of mobile professionals began traveling with laptops, tablets and smartphones. Over the past few years, many have even stopped traveling with laptops; now they travel with just a tablet and smartphone.

Consumers have been just as fickle, if not more so. They adopted build-it-yourself computers, then Apple IIs, then mid-tower desktops, then laptops, then ultra-light laptops, and now smartphones and tablets.

Mobile is the most under-hyped trend in technology. Mobile devices – smartphones, tablets, and soon, wearables – are occupying an increasingly larger percentage of total computing time. Although mobile devices tend to have smaller screens and less robust input methods relative to traditional PCs (see why the keyboard and mouse are the most efficient input methods), mobile devices are often preferred because users value ease of use, mobility, and access more than raw efficiency.

The EMR is still widely conceived of as a desktop app with a mobile add-on. A few EMR companies, such as Dr Chrono, are mobile-first. But even in 2014, the vast majority of EMR companies are not mobile-first. The legacy holdouts cite battery, screen size, and lack of a keyboard as reasons why mobile won’t eat healthcare. Let’s consider each of the primary constraints and the innovations happening along each front:

Battery – Batteries are the only computing component not doubling in performance every 2-5 years; battery density continues to improve at a measly 1-2% per year. The battery challenge will be overcome through a few means: breakthroughs in battery density, and increasing efficiency in the most battery-hungry components, screens and CPUs. We are on the verge of the transition to OLED screens, which will drive an enormous improvement in screen energy efficiency. Mobile CPUs are also about to undergo a shift as OEMs’ values change: mobile CPUs have become good enough that the majority of future CPU improvements will emphasize battery performance rather than increased compute performance.

Lack of a keyboard – Virtual keyboards will never offer the speed of physical keyboards. The laggards miss the point that providers won’t have to type as much. NLP is finally allowing people to speak freely. The problem with keyboards isn’t the characteristics of the keyboard, but rather the existential presence of the keyboard itself. Through a combination of voice, natural-language-processing, and scribes, doctors will type less and yet document more than ever before. I’m friends with CEOs of at least half a dozen companies attempting to solve this problem across a number of dimensions. Given how challenging and fragmented the technology problem is, I suspect we won’t see a single winner, but a variety of solutions each with unique compromises.

Screen size – We are on the verge of foldable, bendable, and curved screens. These traits will help resolve the screen size problem on touch-based devices. As eyewear devices blossom, screen size will become increasingly trivial because eyewear devices have such an enormous canvas to work with. Devices such as the MetaPro and AtheerOne will face the opposite problem: data overload. These new user interfaces can present extremely large volumes of robust data across three dimensions. They will mandate a complete re-thinking of presentation and user interaction with information at the point of care.

I find it nearly impossible to believe that laptops have more than a decade of life left in clinical environments. They simply do not accommodate the ergonomics of care delivery. As mobile devices catch up to PCs in terms of efficiency and perceived screen size, medical professionals will abandon laptops in droves.

This begs the question: what is the right form factor for medical professionals at the point of care?

To tackle this question in 2014 – while we’re still in the nascent years of wearables and eyewear computing – I will address the question “what software experiences should the ideal form factor enable?”

The ideal hardware* form factor of the future is:

Transparent: The hardware should melt away and the seams between hardware and software should blur. Modern tablets are quite svelte and light. There isn’t much more value to be had by improving portability of modern tablets; users simply can’t perceive the difference between 0.7 lb and 0.8 lb tablets. However, there is enormous opportunity for improvements in portability and accessibility when devices go handsfree.

Omni-present, yet invisible: There is way too much friction separating medical professionals from the computers that they’re interacting with all day long: physical distance (even the pocket is too far) and passwords. The ideal device of the future is friction free. It’s always there and always authenticated. In order to always be there, it must appear as if it’s not there. It must be transparent. Although Glass isn’t there just yet, Google describes the desired paradox eloquently when describing Glass: “It’s there when you need it, and out of sight when you don’t.” Eyewear devices will trend this way.

Interactive: Despite their efficiency, PC interfaces are remarkably un-interactive. Almost all interaction boils down to a click on a pixel location or a keyboard command. Interacting with healthcare information in the future will be diverse and rich: natural physical movements, subtle winks, voice, and vision will all play significant roles. Although these interactions will require some learning (and un-learning of bad behaviors) for existing staff, new staff will pick them up and never look back.

Robust: Mobile devices of the future must be able to keep up with medical professionals. The devices must have shift-long battery life and be able to display large volumes of complex information at a glance.

Secure: This is a given. But I’ll emphasize it as physical security becomes increasingly important in light of the number of unencrypted hospital laptops being stolen or lost.

Support 3rd party communications: As medicine becomes increasingly complex, specialized, and team-based, medical professionals will share even more information with one another, patients, and their families. Medical professionals will need a device that supports sharing what they’re seeing and interacting with.

I’m fairly convinced (and, to be fair, highly biased as CEO of a Glass-centric company) that eyewear devices will define the future of computer interaction at the point of care. Eyewear devices have the potential to exceed tablets, smartphones, watches, jewelry, and laptops across every dimension above, except perhaps 3rd party communication. Eyewear devices are intrinsically personal, and don’t accommodate others’ prying eyes. If this turns out to be a major detriment, I suspect the problem will be solved through software to share what you’re seeing.

What do you think? What is the ideal form factor at the point of care?

*Software tends to dominate most health IT discussions; however, this blog post is focused on ergonomics of hardware form factors. As such, this list avoids software-centric traits such as context, intelligence, intuition, etc.

February 4, 2014 | Written By

Kyle is Founder and CEO of Pristine, a company in Austin, TX that develops telehealth communication tools optimized for Google Glass in healthcare environments. Prior to founding Pristine, Kyle spent years developing, selling, and implementing electronic medical records (EMRs) into hospitals. He also writes for EMR and HIPAA, TechZulu, and Svbtle about the intersections of healthcare, technology, and business. All of his writing is reproduced at kylesamani.com

Eyes Wide Shut – January, 2014 Meaningful Use Stage 2 Readiness Reality Check

Happy New Year?

As I begin the 2014 Meaningful Use measures readiness assessment and vendor cat-herding exercises, I’m reflecting on this portion of CMS’s Director of E-Health Standards and Services, Robert Tagalicod and the ONC’s Acting National Coordinator Jacob Reider’s statement regarding the Meaningful Use timeline modification: “The goal of this change is two-fold: first, to allow CMS and ONC to focus efforts on the successful implementation of the enhanced patient engagement, interoperability and health information exchange requirements in Stage 2.” (Previously published on EMRandHIPAA.com.)

I call BS.

If the “goal” is a “successful implementation”, then CMS failed miserably by not addressing the START of the quarterly attestation period for Stage 2, which is still required in 2014. CMS and the ONC need more time to successfully implement the measures, and they are bureaucratic agencies that don’t directly deal with patient medical care. Why wasn’t the additional time required to truly succeed at this monumental task extended to the healthcare provider organizations? Because the agencies want to save face, and avoid litigation from early adopters who may be already beginning their 2014 attestation period amidst heroic back-breaking efforts?

Here’s a reality check for what a large IDN might be going through in early January, in preparation for the start of the 2014 quarterly attestation period. Assume this particular IDN’s hospitals’ fiscal year runs October-September, so you MUST begin your attestation period on July 1. You have 6 months.

As of December 31, 2013, only 4 of the 8 EMRs in your environment completed their 2014 CEHRT certification.

Each of those 4 EMRs has a different schedule to implement the upgrade to the certified edition, with staggered delivery dates from March to July. The hospital EMR is not scheduled to receive its certified-edition upgrade until April. You pray that THIS implementation is the exception to your extensive experience with EMR vendor target timelines extending 6-8 weeks beyond initial dates.

The EMR upgrades do not include the Direct module configuration, and the vendor’s Direct module resources are not available until 6-9 weeks after the baseline upgrade implementation – if they have knowledgeable resources, at all. Your hospital EMR vendor can’t articulate the technical infrastructure required to implement and support its own Direct module. Several vendors indicate that the Direct module configuration will have to be negotiated with a third-party. Your clinicians don’t know what Direct is. Your IT staff doesn’t know how to register with a HISP. Your EMR vendor doesn’t support a central Direct address directory or a lookup function, so you contemplate typing classes for your HIM and clinical staff.

The number of active patient problems requiring manual SNOMED remediation exceeds 60,000 records in your hospital EMR. You form a clinical committee to address it, but the committee estimates it will take 6 months of review to complete. You’re contemplating de-activating all problems older than a certain date, which would whittle down the number and shorten the timeframe to complete – but would eliminate chronic conditions.
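The triage being contemplated can be sketched as a simple filter. The field names and the chronic-condition carve-out below are my assumptions, not a real EMR schema; the carve-out is one way to whittle down the queue without losing chronic conditions:

```python
from datetime import date

# Hypothetical triage of a problem list for manual SNOMED remediation:
# queue problems recorded on/after a cutoff date, but always keep
# chronic conditions regardless of age.
def remediation_queue(problems, cutoff):
    return [p for p in problems
            if p["chronic"] or p["recorded"] >= cutoff]

problems = [
    {"desc": "asthma",         "chronic": True,  "recorded": date(2005, 3, 1)},
    {"desc": "sprained ankle", "chronic": False, "recorded": date(2006, 7, 9)},
    {"desc": "sinusitis",      "chronic": False, "recorded": date(2013, 11, 2)},
]
queue = remediation_queue(problems, cutoff=date(2012, 1, 1))
```

With these made-up records, the old sprain drops out of the manual-review queue while the decade-old asthma problem survives the cutoff.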

There are still nagging questions regarding CMS interpretation of the measures, so you ask for clarification, and you wait. And wait. And wait. The answers impact the business rules required for attestation reporting, and you know you need any help you can get in whittling down the denominator values. Do deceased patients count in the view/download/transmit denominator? If records access is prohibited by state/federal law, does that encounter count in the view/download/transmit denominator?
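The stakes of those unanswered questions are easy to see in code. This sketch assumes both exclusions are permitted (the field names are hypothetical); flip either flag and the denominator, and therefore the attestation percentage, changes:

```python
# Hypothetical view/download/transmit (VDT) denominator calculation.
# Whether deceased patients or legally restricted records are excluded
# is exactly the kind of CMS clarification being waited on.
def vdt_denominator(encounters, exclude_deceased=True, exclude_legal_block=True):
    count = 0
    for e in encounters:
        if exclude_deceased and e["patient_deceased"]:
            continue
        if exclude_legal_block and e["access_prohibited_by_law"]:
            continue
        count += 1
    return count

encounters = [
    {"patient_deceased": False, "access_prohibited_by_law": False},
    {"patient_deceased": True,  "access_prohibited_by_law": False},
    {"patient_deceased": False, "access_prohibited_by_law": True},
]
denominator = vdt_denominator(encounters)
```

With these three sample encounters, the denominator is 1 if both exclusions apply and 3 if neither does – a threefold swing in the measure, pending an answer.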

Consultant costs skyrocket as you struggle to find qualified SME resources to blaze a trail for your internal staff. Their 60-to-90-day assessments inevitably end with recommendations for “proof of concept” and “pilot” approaches to each of the 2014 measures, which don’t take into account the reality of the EMR upgrade timelines and the looming attestation start date. Following their recommendations would delay your attestation start by 9-12 months. So, your internal staff trudges forward without expert leadership, and you throw the latest PowerPoint deck from “Health IT Professionals-R-Us” on the pile.

Who needs testing, when you can go live with unproven technology the day it’s available in order to meet an arbitrary deadline? Healthcare.gov did it – look what a success that turned out to be!

But wait, this is real clinical data, generated by real-world clinical workflows, being used to treat real patients, by real healthcare providers. By refusing to address the start of the 2014 attestation period, CMS and the ONC are effectively using these patients and providers as lab rats.

I did not give permission to be part of this experiment.

January 13, 2014 | Written By

Mandi Bishop is a healthcare IT consultant and a hardcore data geek with a Master's in English and a passion for big data analytics, who fell in love with her PCjr at 9 when she learned to program in BASIC. Individual accountability zealot, patient engagement advocate, innovation lover and ceaseless dreamer. Relentless in pursuit of answers to the question: "How do we GET there from here?" More byte-sized commentary on Twitter: @MandiBPro.

Epic Builds Lab Installations At Oregon University

Epic Systems has agreed to build two lab installations of its EpicCare EMR at Oregon Health & Science University: one to be used for medical informatics education, and the other giving the school access to its source code on the research side, reports Healthcare IT News.

Though the school’s OHSU Healthcare system already runs EpicCare for its hospitals and clinics, students and teachers have had to rely on a basic installation of the open-source VistA system for OHSU’s EMR laboratory course.

According to Healthcare IT News, this is Epic’s first partnership with an academic informatics program, and potentially an important turning point for the company, which has conducted research and development almost exclusively on its Verona, Wis. campus. (It does release its source code to commercial customers.) And the agreement didn’t come easily; in fact, the school spent several years persuading Epic to participate before it agreed to commit to an academic partnership.

In a press statement, OHSU notes that the EpicCare research environment should allow students to delve into usability, data analytics, simulation, interoperability, patient safety and more. The school also expects to prepare prototypes of solutions to real-world healthcare problems.

Students in both OHSU’s on-campus and distance learning programs will pursue coursework based on the Epic EMR, with classes using the live Epic environment beginning March 2014. The work students will undertake includes learning to configure screens, implementing clinical decision support, and generating reports.

While this isn’t quite the same thing, this agreement brings to mind a blog item by John in which he describes how prospective programmer hires at Elation are required to shadow a physician as part of their hiring process. In both cases, the people who will be working with the software are actually getting an idea of how the product is used in the field before they’re out serving commercial clients. Sadly, that’s still rare.

I think this will ultimately be a win for both Epic and OHSU. Epic will get a fresh set of insights into its product, and students will be prepared for a real world in which Epic plays a major part.

November 27, 2013 | Written By

Katherine Rourke is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Can Cloud Computing Help Solve Healthcare’s Looming IT Crisis?

The title of this post comes from a whitepaper called “How Cloud Computing Can Help Solve Healthcare’s Looming IT Crisis” that was produced by Intel together with CareCloud and Terremark (a Verizon company). My initial reaction when reading this whitepaper was “what looming healthcare IT crisis are they talking about?”

The whitepaper makes the general case about the challenges of so much regulation, security, and privacy issues related to healthcare IT. I guess that’s the crisis that they talk about. Certainly I agree that many a healthcare CIO is overwhelmed by the rate of change that’s happened in healthcare IT to date. Is it a crisis? Maybe in some organizations.

However, more core to what they discuss in the paper is whether cloud computing can provide benefits to healthcare that many organizations aren’t experiencing today. The whitepaper cites a CDW study finding that just 30 percent of medical practices have transitioned to cloud computing services. No doubt I’ve seen the reluctance of many organizations to go with cloud computing. Although, as one hospital CIO told me, “we have to do it.”

The whitepaper makes the case that cloud computing can help with:
  • Security, compliance, and privacy
  • Cost efficiency and improved focus
  • Flexibility and scalability

I’d love to hear your thoughts on the whitepaper and its comments on the value of cloud computing. Should healthcare be shifting everything to cloud computing? Is there a case to be made for in house over cloud computing? Will some sort of hybrid approach win out?

November 21, 2013 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 5000 articles with John having written over 2000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 9.3 million times. John also recently launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.

China’s EMR Market

Written by:

Last week I wrote about what’s not happening in China: American firms getting a slice of the EMR market.

This time I thought it’d be interesting to look at what is happening with health IT in the world’s most populous country.

As I mentioned, that’s often easier said than done. The healthcare system has peculiarities, and the government doesn’t necessarily say what it’s planning. Some research firms have shied away from reporting on China’s EMR scene altogether.

But a case study released over the summer provides some fascinating market intelligence. The work by Arthur Daemmrich, associate professor at the University of Kansas School of Medicine, follows Shanghai Kingstar Winning Software Co. Ltd. as its founder seeks to increase its growth rate.

As the first quarter of 2013 drew to a close, Zhou Wei was weighing three options: continuing to grow organically, merging with another company, and expanding into other geographies, including South Asia or even the United States.

Points worth noting:

  • Winning, with 1,000 employees, competed for hospital IT projects with five other large firms. A few hundred smaller companies provided more specialized offerings.
  • The government owned more than 90 percent of the country’s hospitals.
  • Winning had achieved 50 percent revenue growth in 2012 and expected the same in 2013, but Zhou was not satisfied. He felt that even more rapid growth was needed.
  • In 2008, 1 percent of China’s hospitals were using EMRs. By 2012, about 32 percent of higher-ranked hospitals — tier-II and tier-III institutions — had EMRs.
  • Medical record-keeping in China came nearly to a halt during World War II and the country’s civil war. Many leftover records were destroyed during the Cultural Revolution of the 1960s and 1970s. The country then began rebuilding its records infrastructure. Daemmrich wrote, “Outpatient visits and prescriptions were recorded on small booklets that patients kept and brought with them to the hospital or other specialized clinic. Most hospitals issued their own booklets, so patients could end up with several different sets of medical records at home.”
  • Zhou’s firm undertook a project at an 850-bed traditional Chinese medicine (TCM) hospital. At such institutions, treatments such as acupuncture and therapeutic massage are common. The company’s R&D director, Ma Wei Min, explained, “The interfaces of western medicine and TCM EMR systems are alike, because the patient flow paths at both kinds of hospitals are almost the same. But going back to the software writing stage, TCM EMRs required a different logic and very different terminology.”

It’s easy to get immersed in the health IT considerations of our own country and forget that other regions are undertaking similar efforts. In China, the goals of the EMR push are largely the same as they are in the United States, but it’s interesting how much local flavor comes into play. The fact that Winning’s founder was seeing 50 percent revenue growth but still expected more is remarkable and speaks to the country’s pace of economic development. And the background on China’s record-keeping shows that the country’s task is not just to digitize processes that have long been in place, but to define exactly what a medical record is and how it should work.

October 30, 2013 | Written By

James Ritchie is a freelance writer with a focus on health care. His experience includes eight years as a staff writer with the Cincinnati Business Courier, part of the American City Business Journals network. Twitter @HCwriterJames.

Scanning Is a Feature of Healthcare IT and Will Be Forever

Written by:

When I first started writing about EMR and EHR, I regularly discussed the idea of a paperless office. What I didn’t realize at the time, and what has become incredibly clear to me now, is that paper will play a part in every office forever (which I translate to mean my lifetime). While paper will still come into an office, that doesn’t mean you can’t have a paperless office when it comes to the storage and retrieval of those files. The simple answer to that incoming paper is the scanner.

A great example of this point was discussed in this post by The Nerdy Nurse called “Network Scanning Makes Electronic Medical Records Work.” She provides an interesting discussion about the various scanning challenges from home health nurses to a network scanner used by multiple nurses in a hospital setting.

The good people at HITECH Answers also wrote about “Scanning and Your EHR Implementation.” Just yesterday I got an email from someone talking about how they should approach their old paper charts. It’s an important discussion that we’re still going to have for a while to come. I’m still intrigued by the Thinning Paper Charts approach to scanning, but if I could afford it I’d absolutely outsource the scanning to an outside company. They do amazing work really fast. They even offer services like clinical data abstraction so you can really enhance the value of your scanned charts.

However, even if you outsource your old paper charts, you’ll still need a heavy-duty scanner for the ongoing paper that enters your office. For example, I have the Canon DR-C125 sitting next to my desk, and it’s a scanner that can handle the scanning load of healthcare. You’ll want a high-speed scanner like this one. Don’t try to lean on an all-in-one scanner-printer-copier. It seems like an inexpensive alternative, but the quality just isn’t the same, and after a few months of heavy scanning you’ll burn it out and have to buy a new one. Those machines are made for one-off scanning, not the volume of scanning you have to do in healthcare.

David Harlow also covers an interesting HIPAA angle when it comes to scanners. In many cases, scanners don’t store any PHI on the device. However, in some cases they do, so you’ll want to be aware of this and make sure any PHI stored on the device is wiped before you dispose of it.

Certainly many organizations are overwhelmed by meaningful use, ICD-10, HIPAA Omnibus, and changing reimbursement. However, things like buying the right scanner make all the difference when it comes to the long term happiness of your users.

Sponsored by Canon U.S.A., Inc.  Canon’s extensive scanner product line enables businesses worldwide to capture, store and distribute information.

October 11, 2013 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 15 blogs containing almost 5000 articles with John having written over 2000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 9.3 million times. John also recently launched two new companies: InfluentialNetworks.com and Physia.com, and is an advisor to docBeat. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and Google Plus. Healthcare Scene can be found on Google+ as well.