
A Consulting Firm Attempts a Transition to Open Source Health Software (Part 2 of 2)

Posted on September 7, 2016 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The previous section of this article covered the history of HLN’s open source offerings. How can HLN build a sustainable business on this forward-thinking practice?

The obvious place to turn for funding is the Centers for Disease Control, which lies behind many of the contracts signed by public health agencies. One way or another, a public health agency has to step up and pay for development. This practice is called custom-developed code in the open source policy memorandum of the federal Office of Management and Budget (p. 14 of the PDF).

The free rider problem is acute in health care. In particular, the problems faced by a now-defunct organization, Open Health Tools, were covered in another article of mine. I examined why the potential users of the software felt little inclination to pay for its development.

The best hope for sustaining HLN as an open source vendor is the customization model: when an agency needs a new feature or a customized clinical decision support rule, it contracts with HLN to develop it. Naturally, the agency could contract with anyone it wants to upgrade open source software, but HLN would be the first place to look because it is most familiar with the software it originally built.

Other popular models include offering support as a paid service, and building proprietary tools on top of the basic open source version (“open core”). The temptation to skim off the cream of the product and profit by it is so compelling that one of the most vocal stalwarts of the open source process, MariaDB (based on the popular MySQL database) recently broke radically from its tradition and announced a proprietary license for its primary distinguishing extension.

Support has never scaled as a business model; it’s very labor-intensive. Furthermore, it might have made sense to offer support decades ago when each piece of software posed unique integration problems. But if you create good, modern interfaces–as Arzt claims to do–you use standards that are familiar and require little guidance.

The “open core” model has also proven historically to be a weak business model. Those that use it may stay afloat, but they don’t grow the way popular open source software such as Linux or Python do. The usual explanation for this is that users don’t find the open part of the software useful enough on its own, and don’t want to contribute to it because they feel they are just helping a company build its proprietary business.

Wonks to the Rescue
It may be that Arzt–and others who want to emulate his model in health care–have to foster a policy change in governments. This is certainly starting to happen, as seen in a series of policy announcements by the US government regarding open source software. But this is a long road, and the direction could easily be reversed or allowed to falter. We have already seen false starts with open source software in various Latin American governments–the decade of the 2000s saw many flowery promises, but hardly any follow-through.

I don’t like to be cynical, but hope may lie in the crushing failures of proprietary vendors to produce usable and accurate software for health care settings. The EHR Incentive Programs under Meaningful Use poured about 28 billion dollars into moving clinicians onto electronic records, almost all of it spent on proprietary products (of course, there were also administration costs for things such as Regional Extension Centers), with little to show in quality improvements or data exchange. The government’s open source initiatives, CONNECT and Direct, got lost in the muddle of non-functional proprietary EHRs.

So the health care industry will have to try something radically new, and the institutions willing to innovate have their fingers on the pulse of cutting-edge trends. This includes open source software. HLN may be able to ride a coming wave.

A Consulting Firm Attempts a Transition to Open Source Health Software (Part 1 of 2)

Posted on September 6, 2016 | Written By Andy Oram

Open source is increasingly understood to be the future of software, because communities working together on shared needs can produce code that is at least as good as proprietary products, while representing user interests more effectively and interoperating without friction. But running an open source project is a complex task, and keeping a business going on it is absolutely perilous. In his 2001 book The Cathedral & the Bazaar, Eric S. Raymond listed half a dozen ways for businesses to profit on open source software, but today only one or two are visible in the field (and they differ from his list).

An Enduring Commitment
Noam H. Arzt, president and founder of HLN Consulting, is trying to make the leap. After getting his PhD from the University of Pennsylvania and working there in various computer-related positions for 20 years, he got the health care bug–like many readers of this article–and decided to devote his career to software for public health. He first encountered the field while working on a public health project among the famous “hot spotters” of depressed Camden, New Jersey, and was inspired by what people there accomplished with minimal resources. Many of his company’s later projects come from the Department of Health and Mental Hygiene in New York City.

Founded in 1997, HLN Consulting has released code under an open source license for some time. It makes sense, because its clients have no reason to compete with anybody, because IT plays a crucial role in public health, and because the needs of different public health agencies overlap a great deal. Furthermore, they’re all strapped for funds. So Arzt tells me that the agency leadership is usually enthusiastic about making the software open source. It just may take a few months to persuade the agency’s lawyers, who are clueless about open source licenses, to put one in the contract.

A few agencies outside HLN’s client base have picked up the software–particularly as the developers adopt modern practices such as more modular systems and a service-oriented architecture built on open, published APIs–though none has yet contributed anything back.

HLN did, however, rack up a recent win with the Immunization Calculation Engine (ICE), software that calculates which vaccinations patients need and alerts clinicians accordingly. The software is normally used by immunization registries that serve states or large municipalities. But eClinicalWorks has also incorporated ICE into its EHR. And the Veterans Health Administration (VHA) chose ICE this year to integrate with its renowned VistA health record. HLN has invested a fair amount of its own time into preparing ICE for integration. Arzt estimates that since HLN developed ICE for a client, the company has invested at least five person-years in upgrading the software, and has received no money directly for doing so. HLN hopes to generate revenue from assisting organizations in configuring and using ICE and its clinical decision support rules, and a new support contract with VHA is the first big step.
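To make the integration idea concrete, here is a minimal, purely hypothetical sketch in Python of how an EHR might query an immunization forecasting service over a published web API. The endpoint, field names, and response shape are invented for illustration; this is not the actual ICE interface.

    import json
    import urllib.request

    # Hypothetical endpoint and payload -- not the real ICE API.
    SERVICE_URL = "https://example.org/immunization/forecast"

    patient = {
        "birthDate": "2014-03-02",
        "gender": "F",
        # Immunization history the engine evaluates against its rules.
        "immunizations": [
            {"cvxCode": "08", "date": "2014-03-03"},
            {"cvxCode": "110", "date": "2014-05-01"},
        ],
    }

    request = urllib.request.Request(
        SERVICE_URL,
        data=json.dumps(patient).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    # The service applies its clinical decision support rules and returns
    # which vaccinations are due, overdue, or complete for this patient.
    with urllib.request.urlopen(request) as response:
        forecast = json.load(response)

    for recommendation in forecast.get("recommendations", []):
        print(recommendation["vaccineGroup"], recommendation["status"])

A service-oriented design along these lines is what allows very different systems, from eClinicalWorks to VistA, to plug the same engine into their own workflows.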

Can You Get There From Here?
Arzt is trying now to build on the success of ICE and make a transition from a consulting firm to an open source software firm. A consulting firm typically creates products for a single customer, and has to “fight and claw for every contract,” in Arzt’s words. Maintaining a steady stream of work in such firms is always challenging. In contrast, successful open source software is widely used, and the work put into the software by each contributor is repaid by all the contributions made by others. There is no doubt that HLN is developing products with broad applicability. It all makes economic sense–except that somebody actually has to foot the bill. We’ll look at possibilities in the next section of this article.

Are State Health Agencies Ready for Meaningful Use Stage 2?

Posted on September 23, 2013 | Written By

James Ritchie is a freelance writer with a focus on health care. His experience includes eight years as a staff writer with the Cincinnati Business Courier, part of the American City Business Journals network. Twitter @HCwriterJames.

As part of its public health objectives, Meaningful Use 2 requires doctors and hospitals to report sizable amounts of information.

The idea is that when significant patterns are forming — an outbreak of a certain disease, for example, or a peculiar cluster of symptoms — they’ll be apparent right away.

But someone has to be in position to receive the data.

The responsibility falls to state and local public health departments. Agencies around the country should, theoretically, be preparing for the immunization records, laboratory results and other information they’ll soon be getting.

Just how many will be ready, though, remains to be seen. Many cash-strapped departments lack the IT infrastructure for what’s being asked of them — and the money allocated by the government hasn’t amounted to much, according to a 2012 American Journal of Public Health article by Drs. Leslie Lenert and David Sundwall.

In fact, the authors wrote, the federal effort “has created unfunded mandates that worsen financial strains” on health departments.

There’s a caveat, though: The mandates aren’t really mandates.

“Nothing compels them to do it” except the desire to do the right thing, said Frieda du Toit, owner of Lakeside, Calif.-based Advanced Business Software. “Some directors are interested, some are not. The lack of money is the main thing.”

In our recent interview, du Toit, whose company specializes in information management solutions for health departments, added: “One customer asked me: ‘Am I going to be punished in any way, form or fashion if I don’t support the efforts of my hospitals and care providers?’”

Her firm’s Web-based Public Health Information Management System serves cities and counties throughout the United States, including in California, Texas and Connecticut.

The federal government’s goal is for public health agencies to be involved in four administrative tasks to support MU2, according to the Stage 2 Meaningful Use Public Health Reporting Task Force. The task force is a collaboration between the U.S. Centers for Disease Control and Prevention, nonprofit public health associations and public health practitioners.

The first step is to take place before the start of MU2 — that’s Oct. 1, 2013, for hospitals and Jan. 1, 2014, for individual providers.

The tasks:

  • Declaration of readiness. Public health agencies tell the Centers for Medicare & Medicaid Services what public health initiatives they can support.
  • Registration of intent. Hospitals and providers notify public health agencies in writing what objectives they seek to meet.
  • On-boarding. Medical providers work with health departments to achieve ongoing Meaningful Use data submission.
  • Acknowledgement. Public health agencies inform providers that reportable data has been received.

For doctors and other eligible professionals, MU2 calls for ongoing submission of electronic data for immunizations. Hospitals are to submit not only immunizations but also reportable laboratory results and syndromic surveillance data.

Health care providers whose local public health departments lack the resources to support MU2 are exempt from the reporting requirements.

In Meaningful Use Stage 3, which health IT journalist Neil Versel wrote is likely to begin in 2017, “electronic health records systems with new capabilities, such as the ability to work with public health alerting systems and on-screen ‘buttons’ for submitting case reports to public health, are envisioned,” according to Lenert and Sundwall.

The authors noted: “Public health departments will be required not just to upgrade their systems once, but also to keep up with evolving changes in the clinical care system” prompted by the regulations.

They proposed cloud computing as a better way. Shared systems and remote hosting, Lenert and Sundwall suggested, could get the work done efficiently and affordably, albeit at a cost to individual jurisdictions’ autonomy.

As EMR adoption grows, it would be a shame not to take advantage of the opportunities for public health. The entire health IT effort being pushed by the federal government is, after all, geared toward improving the health of populations.

Without money for the job, though, public health agencies’ ability to support Meaningful Use will likely always be limited. It looks like a good time to think about committing significant funds, embracing cloud-based solutions or both.

De-identified Healthcare Data – Is It Really Unidentifiable?

Posted on September 30, 2011 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

There’s always been some really interesting discussion about EHR vendors selling the data from their EHR software. It turns out that many EHR vendors and other healthcare entities are selling de-identified healthcare data now, but I haven’t heard much public outcry about it. Is it because the public just doesn’t realize it’s happening, or because the public is OK with de-identified data being sold? I’ve heard many argue that they’re happy to have their de-identified data sold if it improves public health or if it gives them a better service at a cheaper cost.

However, a study coming out of Canada has some interesting results when it comes to uniquely identifying people from de-identified data. The only data they used was date of birth, gender, and full postal code data. “When the full date of birth is used together with the full postal code, then approximately 97% of the population are unique with only one year of data.”

One thing that concerns me a little about this study is that postal code is a pretty unique identifier. Take out postal code and you’ll find very different results. Why? Because a lot of people share the same birthday and gender. However, the article does offer a reasonable suggestion based on the results of the study:

“Most people tend to think twice before reporting their year of birth [to protect their privacy] but this report forces us all to think about the combination or the totality of data we share,” said Dr. El Emam. “It calls out the urgency for more precise and quantitative approaches to measure the different ways in which individuals can be re-identified in databases – and for the general population to think about all of the pieces of personal information which in combination can erode their anonymity.”

To me, this is the key point. It’s not about creating fear and uncertainty that has no foundation, but about considering more fully how multiple pieces of personal information in de-identified patient data combine to affect patient privacy.
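To see how quickly combinations of seemingly harmless fields erode anonymity, here is a small Python sketch that counts how many records in a dataset are uniquely identified by each combination of quasi-identifiers. The field names and sample records are made up for illustration; a calculation along these lines, run over a real population, is what produces figures like the 97% quoted above.

    from itertools import combinations
    from collections import Counter

    # Made-up "de-identified" records: no names, just common demographic fields.
    records = [
        {"birth_date": "1980-04-12", "gender": "F", "postal_code": "K1A 0B1"},
        {"birth_date": "1980-04-12", "gender": "F", "postal_code": "M5V 2T6"},
        {"birth_date": "1975-09-30", "gender": "M", "postal_code": "K1A 0B1"},
        {"birth_date": "1975-09-30", "gender": "M", "postal_code": "K1A 0B1"},
    ]

    fields = ["birth_date", "gender", "postal_code"]

    # For each combination of fields, count how many records are unique --
    # i.e., how many people could be singled out by those fields alone.
    for size in range(1, len(fields) + 1):
        for combo in combinations(fields, size):
            keys = Counter(tuple(r[f] for f in combo) for r in records)
            unique = sum(1 for count in keys.values() if count == 1)
            print(f"{', '.join(combo)}: {unique / len(records):.0%} unique")

Even in this tiny sample, no one is unique by birth date or gender alone, but adding postal code immediately singles people out, which is exactly the effect the Canadian study measured.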