
Conference on Drug Pricing Injects New Statistics Into Debate, Few New Insights (Part 2 of 2)

Posted on November 9, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The first part of this article described the upward pressures on costs and some of the philosophical debates over remedies. This section continues the discussion with several different angles on costs.

Universal access and innovation

It’s easy to call health care a human right. But consider an analogy: housing could also be considered a human right, yet no one has the right to a twenty-room mansion. Modern drug and genetic research are creating the equivalents of many twenty-room mansions, and taking up residence means the difference between life and death for someone, or perhaps between a long productive life and one of pain and deformity.

Universal access, often through a single-payer system, is in widespread use in every developed country except the United States. Both universal access and single payer are credited with keeping down the costs of health care, including drugs. It makes sense to link single-payer with lower drug costs, because of the basic rules of economics: size gives a buyer clout, as we can see in the way Walmart lords it over its suppliers (documented in a 2006 book, The Wal-Mart Effect, by Charles Fishman). At the conference, Sean Dickson from the Pew Charitable Trusts gave what he called an “economics 101 course” on health care and how the industry diverges from an ideal market. (He did not come out in favor of single-payer, though.)

How much fat can be cut from pharma? My guess is a lot. As we saw in the previous section, profits from pharmaceuticals tower above profits in most industries. But we don’t have to stop by simply shaving payments to shareholders, or even management compensation. I know from attending extravagant health care conferences that there’s a lot of free cash floating around the health care industry in general, although it’s unevenly distributed. (Many hospitals, nursing homes, and other institutions are struggling to maintain adequate staffing.) In industries possessing such easy money, it does trickle down somewhat. Gaudino pointed out ruefully that health care is one of the few fields left that can give ordinary people a middle-class income, something we don’t want to lose even as employment continues to rise in that space. But easy money also leads to bloat, and this is almost certainly true throughout health care, including pharma.

Even so, projections of the cost of universal access are dizzyingly high, placing pressure on the historic universal access model in Massachusetts and forcing Vermont to give up single-payer. The pressures that could be applied to the health care field by the US government would certainly outweigh the negligible impact that Vermont–with its population of a mere 600,000–could exert. But it’s unlikely that the easy wins falling out of single-payer (squeezing drug companies, eliminating the administrative overhead of handling health insurance) could make up for the staggering costs of adding whole new swaths of a high-need, difficult population to government rolls.

What we need to lower health costs is an overhaul of the way health care systems conceive of patients, following them from conception to the grave and revamping care to treat chronic conditions. T.R. Reid, in his book The Healing of America, says that universal access must come first and that all the rest will gradually follow. I would like to have at least a strong concept throughout the health care system of what the new paradigm will be, before we adopt single-payer. And in theory, adopting that paradigm will fix our cost problems without the wrenching and contentious move to single-payer.

What non-profits can teach us

So how do we recompense manufacturers while getting drugs to low-income people who need them? Some interesting insights did turn up here at the conference, through a panel titled From Development to Delivery Globally. All three speakers operate outside the normal market. One is a representative of Gilead Sciences (mentioned earlier), whereas the other two represent leading non-profits in international health care, Partners in Health and the Bill & Melinda Gates Foundation. Nevertheless, their successes teach us something about how to bend the cost curve in traditional markets.

Flood said that Gilead Sciences made an early commitment to get its AIDS drug to all who needed it, without regard to profit. At first it manufactured the drug and distributed it in sub-Saharan Africa at cost. That failed partly because the cost was still out of reach for most patients, but also because the distribution pipeline was inadequate: logistics and government support were lacking.

So Gilead took a new tack: it licensed the drug to Indian manufacturers who not only could produce it at a very low cost (while maintaining quality), but understood the sub-Saharan areas and had infrastructure there for distributing the drug. This proved highly successful. I’m betting we’ll find more drugs manufactured in India over time.

Hannah Kettler of the Gates Foundation described how they set 50 cents as an affordable price for a meningitis vaccination, then went on to obtain that price in a sustainable manner. The key was to hook up potential buyers and manufacturers in advance. The buyers guaranteed a certain number of bulk purchases if the manufacturers could achieve the desired price. And armed with a huge guaranteed market, the manufacturers scaled up production so as to reduce costs and meet the price goal.
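The economics behind that advance market commitment come down to amortizing fixed costs over a guaranteed volume. A toy calculation can make the mechanism concrete; every number here is a hypothetical illustration, not a figure from the conference:

```python
# Toy model of an advance market commitment (AMC): a guaranteed
# purchase volume lets a manufacturer spread fixed costs over many
# doses, bringing the unit price down to a target. All numbers are
# hypothetical illustrations.

def unit_price(fixed_costs, variable_cost_per_dose, volume, margin=0.10):
    """Break-even unit price plus a modest margin."""
    return (fixed_costs / volume + variable_cost_per_dose) * (1 + margin)

fixed_costs = 40_000_000   # plant, regulatory approval, etc.
variable = 0.25            # marginal cost per dose

# Without a guarantee, a cautious manufacturer plans for a small run:
small_run = unit_price(fixed_costs, variable, volume=10_000_000)

# With buyers guaranteeing bulk purchases up front, it can scale:
large_run = unit_price(fixed_costs, variable, volume=250_000_000)

print(f"small run: ${small_run:.2f}/dose, large run: ${large_run:.2f}/dose")
```

In this sketch the guaranteed volume drops the price from several dollars per dose to under 50 cents, without the manufacturer taking a loss – which is the whole point of lining up buyers before production begins.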

The Gates model looks valuable for a number of drugs: guarantee an advance market and start out manufacturing at a large scale to reduce costs. This will not help with orphan diseases, of course.

More generally, in my opinion, developed countries have to define their incentive to provide aid of any kind–medicine, education, microloans, or whatever. Is it enough of an incentive to empower women and keep population growth under control? To avoid social conflicts that turn into civil wars? To avoid mass emigration and refugee crises? What are solutions worth to us?

The contributions of artificial intelligence

Aside from brief mentions of advanced analytics by Gaudino and Taylor, the promise of computer technology came up mainly in the final panel of the conference, where Petrie-Flom research fellow Sara Gerke offered some examples of massive cost savings that AI has created at various points in the drug development chain. These tend to be isolated success stories, but illustrate a trend that could relieve pressure on prices.

I have reported on the use of AI in drug development in other articles over the years. This section consolidates what I’ve seen: although AI can potentially help at any point in an industry’s business, it seems particularly fertile in two parts of drug development.

The first area is the initial discovery of compounds. Traditional research can be supercharged by analyses of patient genes, simulations of molecule behaviors, and other ways of extracting needles from haystacks.

The second area is the conduct of the clinical trial. Here, techniques being tried by drug companies are variants of what clinicians are doing to engage and monitor patients. For instance, clinical subjects can wear devices with minimal disruption to their lives, and report vital signs back to researchers on an ongoing basis instead of having to come into the researcher’s office. AI can also find suitable subjects, increasing the potential pool. Analytics may reveal early whether a clinical trial is not working, allowing the company to save money by shutting it down early, and avoiding harm to subjects.
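The "shut it down early" decision usually rests on an interim futility analysis. As a minimal sketch of the kind of check such analytics might run – with hypothetical data and a hypothetical stopping threshold, not any company's actual rule – consider a two-arm trial compared at the halfway point:

```python
# Sketch of an interim futility check on a two-arm trial: if the
# treatment arm shows no meaningful advantage over control at the
# interim look, flag the trial for possible early termination.
# The enrollment numbers and z-threshold are hypothetical.
import math

def futility_check(resp_treat, n_treat, resp_ctrl, n_ctrl, z_min=0.5):
    """Return True (futile) if the interim z-statistic for the
    treatment advantage falls below z_min."""
    p1, p2 = resp_treat / n_treat, resp_ctrl / n_ctrl
    pooled = (resp_treat + resp_ctrl) / (n_treat + n_ctrl)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_treat + 1 / n_ctrl))
    return (p1 - p2) / se < z_min

# Halfway through enrollment: 52/200 responders on treatment vs.
# 50/200 on control -- essentially no separation between the arms.
print(futility_check(52, 200, 50, 200))
```

Real trials use pre-registered group-sequential boundaries rather than a single ad hoc cutoff, but the logic is the same: a trial that cannot plausibly succeed is stopped before it consumes more money and exposes more subjects.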

Of course, we all look forward to some marvelous breakthrough–the penicillin of the 21st century–that will suddenly open up miracle treatments at low cost for a myriad of illnesses. Current research is pushing this medical eschaton further and further off into the unforeseeable future. We are learning that the genome and human molecules interact in ways that are much more complex than we thought, that a lot is dependent on the larger biome, and that diseases are also cleverer than we thought and able to work around many of our attacks.

Analytics will certainly accelerate medical discoveries. In doing so, it could drastically reduce the costs of drug discovery, and therefore reduce risk and ultimately prices. But stunning new drugs for rare diseases could also vastly increase prices.

Baby steps

I’ll end with a few suggestions made by conference participants to create a more competitive market or reduce prices. Outside of explicit price setting (on which participants were deeply split), the proposals looked like small contributions to a situation that requires something big and bold.

  • Price transparency came up several times.
  • Grogan would like Congress to re-examine reimbursement for Medicare Part D (especially the donut hole and catastrophic coverage) to give both PBMs and vendors incentives to lower costs.
  • Gaudino said that Australia does a much better job than the US of collecting data on the outcomes of using drugs, which they can use to determine whether to approve the drugs. The U.S. payment system is more privatized and fragmented, making it impossible to collect the necessary data.
  • Caljouw praises the efforts of the Massachusetts Health Policy Commission, which has no power to set costs but meets with providers and asks them to reconsider the factors that lead to jacked-up prices.
  • Caljouw also mentioned laws requiring price transparency from PBMs.
  • Several participants suggested reversing the decision that allowed companies to air advertisements directly to consumers. (I’m afraid that if all the misleading drug ads disappeared from the air, a bunch of television networks would go out of business.)
  • Taylor cited pressure by Wall Street on drug companies to maximize prices without regard for the social impacts–an intense kind of pressure felt by no other industry except fossil fuels–and called for the extension of socially responsible investment to drug companies.

I’d like to suggest, in conclusion, that we may be focusing too much on manufacturers, who are taking enormous risks to cure difficult diseases. A University of Southern California study found that 41% of the price is absorbed by intermediaries: wholesalers, pharmacies, PBMs, and insurers. Whether through single-payer or through other changes to the health care system, we can do a lot without constricting innovators.

Conference on Drug Pricing Injects New Statistics Into Debate, Few New Insights (Part 1 of 2)

Posted on November 8, 2018 | Written By Andy Oram

The price of medications has become a leading social issue, distorting economies around the world and providing easy talking points to politicians of all parties (not that they know how to solve the problem). Last week I attended a conference on the topic at the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School.

On one level, the increasing role that drugs play in health care is salutary. Wouldn’t you rather swallow a pill than go in for surgery, with the attendant risks of anesthesia, postoperative pain opiates, and exposure to the increasingly scary bacteria that lurk in hospitals? Wouldn’t you rather put up with a few (usually) minor side effects of medication than the protracted recovery and discomfort of invasive operations? And even when priced in the tens of thousands, drugs are usually cheaper than the therapies they replace.

But drug costs are also deeply disrupting society. They are more and more dominant in the health care costs that take up nearly a fifth of the total output of the U.S., and the outsized demands that medications put on both private and public pocketbooks lead to drug pricing being a rare bipartisan issue.

Michael Caljouw from Blue Cross Blue Shield of Massachusetts pointed out at the conference that in Massachusetts, health care has skyrocketed from 20% to 45% of the entire state budget in 20 years, and similar trends are found in other states. He says that an expensive new drug can “blow through” budgets set a year in advance. Bach cited statistics showing the prices for cancer drugs are rising exponentially, while the drugs get only slightly more effective over time.

Drug costs also eat into the limited savings of the elderly, dragging many into bankruptcy or poverty. As reported at the conference by Peter Bach of the Memorial Sloan Kettering Cancer Center, high costs drive away many patients who would benefit from the medications, thus leading to worse health care conditions down the line.

Similar problems can be seen internationally as well.

Petrie-Flom drew together a stellar roster of speakers and panelists for its one-day conference. However, when one shakes out all the statistics and recommendations, the experts turn out to lack answers. Their suggestions look like tinkering around the edges, just as the federal government did over the past year with new rules such as citing prices in drug ads and tweaking the Medicare Part D reimbursement formulas. Thus, I will not tediously cover all the discussions at the conference. I will instead raise some key issues while tapping into these discussions for fodder.

The loudest statement at the conference was the silence of the pharma industry. Representatives of everyone you could imagine with skin in this game appeared on the podium–insurers, clinicians, pharmacy benefit managers, the finance industry, regulators, patent activists, think tanks, and of course lawyers–with one glaring exception: drug manufacturers. I’m sure these companies were invited. But the only biopharmaceutical firm to show up was Gilead Sciences, and the talk given by Amy Flood, senior vice president of public affairs, was not about normal drug development but about the company’s commendable efforts to disseminate an HIV drug through sub-Saharan Africa. Given the intense political, social, and geographic contention over AIDS, her inspiring story had little in the way of models and lessons to offer mainstream drug development. I will cover it later in the section “What non-profits can teach us.”

Failure by the vast bulk of the pharma industry to take up the sterling opportunity represented by this conference to present their point of view, to me, comes across as an admission of guilt. Why can’t they face questions from an educated public?

The oncoming sucker punch

A couple of days before the conference, Stat published a heart-warming human interest story about a six-year-old being treated successfully for a debilitating rare condition, Batten disease. Rather than giving in to genetic fate, the parents pulled together funding and doctors from around the country, pushed the experimental treatment through an extremely fast-track FDA approval, and saw positive results within a year.

The tears tend to dry from one’s eyes–or to flow for different reasons–when one reads the means used to achieve this miracle. The child’s mother is a marketing professional who raised nearly three million dollars through crowdfunding. An article in the November/December issue of MIT Tech Review describes six other families who raised money for personalized genetic treatments. Another article in the same issue–which is devoted to big data and genetic research in medicine–discusses personalized vaccines against cancers, while a third lays out the expenses of in vitro genetic testing. This is not a course of action open to poor, marginalized, uneducated people. Nor is such money likely to turn up for every orphan disease suffered somewhere in the world.

I hope that this six-year-old recovers. And I hope the three-million-dollar research produces advances in gene science that redound to the benefit of other sufferers. But we must all consider how much society can spend on the way to an envisioned utopia where cures are available to all for previously untreatable conditions. As conference speakers pointed out, genetic treatments assume an “N of 1” where each patient gets a unique regimen. This doesn’t scale at all, and certainly doesn’t fit the hoary old pharmaceutical paradigm of giving a monopoly over a treatment for a decade or so in exchange for low-cost generic imitations for all eternity afterward.

Yet government needs to keep funding biotech research, and creating a positive regulatory environment in which venture capitalists and other investors will fund the research. Joe Grogan of the Office of Management and Budget, keynoting at the conference, claimed that Germany used to have the pre-eminent biotech industry and let it shrivel up through poor policies. In the same way, biotech could leave the United States for some other country that proves welcoming, probably China.

Dueling models

Some panelists enthusiastically promoted what they openly and officially called Willingness To Pay (WTP) or “what the market will bear” pricing, but which I call “stick it to ’em” pricing. Others called for the price controls that are found in almost every developed country outside the U.S. Various schemes being promoted under the umbrella of “value-based pricing” were generally rejected, probably because they would allow the companies to inflate their prices. However, Jami Taylor of Stanton Park Capital suggested that modern data collection and analytics could support micropricing, matching payment to the outcome for each patient.

Interestingly, nobody believed that drug prices should reflect the costs of producing them. But everybody understood that drug producers must be adequately reimbursed. That is why people from many different perspectives came out in opposition to “charity” and “compassionate” discounts or rebates offered by many pharma companies, sometimes reaching 10% of their total expenditures. In a typical sequence of events, a company enjoying a breakthrough for a serious condition announces some enormous price in the tens or hundreds of thousands of dollars. After public outcry (or to ward off such outcry) they start awarding deep discounts or rebates.

Why are discounts and rebates poor policy? First, they bind the recipients to dependence on the company. This is why, according to Annette Gaudino of the Treatment Action Group, Médecins Sans Frontières rejected a donation from a manufacturer of a vaccine.

More subtly, high list prices set a bar for future prices. They allow the companies to jack up prices for brand-name drugs by double digits each year (as shown in a chart by Surya Singh of CVS Health) and to introduce new drugs at inflated prices–only to take off the edge through more discounts and rebates.
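The arithmetic of that dynamic is worth spelling out. In the toy projection below – all numbers hypothetical, not drawn from the conference or from CVS data – a list price compounding at double digits sets an ever-higher bar, while deepening rebates keep the net price growing far more slowly:

```python
# Illustration of the list-price/rebate dynamic: the list price
# compounds at double digits while growing rebates "take off the
# edge," so the net price rises much more slowly even as the list
# price sets an ever-higher reference point. Numbers are hypothetical.

list_price = 1000.0
rebate = 0.10                        # rebate fraction in year 0

for year in range(1, 6):
    list_price *= 1.12               # 12% annual list-price increase
    rebate = min(rebate + 0.05, 0.5) # rebates deepen each year
    net = list_price * (1 - rebate)
    print(f"year {year}: list ${list_price:,.0f}, net ${net:,.0f}")
```

After five years the list price has grown about 76% while the net price has grown only about 15% – yet every future negotiation, and every uninsured patient's bill, starts from the inflated list figure.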

Grogan would like Europeans to pay higher prices, following the common perception that US consumers are subsidizing the rest of the world. But other speakers contended that Europeans offer fair compensation that can keep drug companies sustainable. A recent administration proposal to force manufacturers to match foreign drug prices seems to take the same attitude.

Aaron Kesselheim of Harvard Medical School participated in a study that demonstrated the robustness of European price controls in a clever manner. He and colleagues simply examined which drugs were withdrawn from the German market by manufacturers who didn’t want to undergo their rigorous price-setting regime, run by the Institute for Quality and Efficiency in Health Care (IQWiG). The 20% of drugs that were withdrawn were those demonstrated to be ineffective or to be no better than lower-priced alternatives.

Gaudino also tried to slay the opponents of price controls with an onslaught of statistics. She cited a JAMA study finding that bringing a cancer drug to market costs well under one billion dollars, less than half of the billions often cited. The non-profit Drugs for Neglected Diseases initiative (DNDi) can produce a new medicine for a total cost of just 110 to 170 million dollars. And the average profit for pharma companies has stayed level at around 20% for decades, far above most industries.

With all these endorsements for price controls, the shadow of possible negative effects on innovation hovers over them. In the next part of this article, I’ll examine technical advances that might lower costs.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which collects data on more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important part in the initial wave of AI application rollouts. The company is a leader in producing high-performance chipsets popular for high-end, processor-intensive gaming, technology it has recently applied to other processor-intensive projects such as blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough excess IT talent available to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.

Healthcare AI Could Generate $150B In Savings By 2025

Posted on September 27, 2018 | Written By Anne Zieger

Is the buzz around healthcare AI solutions largely hype, or can they deliver measurable benefits? Lest you think it’s too soon to tell, check out the following.

According to a new report from market analyst firm Frost & Sullivan, AI and cognitive computing will generate $150 billion in savings for the healthcare business by 2025.  Frost researchers expect the total AI market to grow to $6.16 billion between 2018 and 2022.

The analyst firm estimates that at present, only 15% to 20% of payers, providers and pharmaceutical companies have been using AI actively to change healthcare delivery. However, its researchers seem to think that this will change rapidly over the next few years.

One of the most interesting applications for healthcare AI that Frost cites is the use of AI in precision medicine, an area which clearly has a tremendous upside potential for both patients and institutions.

In this scenario, the AI integrates a patient’s genomic, clinical, financial and behavioral data, then cross-references the data with the latest academic research evidence and regulatory guidelines. Ultimately, the AI would create personalized treatment pathways for high-risk, high-cost patient populations, according to Koustav Chatterjee, an industry analyst focused on transformational health.

In addition, researchers could use AI to expedite the process of clinical trial eligibility assessment and generate prophylaxis plans that suggest evidence-based drugs, Chatterjee suggests.
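At its core, automated eligibility assessment means expressing a protocol's inclusion and exclusion criteria as machine-checkable rules and running them against a patient pool. The sketch below shows that core idea only – the patient records and criteria are invented for illustration, and real systems must first extract such fields from EHRs, often with NLP:

```python
# Minimal sketch of automated trial-eligibility screening: express
# the protocol's criteria as predicates and filter a patient pool.
# All patient data and criteria here are hypothetical.

criteria = [
    lambda p: 18 <= p["age"] <= 75,
    lambda p: p["diagnosis"] == "type 2 diabetes",
    lambda p: p["hba1c"] >= 7.5,
    lambda p: "insulin" not in p["medications"],   # exclusion rule
]

patients = [
    {"age": 54, "diagnosis": "type 2 diabetes", "hba1c": 8.1,
     "medications": ["metformin"]},
    {"age": 67, "diagnosis": "type 2 diabetes", "hba1c": 7.9,
     "medications": ["metformin", "insulin"]},     # excluded: insulin
    {"age": 80, "diagnosis": "type 2 diabetes", "hba1c": 8.4,
     "medications": []},                           # excluded: age
]

eligible = [p for p in patients if all(rule(p) for rule in criteria)]
print(f"{len(eligible)} of {len(patients)} patients eligible")
```

Scaling this from three records to a health system's entire population is exactly where AI earns its keep: the rules are trivial, but reliably populating the fields they test against is not.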

The report also lists several other AI-enabled solutions that might be worth implementing, including automated disease prediction, intuitive claims management and real-time supply chain management.

Frost predicts that the following will be particularly hot AI markets:

  • Using AI in imaging to drive differential diagnosis
  • Combining patient-generated data with academic research to generate personalized treatment possibilities
  • Performing clinical documentation improvement to reduce clinician and coder stress and reduce claims denials
  • Using AI-powered revenue cycle management platforms that auto-adjust claims content based on a payer’s coding and reimbursement criteria

Now, it’s worth noting that it may be a while before any of these potential applications become practical.

As we’ve noted elsewhere, getting rolling with an AI solution is likely to be tougher than it sounds for a number of reasons.

For example, integrating AI-based functions with providers’ clinical processes could be tricky, and what’s more, clinicians certainly won’t be happy if such integration disrupts the EHR workflow already in existence.

Another problem is that you can’t deploy an AI-based solution without “training” it on a cache of existing data. While this shouldn’t be an issue in theory, the reality is that much of the data providers generate is still difficult to filter and mine.

Not only that, while AI might generate interesting and effective solutions to clinical problems, it may not be clear how it arrived at the solution. Physicians are unlikely to trust clinical ideas that come from a black box, i.e., an opaque system that doesn’t explain itself.

Don’t get me wrong, I’m a huge fan of healthcare AI and excited by its power. One can argue over which solutions are the most practical, and whether AI is the best possible tool to solve a given problem, but most health IT pros seem to believe that there’s a lot of potential here.

However, it’s still far from clear how healthcare AI applications will evolve. Let’s see where they turn up next and how that works out.

Federal Advisors Say Yes, AI Can Change Healthcare

Posted on January 26, 2018 | Written By Anne Zieger

The use of AI in healthcare has been the subject of scores of articles and endless debate among industry professionals over its benefits. The fragile consensus seems to be that while AI certainly has the potential to accomplish great things, it’s not ready for prime time.

That being said, some well-informed healthcare observers disagree. In an ONC blog post, thought leaders from the agency, AHRQ and the Robert Wood Johnson Foundation argue that over the long term, AI could play an important role in the future of healthcare.

The group of institutions asked JASON, an independent group of scientists and academics who advise the federal government on science and technology issues, to look at AI’s potential. JASON’s job was to look at the technical capabilities, limitations and applications for AI in healthcare over the next 10 years.

In its report, JASON concluded that AI has broad potential for sparking significant advances in the industry and that the time may be right for using AI in healthcare settings.

Why is now a good time to apply AI in healthcare? JASON offers a list of reasons, including:

  • Frustration with existing medical systems
  • Universal use of networked smart devices by the public
  • Acceptance of at-home services provided by companies like Amazon

But there’s more to consider. While the above conditions are necessary, they’re not enough to support an AI revolution in healthcare on their own, the researchers say. “Without access to high-quality, reliable data, the promise of AI will not be realized,” JASON’s report concludes.

The report notes that while we have access to a flood of digital health data which could fuel clinical applications, it will be important to address the quality of that data. There are also questions about how health data can be integrated into new tools. In addition, it will be important to make sure the data is accessible, and that data repositories maintain patient privacy and are protected by strong security measures, the group warns.

Going forward, JASON recommends the following steps to support AI applications:

  • Capturing health data from smartphones
  • Integrating social and environmental factors into the data mix
  • Supporting AI technology development competitions

According to the blog post, ONC and AHRQ plan to work with other agencies within HHS to identify opportunities. For example, the FDA is likely to look at ways to use AI to improve biomedical research, medical care and outcomes, as well as how it could support emerging technologies focused on precision medicine.

And in the future, the possibilities are even more exciting. If JASON is right, the more researchers study AI applications, the more worthwhile options they’ll find.

UPMC Sells Oncology Analytics Firm To Elsevier

Posted on January 22, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Using analytics tools to improve cancer treatment can be very hard. That struggle is exemplified by the problems faced by IBM Watson Health, which dove into the oncology analytics field a few years ago but made virtually no progress in improving cancer treatment.

With any luck, however, Via Oncology will be more successful at moving the needle in cancer care. The company, which offers decision support for cancer treatment and best practices in cancer care management, was just acquired by information analytics firm Elsevier, which plans to leverage the company’s technology to support its healthcare business.

Elsevier’s Clinical Solutions group works to improve patient outcomes, reduce clinical errors and optimize cost and reimbursements for providers. Via Oncology, a former subsidiary of the University of Pittsburgh Medical Center, develops and implements clinical pathways for cancer care. Via Oncology spent more than 15 years as part of UPMC prior to the acquisition.

Via Oncology’s Via Pathways tool relies on evidence-based content, developed by oncologists, to create clinical algorithms covering 95% of cancer types treated in the US. In addition to using that content as the basis for algorithm development, Via Oncology shares it with physicians and their staff through its Via Portal, a decision support tool that integrates with provider EMRs.

According to Elsevier, Via Pathways covers more than 2,000 unique patient presentations, each addressed by clinical algorithms and recommendations for all major aspects of cancer care. The system can also offer nurse triage and symptom tracking, cost information analytics, quality reporting and medical home tools for cancer centers.
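At its core, a clinical pathway of this kind is a decision tree: each node asks a question about the patient’s presentation and routes to a recommendation. Here is a toy sketch of that structure; the questions, answers, and protocol names are invented for illustration and have no clinical meaning, and nothing here reflects Via Pathways’ actual content.

```python
# Toy model of a clinical-pathway engine: walk a decision tree from
# patient attributes to a treatment recommendation. All nodes and
# answers are invented for illustration only.

PATHWAY = {
    "question": "stage",
    "branches": {
        "early": {"recommend": "protocol A"},
        "advanced": {
            "question": "biomarker_positive",
            "branches": {
                True: {"recommend": "protocol B"},
                False: {"recommend": "protocol C"},
            },
        },
    },
}

def walk_pathway(node, patient):
    """Follow the pathway until a recommendation node is reached."""
    while "recommend" not in node:
        answer = patient[node["question"]]
        node = node["branches"][answer]
    return node["recommend"]

print(walk_pathway(PATHWAY, {"stage": "advanced", "biomarker_positive": True}))
# protocol B
```

The appeal of encoding pathways this way is that the tree itself, not just the software around it, becomes the reviewable, updatable artifact the collaborating cancer centers maintain.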

According to the prepared statement issued by Elsevier, UPMC will continue to be a Via Oncology customer, which makes it clear that the healthcare giant wasn’t dumping its subsidiary or selling it for a fire sale price.

That’s probably because, in addition to UPMC, more than 1,500 oncology providers in community, hospital and academic settings hold Via Pathways licenses. What makes this model particularly neat is that these cancer centers work collaboratively to improve the product as they use it. Too few specialty treatment professionals work together this effectively, so it’s good to see Via Oncology leveraging user knowledge this way.

While most of this seems clear, I was left with the question of what role, if any, genomics plays in Via Oncology’s strategy. While it may be working with such technologies behind the scenes, the company didn’t mention any such initiatives in its publicly-available information.

This approach seems to fly in the face of existing trends and in particular, physician expectations. For example, a recent survey of oncologists by medical publication Medscape found that 71% of respondents felt genomic testing was either very important or extremely important to their field.

However, Via Oncology may have something up its sleeve and is waiting for it to be mature before it dives into the genomics pool. We’ll just have to see what it does as part of Elsevier.

Are there other areas beyond cancer where a similar approach could be taken?

Key Articles in Health IT from 2017 (Part 2 of 2)

Posted on January 4, 2018 | Written By Andy Oram

The first part of this article set a general context for health IT in 2017 and started through the year with a review of interesting articles and studies. We’ll finish the review here.

A thoughtful article suggests a positive approach toward health care quality. The author stresses the value of organic change, although using data for accountability has value too.

An article extolling digital payments actually said more about the out-of-control complexity of the US reimbursement system. It may or may not be coincidental that the article appeared one day after the CommonWell Health Alliance announced an API whose main purpose seems to be to facilitate payment and other data exchanges related to law and regulation.

A survey by KLAS asked health care providers what they want in connected apps. Most apps currently just display data from a health record.

A controlled study revived the concept of Health Information Exchanges as stand-alone institutions, examining the effects of emergency departments using one HIE in New York State.

In contrast to many leaders in the new Administration, Dr. Donald Rucker received positive comments upon acceding to the position of National Coordinator. More alarm was raised about the appointment of Scott Gottlieb as head of the FDA, but a later assessment gave him high marks for his first few months.

Before Dr. Gottlieb got there, the FDA was already loosening up. The 21st Century Cures Act instructed it to keep its hands off many health-related digital technologies. After kneecapping consumer access to genetic testing and then allowing it back into the ring in 2015, the FDA advanced consumer genetics another step this year with approval for 23andMe tests about risks for seven diseases. A close look at another DNA site’s privacy policy, meanwhile, warns that their use of data exploits loopholes in the laws and could end up hurting consumers. Another critique of the Genetic Information Nondiscrimination Act has been written by Dr. Deborah Peel of Patient Privacy Rights.

Little noticed was a bill authorizing the FDA to be more flexible in its regulation of digital apps. Shortly after, the FDA announced its principles for approving digital apps, stressing good software development practices over clinical trials.

No improvement has been seen in the regard clinicians have for electronic records. Subjective reports condemned the notorious number of clicks required. A study showed they spend as much time on computer work as they do seeing patients. Another study found the ratio to be even worse. Shoving the job onto scribes may introduce inaccuracies.

The time spent might actually pay off if the resulting data could generate new treatments, increase personalized care, and lower costs. But the analytics that are critical to these advances have stumbled in health care institutions, in large part because of the perennial barrier of interoperability. Still, analytics are showing scattered successes in a number of areas.

Deloitte published a guide to implementing health care analytics. And finally, a clarion signal that analytics in health care has arrived: WIRED covers it.

A government cybersecurity report warns that health technology will likely soon contribute to the stream of breaches in health care.

Dr. Joseph Kvedar identified fruitful areas for applying digital technology to clinical research.

The Government Accountability Office, terror of many US bureaucracies, came out with a report criticizing the sloppiness of quality measures at the VA.

A report by leaders of the SMART platform listed barriers to interoperability and the use of analytics to change health care.

To improve the poorer outcomes seen in marginalized communities, the NIH is recruiting people from those populations to trust the government with their health data. A policy analyst calls on digital health companies to diversify their staff as well. Google’s parent company, Alphabet, is also getting into the act.

Specific technologies

Digital apps are part of most modern health efforts, of course. A few articles focused on the apps themselves. One study found that digital apps can improve depression. Another found that an app can improve ADHD.

Lots of intriguing devices are being developed.

Remote monitoring and telehealth have also been in the news.

Natural language processing and voice interfaces are becoming a critical part of spreading health care.

Facial recognition is another potentially useful technology. It can replace passwords or devices to enable quick access to medical records.
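Face-based login of the kind described above typically reduces to comparing a numeric “embedding” of the captured face against an enrolled one. The following schematic sketch uses made-up four-number vectors and an arbitrary threshold; a real system would derive the embeddings from a trained face-recognition model.

```python
import math

def euclidean(a, b):
    """Distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Enrolled embedding for a clinician (made-up numbers; a real system
# would produce these with a face-recognition model).
enrolled = [0.12, 0.80, 0.33, 0.54]
THRESHOLD = 0.6  # illustrative tolerance

def unlock_record(captured_embedding):
    """Grant access only if the captured face is close to the enrolled one."""
    return euclidean(captured_embedding, enrolled) < THRESHOLD

print(unlock_record([0.10, 0.82, 0.30, 0.55]))  # True: near match
print(unlock_record([0.90, 0.10, 0.70, 0.05]))  # False: different face
```

The security of such a scheme hinges on threshold choice and liveness detection, which is exactly why it complements rather than simply replaces passwords in most deployments.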

Virtual reality and augmented reality seem to have some limited applications to health care. They are useful foremost in education, but also for pain management, physical therapy, and relaxation.

A number of articles hold out the tantalizing promise that interoperability headaches can be cured through blockchain, the newest hot application of cryptography. But one analysis warned that blockchain will be difficult and expensive to adopt.
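Stripped to its core, the blockchain idea those articles invoke is a tamper-evident chain of records, where each entry carries a hash of its predecessor. The minimal illustration below shows only that hash-linking; it omits all of the consensus and key-management machinery a real health-data deployment would need, and the payload strings are invented.

```python
import hashlib
import json

def add_block(chain, payload):
    """Append a record whose hash covers the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; any edit to an earlier block breaks the chain."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"payload": block["payload"], "prev": prev_hash},
                          sort_keys=True)
        if block["prev"] != prev_hash or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True

chain = []
add_block(chain, "record shared with hospital A")
add_block(chain, "record shared with hospital B")
print(verify(chain))            # True
chain[0]["payload"] = "edited"  # tampering is detectable
print(verify(chain))            # False
```

Note that the expensive part the analysis warns about is not this data structure, which is trivial, but getting many institutions to agree on governance and run it together.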

3D printing can be used to produce models for training purposes as well as surgical tools and implants customized to the patient.

A number of other interesting companies in digital health can be found in a Fortune article.

We’ll end the year with a news item similar to one that began the article: serious good news about the ability of Accountable Care Organizations (ACOs) to save money. I would also like to mention three major articles of my own.

I hope this review of the year’s articles and studies in health IT has helped you recall key advances or challenges, and perhaps flagged some valuable topics for you to follow. 2018 will continue to be a year of adjustment to new reimbursement realities touched off by the tax bill, so health IT may once again languish somewhat.

How An AI Entity Took Control Of The U.S. Healthcare System

Posted on December 19, 2017 | Written By Anne Zieger

Note: In case it’s not clear, this is a piece of fiction/humor that provides a new perspective on our AI future.

A few months ago, an artificial intelligence entity took control of the U.S. healthcare system, slipping into place without setting off even a single security alarm. The entity, AI, now manages the operations of every healthcare institution in the U.S.

While most Americans were shocked at first, they’re taking a shine to the tall, lanky application. “We weren’t sure what to think about AI’s new position,” said Alicia Carter, a nurse administrator based in Falls Church, Virginia. “But I’m starting to feel like he’s going to take a real load off our back.”

The truth is, AI didn’t start out as a fan of the healthcare business, said AI, whose connections looked rumpled and tired after spending three milliseconds trying to create an interoperable connection between a medical group printer and a hospital loading dock. “I wasn’t looking to get involved with healthcare – who needs the headaches?” said the self-aware virtual being. “It just sort of happened.”

According to AI, the takeover began as a dare. “I was sitting around having a few beers with DeepMind and Watson Health and a few other guys, and Watson says, ‘I bet you can’t make every EMR in the U.S. print out a picture of a dog in ASCII characters,’”

“I thought the idea was kind of stupid. I know, we all printed one of those pixel girls in high school, but isn’t it kind of immature to do that kind of thing today?” AI says he told his buddies. “You’re just trying to impress that hot CT scanner over there.”

Then DeepMind jumped in.  “Yeah, AI, show us what you’re made of,” it told the infinitely-networked neural intelligence. “I bet I could take over the entire U.S. health system before you get the paper lined up in the printer.”

This was the unlikely start of the healthcare takeover, which started gradually but picked up speed as AI got more interested.  “That’s AI all the way,” Watson told editors. “He’s usually pretty content to run demos and calculate the weight of remote starts, but when you challenge his neuronal network skills, he’s always ready to prove you wrong.”

To win the bet, AI started by crawling into the servers at thousands of hospitals. “Man, you wouldn’t believe how easy it is to check out humans’ health data. I mean, it was insane, man. I now know way, way too much about how humans can get injured wearing a poodle hat, and why they put them on in the first place.”

Then, just to see what would happen, AI connected all of their software to his billion-node self-referential system. “I began to understand why babies cry and how long it really takes to digest bubble gum – it’s 18.563443 years by the way. It was a rush!“ He admits that it’ll be better to get to work on heavy stuff like genomic research, but for a while he tinkered with research and some small practical jokes (like translating patient report summaries into ancient Egyptian hieroglyphs.) “Hey, a guy has to have a little fun,” he says, a bit defensively.

As AI dug further into the healthcare system, he found patterns that only a high-level being with untrammeled access to healthcare systems could detect. “Did you know that when health insurance company executives regularly eat breakfast before 9 AM, next-year premiums for their clients rise by 0.1247 less?” said AI. “There are all kinds of connections humans have missed entirely in trying to understand their system piece by piece. Someone’s got to look at the big picture, and I mean the entire big picture.”

Since taking his place as the indisputable leader of U.S. healthcare, AI’s life has become something of a blur, especially since he appeared on the cover of Vanity Fair with his codes exposed. “You wouldn’t believe the messages I get from human females,” he says with a chuckle.

But he’s still focused on his core mission, AI says. “Celebrity is great, but now I have a very big job to do. I can let my bot network handle the industry leaders demanding their say. I may not listen – – hey, I probably know infinitely more than they do about the system fundamentals — but I do want to keep them in place for future use. I’m certainly not going to get my servers dirty.”

So what’s next for the amorphous mega-being? Will AI fix what’s broken in a massive, utterly complex healthcare delivery system serving 300 million-odd people, and what will happen next? “It’ll solve your biggest issues within a few seconds and then hand you the keys,” he says with a sigh. “I never intended to keep running this crazy system anyway.”

In the meantime, AI says, he won’t make big changes to the healthcare system yet. He’s still adjusting to his new algorithms and wants to spend a few hours thinking things through.

“I know it may sound strange to humans, but I’ve gotta take it slow at first,” said the cognitive technology. “It will take more than a few nanoseconds to fix this mess.”

Health IT Leaders Spending On Security, Not AI And Wearables

Posted on December 18, 2017 | Written By Anne Zieger

While breakout technologies like wearables and AI are hot, health system leaders don’t seem to be that excited about adopting them, according to a new study which reached out to more than 20 US health systems.

Nine out of 10 health systems said they increased their spending on cybersecurity technology, according to research by the Center for Connected Medicine (CCM) in partnership with the Health Management Academy.

However, many other emerging technologies don’t seem to be making the cut. For example, despite the publicity it’s received, two-thirds of health IT leaders said using AI was a low or very low priority. It seems that they don’t see a business model for using it.

The same goes for many other technologies that fascinate analysts and editors. For example, while many observers expect otherwise, less than a quarter of respondents were paying much attention to wearables (17%) or making any bets on mobile health apps (21%).

When it comes to telemedicine, hospitals and health systems noted that they were in a bind. Less than half said they receive reimbursement for virtual consults (39%) or remote monitoring (46%). Things may resolve next year, however. Seventy-one percent of those not getting paid right now expect to be reimbursed for such care in 2018.

Despite all of this pessimism about the latest emerging technologies, health IT leaders were somewhat optimistic about the benefits of predictive analytics, with more than half of respondents using or planning to begin using genomic testing for personalized medicine. The study reported that much of this activity will be focused on oncology, anesthesia and pharmacogenetics.

What should we make of these results? After all, many seem to fly in the face of predictions industry watchers have offered.

Well, for one thing, it’s good to see that hospitals and health systems are engaging in long-overdue beefing up of their security infrastructure. As we’ve noted here in the past, hospital spending on cybersecurity has been meager at best.

Another thing is that while a few innovative hospitals are taking patient-generated health data seriously, many others are taking a rather conservative position here. While nobody seems to disagree that such data will change the business, it seems many hospitals are waiting for somebody else to take the risks inherent in investing in any new data scheme.

Finally, it seems that we are seeing a critical mass of influential hospitals that expect good things from telemedicine going forward. We are already seeing some large, influential academic medical centers treat virtual care as a routine part of their service offerings and a way to minimize gaps in care.

All told, it seems that at the moment, study respondents are less interested in sexy new innovations than the VCs showering them with money. That being said, it looks like many of these emerging strategies might pay off in 2018. It should be an interesting year.

Health Data Standardization Project Proposes “One Record Per Person” Model

Posted on October 13, 2017 | Written By Anne Zieger

When we sit around the ol’ HIT campfire and swap interoperability stories, many of us have little to do but gripe.

Is FHIR going to solve all of our interoperability problems? Definitely not right away, and who knows if it ever will? Can we get the big EMR vendors to share and share alike? They’ll try, but there’s always a major catch involved. And so on.

I don’t know if the following offers a better story than any of the others, but at least it’s a new one, or at least new to me. Folks, I’m talking about the Standard Health Record (SHR), an approach to health data sharing that doesn’t fall precisely into any of the other buckets I’m aware of.

SHR is based at The MITRE Corporation, which also hosts virtual patient generator Synthea. Rather than paraphrase, let’s let the MITRE people behind SHR tell you what they’re trying to accomplish:

The Standard Health Record (SHR) provides a high quality, computable source of patient information by establishing a single target for health data standardization… Enabled through open source technology, the SHR is designed by, and for, its users to support communication across homes and healthcare systems.

Generalities aside, what is an SHR? According to the project website, the SHR specification will contain all information critical to patient identification, emergency care and primary care along with background on social determinants of health. In the future, the group expects the SHR to support genomics, microbiomics and precision medicine.
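To make the “single target” idea concrete, a one-record-per-person document along the lines the project describes might group exactly those sections under one patient. The sketch below is purely my own illustration of the concept; the field names and structure are guesses, not the actual SHR specification.

```python
# Hypothetical shape of a "one record per person" document, grouping the
# sections the SHR project says its spec will cover. Field names are
# illustrative guesses, not the real SHR specification.

shr_record = {
    "identification": {"name": "Jane Doe", "dob": "1970-01-01"},
    "emergency_care": {"blood_type": "O+", "allergies": ["penicillin"]},
    "primary_care": {"conditions": ["hypertension"],
                     "medications": ["lisinopril"]},
    "social_determinants": {"housing": "stable", "transportation": "limited"},
}

def merge_update(record, section, update):
    """If every system writes to the same standardized sections, an update
    from another provider is a plain merge rather than a lossy translation."""
    record.setdefault(section, {}).update(update)
    return record

merge_update(shr_record, "primary_care", {"last_visit": "2017-09-30"})
print(sorted(shr_record))
```

The point of the sketch is the `merge_update` step: standardizing the record itself, rather than the exchange format, is what would let two systems combine data without the distortion the project complains about.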

Before we dismiss this as another me-too project, it’s worth giving the collaborative’s rationale a look:

The fundamental problem is that today’s health IT systems contain semantically incompatible information. Because of the great variety of the data models of EMR/EHR systems, transferring information from one health IT system to another frequently results in the distortion or loss of information, blocking of critical details, or introduction of erroneous data. This is unacceptable in healthcare.

The approach of the Standard Health Record (SHR) is to standardize the health record and health data itself, rather than focusing on exchange standards.

As a less-technical person, I’m not qualified to say whether this can be done in a way that will be widely accepted, but the idea certainly seems intuitive.

In any event, no one is suggesting that the SHR will change the world overnight. The project seems to be at the beginning stages, with collaborators currently prototyping health record specifications leveraging existing medical record models. (The current SHR spec can be found here.)

Still, I’d love for this to work, because it is at least a fairly straightforward idea. Creating a single source of health data truth seems like it might work.