
Providers Tell KLAS That Existing EMRs Can’t Handle Genomic Medicine

Posted on November 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Providers are still in the early stages of applying genomics to patient care. However, at least among providers that can afford the investment, clinical genomics programs are beginning to become far more common, and as a result, we’re beginning to get a sense of what’s involved.

Apparently, one of those things might be creating a new IT infrastructure which bypasses the provider’s existing EMR to support genomics data management.

KLAS recently spoke with a number of providers about the vendors and technologies they were using to implement precision medicine. Along the way, it gathered information on best practices that other providers can use to roll out their own programs.

In its report, “Precision Medicine Provider Validations 2018,” KLAS researchers assert that while precision medicine tools have become increasingly common in oncology settings, they can be useful in many other settings.

Which vendors they should consider depends on what their organization’s precision medicine objectives are, according to one VP interviewed by the research firm. “Organizations need to consider whether they want to target a specific area or expand the solutions holistically,” the VP said. “They [also] need to consider whether they will have transactional relationships with vendors or strategic partnerships.”

Another provider executive suggests that investing in specialty technology might be a good idea. “Precision medicine should really exist outside of EMRs,” one provider president/CEO told KLAS. “We should just use software that comes organically with precision medicine and then integrated with an EMR later.”

At the same time, however, don’t expect any vendor to offer you everything you need for precision medicine, a CMO advised. “We can’t build a one-size-fits-all solution because it becomes reduced to meaninglessness,” the CMO told KLAS. “A hospital CEO thinks about different things than an oncologist.”

Be prepared for a complicated data sharing and standardization process. “We are trying to standardize the genomics data on many different people in our organization so that we can speak a common language and archive data in a common system,” another CMO noted.

At the same time, though, make sure you gather plenty of clinical data with an eye to the future, suggests one clinical researcher. “There are always new drugs and new targets, and if we can’t test patients for them now, we won’t catch things later,” the researcher said.

Finally, and this will be a big surprise, brace yourself for massive data storage demands. “Every year, I have to go back to our IT group and tell them that I need another 400 terabytes,” one LIS manager told the research firm. “When we are starting to deal with 400 terabytes here and 400 terabytes there, we’re looking at potentially petabytes of storage after a very short period of time.”
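To put the LIS manager's numbers in perspective, here's a back-of-the-envelope sketch. The 400 TB annual request is the figure quoted above; the number of groups making similar requests is an invented assumption for illustration.

```python
# Rough sketch of how "400 TB here, 400 TB there" becomes petabytes.
ANNUAL_REQUEST_TB = 400   # per group, per year (the quoted figure)
GROUPS = 3                # hypothetical number of groups with similar needs
TB_PER_PB = 1000

for year in range(1, 4):
    total_tb = ANNUAL_REQUEST_TB * GROUPS * year
    print(f"Year {year}: {total_tb:,} TB = {total_tb / TB_PER_PB:.1f} PB")
```

With just three such groups, the organization crosses a petabyte of genomics storage in its first year.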

If you’re like me, the suggestion that providers need to build a separate infrastructure outside the EMR to create a precision medicine program is pretty surprising, but the consensus seems to be that this is the case. Almost three-quarters of providers interviewed by KLAS said they don’t believe that their EMR will have a primary role in the future of precision medicine, with many suggesting that the EMR vendor won’t be viable going forward as a result.

I doubt that this will be an issue in the near term, as the barriers to creating a genomics program are high, especially the capital requirements. However, if I were Epic or Cerner, I’d take this warning seriously. While I doubt that every provider will manage their own genomics program directly, precision medicine will be part of all care at some point and is already having an influence on how a growing number of conditions are treated.

Conference on Drug Pricing Injects New Statistics Into Debate, Few New Insights (Part 2 of 2)

Posted on November 9, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The first part of this article described the upward pressures on costs and some of the philosophical debates over remedies. This section continues the discussion with several different angles on costs.

Universal access and innovation

It’s easy to call health care a human right. But consider an analogy: housing could also be considered a human right, yet no one has the right to a twenty-room mansion. Modern drug and genetic research are creating the equivalents of many twenty-room mansions, and taking up residence means the difference between life and death for someone, or perhaps between a long productive life and one of pain and deformity.

Universal access, often through a single-payer system, is in widespread use in every developed country except the United States. Both universal access and single payer are credited with keeping down the costs of health care, including drugs. It makes sense to link single-payer with lower drug costs, because of the basic rules of economics: size gives a buyer clout, as we can see in the way Walmart lords it over its suppliers (documented in a 2006 book, The Wal-Mart Effect, by Charles Fishman). At the conference, Sean Dickson from the Pew Charitable Trusts gave what he called an “economics 101 course” on health care and how the industry diverges from an ideal market. (He did not come out in favor of single-payer, though.)

How much fat can be cut from pharma? My guess is a lot. As we saw in the previous section, profits from pharmaceuticals tower above profits in most industries. But we don’t have to stop at simply shaving payments to shareholders, or even management compensation. I know from attending extravagant health care conferences that there’s a lot of free cash floating around the health care industry in general, although it’s unevenly distributed. (Many hospitals, nursing homes, and other institutions are struggling to maintain adequate staffing.) In industries possessing such easy money, it does trickle down somewhat. Gaudino pointed out ruefully that health care is one of the few fields left that can give ordinary people a middle-class income, something we don’t want to lose even as employment continues to rise in that space. But easy money also leads to bloat, and this is almost certainly true throughout health care, including pharma.

Even so, projections of the cost of universal access are dizzyingly high, placing pressure on the historic universal access model in Massachusetts and forcing Vermont to give up single-payer. The pressures that could be applied to the health care field by the US government would certainly outweigh the negligible impact that Vermont–with its population of a mere 600,000–could exert. But it’s unlikely that the easy wins falling out of single-payer (squeezing drug companies, eliminating the administrative overhead of handling health insurance) could make up for the staggering costs of adding whole new swaths of a high-need, difficult population to government rolls.

What we need to lower health costs is an overhaul of the way health care systems conceive of patients, taking them from conception to the grave and revamping to treat chronic conditions. T.R. Reid, in his book The Healing of America, says that universal access must come first and that all the rest will gradually follow. I would like to have at least a strong concept throughout the health care system of what the new paradigm will be, before we adopt single-payer. And in theory, adopting that paradigm will fix our cost problems without the wrenching and contentious move to single-payer.

What non-profits can teach us

So how do we recompense manufacturers while getting drugs to low-income people who need them? Some interesting insights did turn up here at the conference, through a panel titled From Development to Delivery Globally. All three speakers operate outside the normal market. One is a representative of Gilead Sciences (mentioned earlier), whereas the other two represent leading non-profits in international health care, Partners in Health and the Bill & Melinda Gates Foundation. Nevertheless, their successes teach us something about how to bend the cost curve in traditional markets.

Flood said that Gilead Sciences made an early commitment to get its AIDS drug to all who needed it, without regard to profit. At first it manufactured the drug and distributed it in sub-Saharan Africa at cost. That failed partly because the cost was still out of reach for most patients, but also because the distribution pipeline was inadequate: logistics and government support were lacking.

So Gilead took a new tack: it licensed the drug to Indian manufacturers who not only could produce it at a very low cost (while maintaining quality), but understood the sub-Saharan areas and had infrastructure there for distributing the drug. This proved highly successful. I’m betting we’ll find more drugs manufactured in India over time.

Hannah Kettler of the Gates Foundation described how they set 50 cents as an affordable price for a meningitis vaccination, then went on to obtain that price in a sustainable manner. The key was to hook up potential buyers and manufacturers in advance. The buyers guaranteed a certain number of bulk purchases if the manufacturers could achieve the desired price. And armed with a huge guaranteed market, the manufacturers scaled up production so as to reduce costs and meet the price goal.

The Gates model looks valuable for a number of drugs: guarantee an advance market and start out manufacturing at a large scale to reduce costs. This will not help with orphan diseases, of course.
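The mechanics of the Gates model boil down to amortizing fixed costs over a guaranteed volume. A toy cost model makes the point; every figure here except the 50-cent target is invented for illustration.

```python
# Toy cost model: a guaranteed bulk market lets fixed costs be spread
# over enough doses to hit a low target price. All numbers are
# hypothetical except the $0.50 affordability target mentioned above.
FIXED_COST = 40_000_000        # plant, approvals, tooling (assumed)
VARIABLE_COST_PER_DOSE = 0.30  # materials and labor per dose (assumed)
TARGET_PRICE = 0.50            # the Gates Foundation's affordability target

def unit_cost(doses: int) -> float:
    """Average cost per dose at a given production volume."""
    return VARIABLE_COST_PER_DOSE + FIXED_COST / doses

for doses in (10_000_000, 100_000_000, 500_000_000):
    print(f"{doses:>11,} doses -> ${unit_cost(doses):.2f}/dose")
```

At a speculative 10 million doses the fixed costs dominate and the price is several dollars; only a guaranteed market in the hundreds of millions of doses pushes the average below the target, which is why the advance commitment, not charity, is what makes the price sustainable. It also shows why the model fails for orphan diseases: the volume simply isn't there.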

More generally, in my opinion, developed countries have to define their incentive to provide aid of any kind–medicine, education, microloans, or whatever. Is it enough of an incentive to empower women and keep population growth under control? To avoid social conflicts that turn into civil wars? To avoid mass emigration and refugee crises? What are solutions worth to us?

The contributions of artificial intelligence

Aside from brief mentions of advanced analytics by Gaudino and Taylor, the promise of computer technology came up mainly in the final panel of the conference, where Petrie-Flom research fellow Sara Gerke offered some examples of massive cost savings that AI has created at various points in the drug development chain. These tend to be isolated success stories, but illustrate a trend that could relieve pressure on prices.

I have reported on the use of AI in drug development in other articles over the years. This section consolidates what I’ve seen: although AI can potentially help at any point in an industry’s business, it seems particularly fertile in two parts of drug development.

The first area is the initial discovery of compounds. Traditional research can be supercharged by analyses of patient genes, simulations of molecule behaviors, and other ways of extracting needles from haystacks.

The second area is the conduct of the clinical trial. Here, techniques being tried by drug companies are variants of what clinicians are doing to engage and monitor patients. For instance, clinical subjects can wear devices with minimal disruption to their lives, and report vital signs back to researchers on an ongoing basis instead of having to come into the researcher’s office. AI can also find suitable subjects, increasing the potential pool. Analytics may reveal early whether a clinical trial is not working, allowing the company to save money by shutting it down early, and avoiding harm to subjects.
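The early-stopping idea can be sketched as an interim futility check on a two-arm trial: look at the accumulating response rates, and if the signal is weak, halt. This is an illustrative sketch only; the counts and the futility bound are invented, and real trials use pre-specified group-sequential designs with properly calibrated boundaries.

```python
import math

# Illustrative futility check at an interim look of a two-arm trial.
def z_two_proportions(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for the difference between two response rates."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled response rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error
    return (p1 - p2) / se

# Hypothetical interim data: 24/100 responders on drug vs 22/100 on control.
z = z_two_proportions(24, 100, 22, 100)
FUTILITY_Z = 0.5   # invented bound: stop early if the signal is this weak
print(f"interim z = {z:.2f}; stop for futility: {z < FUTILITY_Z}")
```

A near-zero z at the halfway point is exactly the kind of early warning that lets a sponsor shut a trial down, saving money and sparing subjects.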

Of course, we all look forward to some marvelous breakthrough–the penicillin of the 21st century–that will suddenly open up miracle treatments at low cost for a myriad of illnesses. Current research is pushing this medical eschaton further and further off into the unforeseeable future. We are learning that the genome and human molecules interact in ways that are much more complex than we thought, that a lot is dependent on the larger biome, and that diseases are also cleverer than we thought and able to work around many of our attacks.

Analytics will certainly accelerate medical discoveries. In doing so, it could drastically reduce the costs of drug discovery, and therefore reduce risk and ultimately prices. But stunning new drugs for rare diseases could also vastly increase prices.

Baby steps

I’ll end with a few suggestions made by conference participants to create a more competitive market or reduce prices. Outside of explicit price setting (on which participants were deeply split), the proposals looked like small contributions to a situation that requires something big and bold.

  • Price transparency came up several times.
  • Grogan would like Congress to re-examine reimbursement for Medicare Part D (especially the donut hole and catastrophic coverage) to give both PBMs and vendors incentives to lower costs.
  • Gaudino said that Australia does a much better job than the US of collecting data on the outcomes of using drugs, which they can use to determine whether to approve the drugs. The U.S. payment system is more privatized and fragmented, making it impossible to collect the necessary data.
  • Caljouw praised the efforts of the Massachusetts Health Policy Commission, which has no power to set costs but meets with providers and asks them to reconsider the factors that lead to jacked-up prices.
  • Caljouw also mentioned laws requiring price transparency from PBMs.
  • Several participants suggested reversing the decision that allowed companies to air advertisements directly to consumers. (I’m afraid that if all the misleading drug ads disappeared from the air, a bunch of television networks would go out of business.)
  • Taylor cited pressure by Wall Street on drug companies to maximize prices without regard for the social impacts–an intense kind of pressure felt by no other industry except fossil fuels–and called for the extension of socially responsible investment to drug companies.

I’d like to suggest, in conclusion, that we may be focusing too much on manufacturers, who are taking enormous risks to cure difficult diseases. A University of Southern California study found that 41% of the price is absorbed by intermediaries: wholesalers, pharmacies, PBMs, and insurers. Whether through single-payer or through other changes to the health care system, we can do a lot without constricting innovators.

Conference on Drug Pricing Injects New Statistics Into Debate, Few New Insights (Part 1 of 2)

Posted on November 8, 2018 | Written By

Andy Oram

The price of medications has become a leading social issue, distorting economies around the world and providing easy talking points to politicians of all parties (not that they know how to solve the problem). Last week I attended a conference on the topic at the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School.

On one level, the increasing role that drugs play in health care is salutary. Wouldn’t you rather swallow a pill than go in for surgery, with the attendant risks of anesthesia, postoperative pain opiates, and exposure to the increasingly scary bacteria that lurk in hospitals? Wouldn’t you rather put up with a few (usually) minor side effects of medication than the protracted recovery and discomfort of invasive operations? And even when priced in the tens of thousands, drugs are usually cheaper than the therapies they replace.

But drug costs are also deeply disrupting society. They are more and more dominant in the health care costs that take up nearly a fifth of the total output of the U.S., and the outsized demands that medications put on both private and public pocketbooks lead to drug pricing being a rare bipartisan issue.

Michael Caljouw from Blue Cross Blue Shield of Massachusetts pointed out at the conference that in Massachusetts, health care has skyrocketed from 20% to 45% of the entire state budget in 20 years, and similar trends are found in other states. He said that an expensive new drug can “blow through” budgets set a year in advance. Bach cited statistics showing that the prices for cancer drugs are rising exponentially, while the drugs get only slightly more effective over time.

Drug costs also eat into the limited savings of the elderly, dragging many into bankruptcy or poverty. As reported at the conference by Peter Bach of the Memorial Sloan Kettering Cancer Center, high costs drive away many patients who would benefit from the medications, thus leading to worse health care conditions down the line.

Similar problems can be seen internationally as well.

Petrie-Flom drew together a stellar roster of speakers and panelists for its one-day conference. However, when one shakes out all the statistics and recommendations, the experts turn out to lack answers. Their suggestions look like tinkering around the edges, just as the federal government did over the past year with new rules such as citing prices in drug ads and tweaking the Medicare Part D reimbursement formulas. Thus, I will not tediously cover all the discussions at the conference. I will instead raise some key issues while tapping into these discussions for fodder.

The loudest statement at the conference was the silence of the pharma industry. Representatives of everyone you could imagine with skin in this game appeared on the podium–insurers, clinicians, pharmacy benefit managers, the finance industry, regulators, patent activists, think tanks, and of course lawyers–with one glaring exception: drug manufacturers. I’m sure these companies were invited. But the only biopharmaceutical firm to show up was Gilead Sciences, and the talk given by Amy Flood, senior vice president of public affairs, was not about normal drug development but about the company’s commendable efforts to disseminate an HIV drug through sub-Saharan Africa. Given the intense political, social, and geographic contention over AIDS, her inspiring story had little in the way of models and lessons to offer mainstream drug development. I will cover it later in the section “What non-profits can teach us.”

Failure by the vast bulk of the pharma industry to take up the sterling opportunity represented by this conference to present their point of view, to me, comes across as an admission of guilt. Why can’t they face questions from an educated public?

The oncoming sucker punch

A couple of days before the conference, Stat published a heart-warming human interest story about a six-year-old being treated successfully for a debilitating rare condition, Batten disease. Rather than giving in to genetic fate, the parents pulled together funding and doctors from around the country, pushed the experimental treatment through an extremely fast-track FDA approval, and saw positive results within a year.

The tears tend to dry from one’s eyes–or to flow for different reasons–when one reads the means used to achieve this miracle. The child’s mother is a marketing professional who raised nearly three million dollars through crowdfunding. An article in the November/December issue of MIT Tech Review describes six other families who raised money for personalized genetic treatments. Another article in the same issue–which is devoted to big data and genetic research in medicine–discusses personalized vaccines against cancers, while a third lays out the expenses of in vitro genetic testing. This is not a course of action open to poor, marginalized, uneducated people. Nor is such money likely to turn up for every orphan disease suffered somewhere in the world.

I hope that this six-year-old recovers. And I hope the three-million-dollar research produces advances in gene science that redound to the benefit of other sufferers. But we must all consider how much society can spend on the way to an envisioned utopia where cures are available to all for previously untreatable conditions. As conference speakers pointed out, genetic treatments assume an “N of 1” where each patient gets a unique regimen. This doesn’t scale at all, and certainly doesn’t fit the hoary old pharmaceutical paradigm of giving a monopoly over a treatment for a decade or so in exchange for low-cost generic imitations for all eternity afterward.

Yet government needs to keep funding biotech research, and creating a regulatory environment in which venture capitalists and other investors will fund the research. Joe Grogan of the Office of Management and Budget, keynoting at the conference, claimed that Germany used to have the pre-eminent biotech industry and let it shrivel up through poor policies. In the same way, biotech could leave the United States for some other country that proves welcoming, probably China.

Dueling models

Some panelists enthusiastically promoted what they openly and officially called Willingness To Pay (WTP) or “what the market will bear” pricing, but which I call “stick it to ’em” pricing. Others called for the price controls that are found in almost every developed country outside the U.S. Various schemes being promoted under the umbrella of “value-based pricing” were generally rejected, probably because they would allow the companies to inflate their prices. However, Jami Taylor of Stanton Park Capital suggested that modern data collection and analytics could support micropricing, matching payment to the outcome for each patient.

Interestingly, nobody believed that drug prices should reflect the costs of producing them. But everybody understood that drug producers must be adequately reimbursed. That is why people from many different perspectives came out in opposition to “charity” and “compassionate” discounts or rebates offered by many pharma companies, sometimes reaching 10% of their total expenditures. In a typical sequence of events, a company enjoying a breakthrough for a serious condition announces some enormous price in the tens or hundreds of thousands of dollars. After public outcry (or to ward off such outcry) they start awarding deep discounts or rebates.

Why are discounts and rebates poor policy? First, they bind the recipients to dependence on the company. This is why, according to Annette Gaudino of the Treatment Action Group, Médecins Sans Frontières rejected a donation from a manufacturer of a vaccine.

More subtly, high list prices set a bar for future prices. They allow the companies to jack up prices for brand-name drugs by double digits each year (as shown in a chart by Surya Singh of CVS Health) and to introduce new drugs at inflated prices–only to take off the edge through more discounts and rebates.

Grogan would like Europeans to pay higher prices, following the common perception that US consumers are subsidizing the rest of the world. But other speakers contended that Europeans offer fair compensation that can keep drug companies sustainable. A recent administration proposal to force manufacturers to match foreign drug prices seems to take the same attitude.

Aaron Kesselheim of Harvard Medical School participated in a study that demonstrated the robustness of European price controls in a clever manner. He and colleagues simply examined which drugs were withdrawn from the German market by manufacturers who didn’t want to undergo Germany’s rigorous price-setting regime, run by the Institute for Quality and Efficiency in Health Care (IQWiG). The 20% of drugs that were withdrawn were those demonstrated to be ineffective or to be no better than lower-priced alternatives.

Gaudino also tried to slay the opponents of price controls with an onslaught of statistics. She cited a JAMA study finding that bringing a cancer drug to market costs well under one billion dollars, less than half of the billions often cited. The non-profit Drugs for Neglected Diseases initiative (DNDi) can produce a new medicine for a total cost of just 110 to 170 million dollars. And the average profit for pharma companies has stayed level at around 20% for decades, far above most industries.

With all these endorsements for price controls, the shadow of possible negative effects on innovation hovers over them. In the next part of this article, I’ll examine technical advances that might lower costs.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 | Written By

Anne Zieger

The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which collects data on more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important part in the initial wave of AI application rollouts. The company is a leader in high-performance chipsets popular with players of processor-intensive games, technology it has recently applied to other processor-intensive work such as blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough excess IT talent available to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.

Healthcare AI Could Generate $150B In Savings By 2025

Posted on September 27, 2018 | Written By

Anne Zieger

Is the buzz around healthcare AI solutions largely hype, or can they deliver measurable benefits? Lest you think it’s too soon to tell, check out the following.

According to a new report from market analyst firm Frost & Sullivan, AI and cognitive computing will generate $150 billion in savings for the healthcare business by 2025. Frost researchers expect the total AI market to grow to $6.16 billion between 2018 and 2022.

The analyst firm estimates that at present, only 15% to 20% of payers, providers and pharmaceutical companies have been using AI actively to change healthcare delivery. However, its researchers seem to think that this will change rapidly over the next few years.

One of the most interesting applications for healthcare AI that Frost cites is the use of AI in precision medicine, an area which clearly has a tremendous upside potential for both patients and institutions.

In this scenario, the AI integrates a patient’s genomic, clinical, financial and behavioral data, then cross-references the data with the latest academic research evidence and regulatory guidelines. Ultimately, the AI would create personalized treatment pathways for high-risk, high-cost patient populations, according to Koustav Chatterjee, an industry analyst focused on transformational health.
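As a purely hypothetical sketch of the integration Chatterjee describes, one can imagine the patient's data sources reduced to features and scored against guideline-style rules to rank candidate treatments. Every field name, treatment name, and rule below is invented for illustration; a real system would draw its rules from the research evidence and regulatory guidelines mentioned above.

```python
# Toy illustration: combine genomic, clinical, financial, and behavioral
# data into one record, then score candidate treatments with simple
# guideline-style rules. All names and rules here are made up.
patient = {
    "genomic":    {"EGFR_mutation": True},
    "clinical":   {"stage": 3},
    "financial":  {"coverage": "medicare"},
    "behavioral": {"adherence_risk": "high"},
}

treatments = {
    "targeted_inhibitor": {"requires_mutation": "EGFR_mutation", "daily_oral": True},
    "standard_chemo":     {"requires_mutation": None, "daily_oral": False},
}

def score(patient: dict, tx: dict) -> int:
    s = 0
    req = tx["requires_mutation"]
    if req and patient["genomic"].get(req):
        s += 2   # rule: mutation-matched therapy preferred
    if tx["daily_oral"] and patient["behavioral"]["adherence_risk"] == "high":
        s -= 1   # rule: daily self-dosing is risky for low-adherence patients
    return s

ranked = sorted(treatments, key=lambda t: score(patient, treatments[t]), reverse=True)
print(ranked)
```

Even this toy version shows why the behavioral and financial dimensions matter: they can demote an otherwise genomically ideal option.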

In addition, researchers could use AI to expedite the process of clinical trial eligibility assessment and generate prophylaxis plans that suggest evidence-based drugs, Chatterjee suggests.

The report also lists several other AI-enabled solutions that might be worth implementing, including automated disease prediction, intuitive claims management and real-time supply chain management.

Frost predicts that the following will be particularly hot AI markets:

  • Using AI in imaging to drive differential diagnosis
  • Combining patient-generated data with academic research to generate personalized treatment possibilities
  • Performing clinical documentation improvement to reduce clinician and coder stress and reduce claims denials
  • Using AI-powered revenue cycle management platforms that auto-adjust claims content based on each payer’s coding and reimbursement criteria

Now, it’s worth noting that it may be a while before any of these potential applications become practical.

As we’ve noted elsewhere, getting rolling with an AI solution is likely to be tougher than it sounds for a number of reasons.

For example, integrating AI-based functions with providers’ clinical processes could be tricky, and what’s more, clinicians certainly won’t be happy if such integration disrupts the EHR workflow already in existence.

Another problem is that you can’t deploy an AI-based solution without ”training” it on a cache of existing data. While this shouldn’t be an issue, in theory, the reality is that much of the data providers generate is still difficult to filter and mine.

Not only that, while AI might generate interesting and effective solutions to clinical problems, it may not be clear how it arrived at that solution. Physicians are unlikely to trust clinical ideas that come from a black box, i.e., an opaque system that doesn’t explain itself.
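The transparency gap is easiest to see by contrast with a simple model that *can* explain itself. The sketch below uses an invented logistic risk model whose weights and features are illustrative only, not clinically derived; the point is that each input’s contribution to the score is directly inspectable, which is precisely what a black-box model lacks.

```python
# A transparent linear risk model: every feature's contribution to the
# score can be shown to a clinician. Weights are illustrative, not real.
import math

weights = {"age_over_65": 1.2, "smoker": 0.9, "prior_admission": 1.5}
bias = -3.0

def risk_score(features):
    # Logistic link over a weighted sum of binary risk factors.
    z = bias + sum(weights[k] * features[k] for k in weights)
    return 1 / (1 + math.exp(-z))

def explain(features):
    # Per-feature contribution to the linear term -- the "why" behind
    # the score, something an opaque deep model cannot readily provide.
    return {k: weights[k] * features[k] for k in weights}

patient = {"age_over_65": 1, "smoker": 1, "prior_admission": 1}
print(round(risk_score(patient), 3))  # -> 0.646
print(explain(patient))
```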

Don’t get me wrong, I’m a huge fan of healthcare AI and excited by its power. One can argue over which solutions are the most practical, and whether AI is the best possible tool to solve a given problem, but most health IT pros seem to believe that there’s a lot of potential here.

However, it’s still far from clear how healthcare AI applications will evolve. Let’s see where they turn up next and how that works out.

A Whole New Way of Being Old: Book Review of The New Mobile Age

Posted on March 15, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The recently released overview of health care for the aging by Dr. Joseph Kvedar and his collaborators, The New Mobile Age: How Technology Will Extend the Healthspan and Optimize the Lifespan, is aimed at a wide audience of people who can potentially benefit: health care professionals and those who manage their clinics and hospitals, technologists interested in succeeding in this field, and policy makers. Your reaction to this book may depend on how well you have asserted the impact of your prefrontal cortex over your amygdala before reading the text–if your mood is calm you can see numerous possibilities and bright spots, whereas if you’re agitated you will latch onto the hefty barriers in the way.

Kvedar highlights, as foremost among the culture changes needed to handle aging well, a view of aging as a positive and productive stage of life. Second to that come design challenges: technologists must make devices and computer interfaces that handle affect, adapt smoothly to different individuals and their attitudes, and ultimately know both when to intervene and how to present healthy options. As an example, Chapter 8 presents two types of robots, one of which was accepted more by patients when it was “serious” and the other when it was “playful.” The nuances of interface design are bewildering.

The logical argument in The New Mobile Age proceeds somewhat like this:

  1. Wholesome and satisfying aging is possible, but, particularly where chronic conditions are involved, it requires maintaining a healthful and balanced lifestyle, not just fixing disease.

  2. Support for health, particularly in old age, thus involves public health and socio-economic issues such as food, exercise, and especially social contacts.

  3. Each person requires tailored interventions, because his or her needs and desires are unique.

  4. Connected technology can help, but must adapt to the conditions and needs of the individual.

The challenges of health care technology emerged in my mind, during the reading of this book, as a whole new stage of design. Suppose we broadly and crudely characterize the first 35 years of computer design as number-crunching, and the next 35 years–after the spread of the personal computer–as one of augmenting human intellect (a phrase popularized by pioneer Douglas Engelbart).

We have recently entered a new era where computers use artificial intelligence for decision-making and predictions, going beyond what humans can anticipate or understand. (For instance, when I pulled up The New Mobile Age on Amazon.com, why did it suggest I check out a book about business and technology that I have already read, Machine, Platform, Crowd? There is probably no human at Amazon.com or elsewhere who could explain the algorithm that made the connection.)

So I am suggesting that an equally momentous shift will be required to fulfill Kvedar’s mandate. In addition to the previous tasks of number-crunching, augmenting human intellect, and predictive analytics, computers will need to integrate with human life in incredibly supple, subtle ways.

The task reminds me of self-driving cars, which business and tech observers assure us will replace human drivers in a foreseeable time span. As I write this paragraph, snow from a nor’easter is furiously swirling through the air. It is hard to imagine that any intelligence, whether human, AI, or alien, can safely navigate a car in that mess. Self-driving cars won’t catch on until computers can instantly handle real-world conditions perfectly–and that applies to technology for the aging too.

This challenge applies to physical services as well as emotional ones. For instance, Kvedar suggests in Chapter 8 that a robot could lift a person from a bed to a wheelchair. That’s obviously riskier and more nuanced than carting goods around a warehouse. And that robot is supposed to provide encouragement, bolster the spirits of the patient, and guide the patient toward healthful behavior as well.

Although I have no illusions about the difficulty of the tasks set before computers in health care, I believe the technologies offer enormous potential and cheer on the examples provided by Kvedar in his book. It’s important to note that the authors, while delineating the different aspects of conveying care to the aging, always start with a problem and a context, taking the interests of the individual into account, and then move to the technical parts of the solution.

Therefore, Kvedar brings us face to face with issues we cannot shut our eyes to, such as the widening gap between the increasing number of elderly people in the world and the decreasing number of young people who can care for them or pay for such care. A number of other themes appear that will be familiar to people following the health care field: the dominance of lifestyle-related chronic conditions among our diseases, the clunkiness and unfriendliness of most health-related systems (most notoriously the electronic health record systems used by doctors), the importance of understanding the impact of behavior and phenotypical data on health, but also the promise of genetic sequencing, and the importance of respecting the dignity and privacy of the people whose behavior we want to change.

And that last point applies to many aspects of accommodating diverse populations. Although this book is about the elderly, it’s not only they who are easily infantilized, dismissed, ignored, or treated inappropriately in the health care system: the same goes for the mentally ill, the disabled, LGBTQ people, youth, and many other types of patients.

The New Mobile Age highlights exemplary efforts by companies and agencies to use technology to meet the human needs of the aging. Kvedar’s own funder, Partners Healthcare, can afford to push innovation in this area because it is the dominant health care provider in the Boston area (where I live) and is flush with cash. When will every institution do these same things? The New Mobile Age helps to explain what we need in order to get to that point.

Federal Advisors Say Yes, AI Can Change Healthcare

Posted on January 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

The use of AI in healthcare has been the subject of scores of articles and endless debate among industry professionals over its benefits. The fragile consensus seems to be that while AI certainly has the potential to accomplish great things, it’s not ready for prime time.

That being said, some well-informed healthcare observers disagree. In an ONC blog post, a collection of thought leaders from the agency, AHRQ and the Robert Wood Johnson Foundation believe that over the long-term, AI could play an important role in the future of healthcare.

The group of institutions asked JASON, an independent group of scientists and academics who advise the federal government on science and technology issues, to look at AI’s potential. JASON’s job was to look at the technical capabilities, limitations and applications for AI in healthcare over the next 10 years.

In its report, JASON concluded that AI has broad potential for sparking significant advances in the industry and that the time may be right for using AI in healthcare settings.

Why is now a good time to apply AI in healthcare? JASON offers a list of reasons, including:

  • Frustration with existing medical systems
  • Universal use of networked smart devices by the public
  • Acceptance of at-home services provided by companies like Amazon

But there’s more to consider. While the above conditions are necessary, they’re not enough to support an AI revolution in healthcare on their own, the researchers say. “Without access to high-quality, reliable data, the promise of AI will not be realized,” JASON’s report concludes.

The report notes that while we have access to a flood of digital health data which could fuel clinical applications, it will be important to address the quality of that data. There are also questions about how health data can be integrated into new tools. In addition, it will be important to make sure the data is accessible, and that data repositories maintain patient privacy and are protected by strong security measures, the group warns.

Going forward, JASON recommends the following steps to support AI applications:

  • Capturing health data from smartphones
  • Integrating social and environmental factors into the data mix
  • Supporting AI technology development competitions

According to the blog post, ONC and AHRQ plan to work with other agencies within HHS to identify opportunities. For example, the FDA is likely to look at ways to use AI to improve biomedical research, medical care and outcomes, as well as how it could support emerging technologies focused on precision medicine.

And in the future, the possibilities are even more exciting. If JASON is right, the more researchers study AI applications, the more worthwhile options they’ll find.

UPMC Sells Oncology Analytics Firm To Elsevier

Posted on January 22, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

Using analytics tools to improve cancer treatment can be very hard. That struggle is exemplified by the problems faced by IBM Watson Health, which dove into the oncology analytics field a few years ago but made virtually no progress in improving cancer treatment.

With any luck, however, Via Oncology will be more successful at moving the needle in cancer care. The company, which offers decision support for cancer treatment and best practices in cancer care management, was just acquired by information analytics firm Elsevier, which plans to leverage the company’s technology to support its healthcare business.

Elsevier’s Clinical Solutions group works to improve patient outcomes, reduce clinical errors and optimize cost and reimbursements for providers. Via Oncology, a former subsidiary of the University of Pittsburgh Medical Center, develops and implements clinical pathways for cancer care. Via Oncology spent more than 15 years as part of UPMC prior to the acquisition.

Via Oncology’s Via Pathways tool relies on evidence-based content to create clinical algorithms covering 95% of cancer types treated in the US. The content was developed by oncologists. In addition to serving as a basis for algorithm development, Via Oncology also shares the content with physicians and their staff through its Via Portal, a decision support tool which integrates with provider EMRs.

According to Elsevier, Via Pathways covers more than 2,000 unique patient presentations with clinical algorithms and recommendations for all major aspects of cancer care. The system can also offer nurse triage and symptom tracking, cost information analytics, quality reporting and medical home tools for cancer centers.
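A clinical pathway of this kind is, at heart, a decision tree: each node asks a question about the patient presentation and branches until a recommendation is reached. The sketch below shows one way such a tree might be represented; the structure and the simplified oncology content are stand-ins for illustration, not actual Via Pathways data.

```python
# Hedged sketch of a clinical pathway as a decision tree. Internal nodes
# hold a question about the presentation; leaves hold a recommendation.
# Content is simplified and illustrative only.

pathway = {
    "question": "stage",
    "branches": {
        "early": {"recommend": "surgery + adjuvant therapy"},
        "advanced": {
            "question": "biomarker_positive",
            "branches": {
                True: {"recommend": "targeted therapy"},
                False: {"recommend": "systemic chemotherapy"},
            },
        },
    },
}

def traverse(node, presentation):
    # Walk the tree, answering each question from the presentation,
    # until a leaf recommendation is reached.
    while "recommend" not in node:
        answer = presentation[node["question"]]
        node = node["branches"][answer]
    return node["recommend"]

print(traverse(pathway, {"stage": "advanced", "biomarker_positive": True}))
# -> targeted therapy
```

The collaborative model KLAS-style reports describe would have many cancer centers refining trees like this against the evidence base as they use them.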

According to the prepared statement issued by Elsevier, UPMC will continue to be a Via Oncology customer, which makes it clear that the healthcare giant wasn’t dumping its subsidiary or selling it at a fire-sale price.

That’s probably because in addition to UPMC, more than 1,500 oncology providers in community, hospital and academic settings hold Via Pathways licenses. What makes this model particularly neat is that these cancer centers are working collaboratively to improve the product as they use it. Too few specialty treatment professionals work together this effectively, so it’s good to see Via Oncology leveraging user knowledge this way.

While most of this seems clear, I was left with the question of what role, if any, genomics plays in Via Oncology’s strategy. While it may be working with such technologies behind the scenes, the company didn’t mention any such initiatives in its publicly-available information.

This approach seems to fly in the face of existing trends and in particular, physician expectations. For example, a recent survey of oncologists by medical publication Medscape found that 71% of respondents felt genomic testing was either very important or extremely important to their field.

However, Via Oncology may have something up its sleeve and is waiting for it to be mature before it dives into the genomics pool. We’ll just have to see what it does as part of Elsevier.

Are there other areas beyond cancer where a similar approach could be taken?

Doctors, Data, Diagnoses, and Discussions: Achieving Successful and Sustainable Personalized/Precision Medicine

Posted on January 10, 2018 | Written By

The following is a guest blog post by Drew Furst, M.D., Vice President Clinical Consultants at Elsevier Clinical Solutions.

Personalized/precision medicine is a growing field and that trend shows no sign of slowing down.

In fact, a 2016 Grand View Research report estimated the global personalized medicine market was worth $1,007.88 billion in 2014, with projected growth to reach $2,452.50 billion by 2022.

As these areas of medicine become more commonplace, understanding the interactions between biological factors with a range of personal, environmental and social impacts on health is a vital step towards achieving sustainable success.

A better understanding begins with answering important questions such as whether the focus should be precision population medicine (based on disease) or precision patient-specific medicine (based on the individual).

Specificity in terminology is needed. The traditional term of “personalized medicine” has evolved into the term “precision medicine,” but this new usage requires a more detailed look into the precise science of genetic, environmental and lifestyle factors that influence any approach to treatment.

Comprehending these interactions can provide insight into what succeeds, and we’ve learned that some areas of precision medicine are more effective than others.

Through pharmacogenomics – the study of how a patient’s genetic make-up affects the response to a particular drug – we have identified key enzymes in cancer formation and cancer treatment, which aids in the customization of drugs.

Research shows us that drug-metabolizing enzyme activity is one of many factors that impact a patient’s response to medication. We also know that human cytochrome P450 (CYP) plays an important role in the metabolism of drugs and environmental chemicals.
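In practice, pharmacogenomic decision support often boils down to a lookup: translate a patient’s genotype for a drug-metabolizing enzyme such as CYP2D6 into a metabolizer phenotype that informs dosing. The sketch below follows the common convention of assigning a function to each star allele, but the table is deliberately abbreviated and illustrative; it is not a clinical reference.

```python
# Simplified pharmacogenomic lookup: map a CYP2D6 diplotype to a
# metabolizer phenotype. Allele-function assignments follow the usual
# convention, but this table is abbreviated and for illustration only.

ALLELE_FUNCTION = {
    "*1": "normal",     # fully functional allele
    "*4": "none",       # non-functional allele
    "*10": "decreased", # reduced-function allele
}

def metabolizer_phenotype(allele1, allele2):
    funcs = sorted([ALLELE_FUNCTION[allele1], ALLELE_FUNCTION[allele2]])
    if funcs == ["none", "none"]:
        return "poor metabolizer"
    if funcs == ["normal", "normal"]:
        return "normal metabolizer"
    # Any mixed-function combination is lumped together here (a real
    # system scores allele activity in finer gradations).
    return "intermediate metabolizer"

print(metabolizer_phenotype("*4", "*4"))   # -> poor metabolizer
print(metabolizer_phenotype("*1", "*10"))  # -> intermediate metabolizer
```

A result like “poor metabolizer” is what ultimately drives the drug selection and dose customization described above.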

Therapies that incorporate drug-specific pharmacogenomics are a boon to oncology treatments and a vast improvement over the “shotgun therapy” approach of the past. Today, treatments can be targeted to enzymes and receptors that vary from person to person.

In traditional chemotherapy, a drug developed to kill rapidly growing cancer cells will indiscriminately target other rapidly growing cells such as hair cells, hence the often-observed hair loss. However, a targeted drug and delivery method aimed at only the receptive cells can be a much more effective approach and treatment, while minimizing collateral damage.

Recently, the journal Nature published a study showing the promise this method holds.  In the pilot study, scientists led by Dr. Catherine Wu of Dana-Farber Cancer Institute in Boston gave six melanoma patients an experimental, custom-made vaccine and, two years later, all were tumor-free following treatment.

Looking Beyond Genetics

Precision medicine needs to include more than just genetics.

Factors such as environment and socio-economic status also must be included when approaching disease states and we must undertake a comprehensive overview of a patient’s situation, including, but not limited to, family history.

Cultural dietary traditions can play into disease susceptibility. As an example, the frequent consumption of smoked fish in some Asian cultures increases the risk of gastric (stomach) cancers. Lower socioeconomic status can force acceptance of substandard and overcrowded housing, with increased risk of illnesses ranging from lead toxicity and asbestosis to Hantavirus, to name just a few.

A patient with a genetic propensity for lung cancer who also smokes cigarettes and has high radon levels in their home is increasing the odds of developing disease due to these combined genetic, behavioral, and environmental factors.
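One crude way to see how such factors compound is to multiply relative risks under an independence assumption. The numbers below are invented for illustration and real risk factors rarely combine this cleanly, but the sketch conveys why a genetic propensity plus smoking plus radon exposure is so much worse than any one factor alone.

```python
# Toy illustration of compounding risk: under a naive independence
# assumption, relative risks multiply. All numbers are invented for
# illustration, not epidemiological data.

baseline_risk = 0.01  # hypothetical baseline lifetime risk

relative_risks = {
    "genetic_propensity": 2.0,   # genetic factor
    "smoking": 3.0,              # behavioral factor
    "home_radon_exposure": 1.5,  # environmental factor
}

combined = baseline_risk
for rr in relative_risks.values():
    combined *= rr

print(f"{combined:.3f}")  # 0.01 * 2.0 * 3.0 * 1.5 -> 0.090
```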

Patient-derived Data and the Diagnosis

In addition to the information now available through state-of-the-art medical testing, patient-derived information from wearables, biometrics, and direct-to-consumer health testing kits, presents patients and physicians alike with new opportunities and challenges.

Armed with newly discovered health data, patients may present it to their doctors with a request that it be included in their health record. Many patients expect an interpretation of that data when they visit their doctor and an explanation of what it means for their present (and future) healthcare.

Doctors can be overwhelmed when unfiltered information is thrown at them. Many are not prepared for it, and research has yet to offer definitive support for interpreting patient-derived data.

Studying hereditary traits can offer some insights from generation to generation. By delving into the genomics of individual patients, we get a clearer picture of a person’s risk factor for a certain disease, but often this information provides no immediate solutions. Discovering a genetic indicator for Alzheimer’s may reflect a higher propensity for the disease, but symptoms may be decades away, if they appear at all.

Pitfalls and Possibilities

There are many concerns about genomic data collection, one of which is whether policies can keep pace with patient privacy and the related ethical questions that inevitably ensue. These questions are consistently surfacing and there is no clear direction on the best course of action.

Clearer policies are needed to delineate who has access to a patient’s genetic records and whether third parties, such as health or life insurance companies, can deny coverage or care based on genomics.

In addition, one cannot ignore the psychological burden associated with knowing your “potential” for a disease, based solely on your genetic testing, when it may never come to fruition. Not to mention its effect on planning one’s future decisions relative to career, residence, and relationship commitments.

Even some physicians are reticent to undergo genetic testing for fear of who might gain access to the information and the consequences thereof.

Physicians face an additional conundrum in dealing with patient-supplied information: how to counsel patients when, in some cases, the task should be the responsibility of a community resources representative? In addition, patients who request that certain information not be included in their personal health record present a problem for a physician justifying a test or a procedure to a payer.

The consumerization of healthcare and patient engagement strategies employed to deliver better outcomes are driving the healthcare industry to open conversations that elevate the level of care delivered to patients. In addition, physicians need to demand more direction and initiate more discussions on how to deal with the opportunities and challenges presented in the era of patient-derived and pharmacogenomics data.

While improving patient-physician communication should always be a priority, discussing how and when to use genetic and patient-derived information is still a work in progress.

Dr. Furst is Vice President Clinical Consultants at Elsevier Clinical Solutions.

Key Articles in Health IT from 2017 (Part 2 of 2)

Posted on January 4, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider.

The first part of this article set a general context for health IT in 2017 and started through the year with a review of interesting articles and studies. We’ll finish the review here.

A thoughtful article suggests a positive approach toward health care quality. The author stresses the value of organic change, although using data for accountability has value too.

An article extolling digital payments actually said more about the out-of-control complexity of the US reimbursement system. It may or may not be coincidental that the article appeared one day after the CommonWell Health Alliance announced an API whose main purpose seems to be to facilitate payment and other data exchanges related to law and regulation.

A survey by KLAS asked health care providers what they want in connected apps. Most apps currently just display data from a health record.

A controlled study revived the concept of Health Information Exchanges as stand-alone institutions, examining the effects of emergency departments using one HIE in New York State.

In contrast to many leaders in the new Administration, Dr. Donald Rucker received positive comments upon acceding to the position of National Coordinator. More alarm was raised about the appointment of Scott Gottlieb as head of the FDA, but a later assessment gave him high marks for his first few months.

Before Dr. Gottlieb got there, the FDA was already loosening up. The 21st Century Cures Act instructed it to keep its hands off many health-related digital technologies. After kneecapping consumer access to genetic testing and then allowing it back into the ring in 2015, the FDA advanced consumer genetics another step this year with approval for 23andMe tests about risks for seven diseases. A close look at another DNA site’s privacy policy, meanwhile, warns that their use of data exploits loopholes in the laws and could end up hurting consumers. Another critique of the Genetic Information Nondiscrimination Act has been written by Dr. Deborah Peel of Patient Privacy Rights.

Little noticed was a bill authorizing the FDA to be more flexible in its regulation of digital apps. Shortly after, the FDA announced its principles for approving digital apps, stressing good software development practices over clinical trials.

No improvement has been seen in the regard clinicians have for electronic records. Subjective reports condemned the notorious number of clicks required. A study showed they spend as much time on computer work as they do seeing patients. Another study found the ratio to be even worse. Shoving the job onto scribes may introduce inaccuracies.

The time spent might actually pay off if the resulting data could generate new treatments, increase personalized care, and lower costs. But the analytics that are critical to these advances have stumbled in health care institutions, in large part because of the perennial barrier of interoperability. Still, analytics are showing scattered successes across a range of uses.

Deloitte published a guide to implementing health care analytics. And finally, a clarion signal that analytics in health care has arrived: WIRED covers it.

A government cybersecurity report warns that health technology will likely soon contribute to the stream of breaches in health care.

Dr. Joseph Kvedar identified fruitful areas for applying digital technology to clinical research.

The Government Accountability Office, terror of many US bureaucracies, came out with a report criticizing the sloppiness of quality measures at the VA.

A report by leaders of the SMART platform listed barriers to interoperability and the use of analytics to change health care.

To improve the poorer outcomes seen in marginalized communities, the NIH is recruiting people from those populations to trust the government with their health data. A policy analyst calls on digital health companies to diversify their staff as well. Google’s parent company, Alphabet, is also getting into the act.

Specific technologies

Digital apps are part of most modern health efforts, of course. A few articles focused on the apps themselves. One study found that digital apps can help treat depression. Another found that an app can help manage ADHD.

Lots of intriguing devices are being developed.

Remote monitoring and telehealth have also been in the news.

Natural language processing and voice interfaces are becoming a critical part of spreading health care.

Facial recognition is another potentially useful technology. It can replace passwords or devices to enable quick access to medical records.

Virtual reality and augmented reality seem to have some limited applications to health care. They are useful foremost in education, but also for pain management, physical therapy, and relaxation.

A number of articles hold out the tantalizing promise that interoperability headaches can be cured through blockchain, the newest hot application of cryptography. But one analysis warned that blockchain will be difficult and expensive to adopt.

3D printing can be used to produce models for training purposes as well as surgical tools and implants customized to the patient.

A number of other interesting companies in digital health can be found in a Fortune article.

We’ll end the year with a news item similar to one that began the article: serious good news about the ability of Accountable Care Organizations (ACOs) to save money. I would also like to mention three major articles of my own.

I hope this review of the year’s articles and studies in health IT has helped you recall key advances or challenges, and perhaps flagged some valuable topics for you to follow. 2018 will continue to be a year of adjustment to new reimbursement realities touched off by the tax bill, so health IT may once again languish somewhat.