
Let’s Keep Genetic Information an Individual Affair

Posted on November 28, 2018 | Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

These times train us to continually seek more data and more transparency, always assuming that more is better. But some types of data and transparency bring risks, because “A little learning is a dangerous thing.” In particular, sharing genetic information with family members raises daunting ethical issues, along with the need for a mature understanding of consequences, as illustrated by a court case from the UK recently reported in The Guardian.

Superficially, this case seems to be a simple balancing act over how far a doctor’s responsibility extends in fulfilling a family member’s right to know. But in the context of our modern scientific knowledge about genetics–and the limitations of that knowledge–far greater concerns emerge.

In this case, a man in the latter stages of life was diagnosed with Huntington’s disease. His daughter was pregnant at the time. After the baby was born and the man died, his daughter sued the doctor because the doctor had failed to inform her of her father’s diagnosis. She claimed that she would have aborted the fetus had she known. She, in fact, was later diagnosed with the dreaded condition, meaning that her daughter has a 50% chance of developing the condition too.

The Guardian reports what I consider a proper and satisfactory resolution to the case–upholding the doctor’s right to suppress the information and denying the mother a victory–but the newspaper reports that the case was disturbing because it proceeded so far, leaving an opening for similar cases in the future. The details reported in the article about this case make the mother’s argument even weaker: it turns out that the doctor asked his patient whether he’d like his daughter to be notified. And the patient, knowing her intentions full well, told the doctor he wanted the baby to be born and insisted that his daughter not be told.

Huntington’s is certainly a heart-rending disease, both for the patient and the family. I can understand why a woman would want to prevent her child from suffering from Huntington’s–certainly a very difficult choice to make–but note that the fetus’s odds of getting the disease were only 50%, a facet of the discussion that will return as I examine the ethics of genetic counseling. The overarching point is that here we have a highly ethical doctor who quite properly left the decision in the hands of the patient and respected the patient’s privacy.

What are some of the subtler considerations about genetics that didn’t make it into The Guardian article?

Trouble setting thresholds

In our Huntington’s disease case, the mother declared that a 50% chance of contracting such a horrible condition would be enough to prevent her from having children, and even enough to drive her to abort the fetus she already had. But if we required doctors to report diseases to relatives, how serious should the risk be? Should they report a tendency to baldness or something else non-threatening? How threatening should the condition be? And how likely should it be? If the incidence of the disease is 1% in the general population, should they be forced to report a 2% chance? Or 20%? 50%? Such questions become even murkier when statistics are based on diagnosis of a family member instead of personal genetic testing.
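To make the ambiguity concrete, here is a minimal sketch of what a notification rule would have to look like if regulators ever tried to codify one. Everything in it is an assumption for the sake of discussion: the threshold policy, the numbers, and the function itself are invented, not drawn from any real guideline.

```python
# Hypothetical sketch: a codified "duty to notify" would need an explicit
# threshold. The policy here (notify when a relative's risk exceeds some
# multiple of the general-population baseline) and all numbers are invented.

def should_notify(relative_risk: float, baseline_incidence: float,
                  threshold_ratio: float = 5.0) -> bool:
    """Notify relatives only when their estimated risk exceeds the
    population baseline by the policy-defined multiple."""
    return relative_risk >= baseline_incidence * threshold_ratio

# A 2% risk against a 1% baseline doesn't clear a 5x threshold...
print(should_notify(0.02, 0.01))  # False
# ...but a Huntington's-style 50% risk clearly does.
print(should_notify(0.50, 0.01))  # True
```

Even this toy version exposes the problem: someone still has to pick the threshold, and no single number can weigh severity, treatability, and the reliability of the estimate all at once.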

Unknown variability

Many conditions are affected by other genes that individuals may or may not possess, along with lifestyle choices and other environmental factors. Statistics collected over a population of a few thousand patients may suggest that someone’s child has a 50% chance of inheriting a disease, but depending on one’s genetic make-up, the chance may be 5% for one person and 95% for another. Under such conditions, is it truthful and meaningful to tell a person he has a 50% risk?
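The arithmetic behind this point is simple enough to sketch. In the toy example below, the subgroup sizes and risks are invented, but they show how a published “50% risk” can be a population average that applies to almost no individual:

```python
# Invented example: two genetic subgroups of equal size with very
# different individual risks average out to a misleading "50%" figure.
subgroups = [
    {"share": 0.5, "risk_pct": 5},   # e.g. carriers of a protective variant
    {"share": 0.5, "risk_pct": 95},  # e.g. carriers of an aggravating variant
]

population_risk_pct = sum(g["share"] * g["risk_pct"] for g in subgroups)
print(population_risk_pct)  # 50.0 -- yet no individual actually faces a 50% risk
```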

Need for counseling

We’ve seen that it’s hard to draw meaningful conclusions from most genetic results, and even harder to chart a rational course of action. Furthermore, learning that one has heightened risk for any medical condition causes the eruption of strong emotions that must be handled in a professional setting. Suppose someone has Huntington’s disease and 50 family members are potentially affected. Does the doctor have a responsibility, not only to notify all 50 family members, but to provide counseling for them as well?

In short, we can’t drop crude regulations onto clinical staff in an environment of such ambiguity and variety. The best ethical guideline we have is the classic one: respecting the patient’s privacy. Each patient can determine whom to tell and how much to say to each person about his condition.

I think that genetic notification should be on a “pull,” not “push,” basis. A person who wants to check for genetic risks can be tested. An expectant mother can ask her parents to please let her know of any changes to their health that could have a genetic impact on her. Certain types of notification may require counseling to help a member of the general public understand her risks and options. Reasonable fees would apply to all of these services.

The problem of predicting risk goes way beyond genetic conditions. We know that insurers, financial institutions, and others comb through publicly available information about us in order to make major decisions affecting our lives, such as whether we can get a mortgage. Just as we need guidance to make health decisions, these institutions should be held to high standards for fairness and for respecting individual dignity. We’ve seen that a father has the right to withhold genetic information from his own daughter–how much more right do we all have to keep our sensitive conditions secret from commercial institutions? When data increasingly underlies our own decisions as well as decisions made about us by others, transparency has to vie with other ethical considerations.

Providers Tell KLAS That Existing EMRs Can’t Handle Genomic Medicine

Posted on November 26, 2018 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

Providers are still in the early stages of applying genomics to patient care. However, at least among providers that can afford the investment, clinical genomics programs are beginning to become far more common, and as a result, we’re beginning to get a sense of what’s involved.

Apparently, one of those things might be creating a new IT infrastructure which bypasses the provider’s existing EMR to support genomics data management.

KLAS recently spoke with a number of providers about the vendors and technologies they were using to implement precision medicine. Along the way, its researchers were able to gather some information on best practices that other providers can use to roll out their own programs.

In its report, “Precision Medicine Provider Validations 2018,” KLAS researchers assert that while precision medicine tools have become increasingly common in oncology, they can be useful in many other settings as well.

Which vendors they should consider depends on what their organization’s precision medicine objectives are, according to one VP interviewed by the research firm. “Organizations need to consider whether they want to target a specific area or expand the solutions holistically,” the VP said. “They [also] need to consider whether they will have transactional relationships with vendors or strategic partnerships.”

Another provider executive suggests that investing in specialty technology might be a good idea. “Precision medicine should really exist outside of EMRs,” one provider president/CEO told KLAS. “We should just use software that comes organically with precision medicine and then integrated with an EMR later.”

At the same time, however, don’t expect any vendor to offer you everything you need for precision medicine, a CMO advised. “We can’t build a one-size-fits-all solution because it becomes reduced to meaninglessness,” the CMO told KLAS. “A hospital CEO thinks about different things than an oncologist.”

Be prepared for a complicated data sharing and standardization process. “We are trying to standardize the genomics data on many different people in our organization so that we can speak a common language and archive data in a common system,” another CMO noted.

At the same time, though, make sure you gather plenty of clinical data with an eye to the future, suggests one clinical researcher. “There are always new drugs and new targets, and if we can’t test patients for them now, we won’t catch things later,” the researcher said.

Finally, and this will be a big surprise, brace yourself for massive data storage demands. “Every year, I have to go back to our IT group and tell them that I need another 400 terabytes,” one LIS manager told the research firm. “When we are starting to deal with 400 terabytes here and 400 terabytes there, we’re looking at potentially petabytes of storage after a very short period of time.”
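The arithmetic in that quote is worth spelling out. Treating each 400-terabyte request as an annual event (an assumption; the quote only says “every year”) gives a feel for how fast the totals compound:

```python
# Back-of-the-envelope sketch of the storage growth described above.
# 400 TB per request comes from the quote; one request per year and the
# five-year horizon are assumptions. 1 PB = 1,000 TB (decimal units).
TB_PER_REQUEST = 400

total_tb = 0
for year in range(1, 6):
    total_tb += TB_PER_REQUEST
    print(f"Year {year}: {total_tb} TB ({total_tb / 1000:.1f} PB)")
```

By year three the program is already past a petabyte, which matches the manager’s “very short period of time.”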

If you’re like me, the suggestion that providers need to build a separate infrastructure outside the EMR to create a precision medicine program is pretty surprising, but it seems to be the consensus that this is the case. Almost three-quarters of providers interviewed by KLAS said they don’t believe that their EMR will have a primary role in the future of precision medicine, with many suggesting that, as a result, their EMR vendor won’t have a viable offering in this space going forward.

I doubt that this will be an issue in the near term, as the barriers to creating a genomics program are high, especially the capital requirements. However, if I were Epic or Cerner, I’d take this warning seriously. While I doubt that every provider will manage their own genomics program directly, precision medicine will be part of all care at some point and is already having an influence on how a growing number of conditions are treated.

Scripps Research Translational Institute Partners To Develop AI Applications

Posted on November 2, 2018 | Written By Anne Zieger

The Scripps Research Translational Institute has agreed to work with graphics processing unit-maker NVIDIA to support the development of AI applications. The partners plan to forge AI and deep learning best practices, tools and infrastructure tailored to supporting the AI application development process.

In collaboration with NVIDIA, Scripps will establish a center of excellence for artificial intelligence in genomics and digital sensors. According to Dr. Eric Topol, the Institute’s founder and director, AI should eventually improve accuracy, efficiency, and workflow in medical practices. This is especially true of the data inputs from sensors and sequencing, he said in an NVIDIA blog item on the subject.

Scripps is already a member of a unique data-driven effort known as the “All of Us Research Program,” which is led by the National Institutes of Health. This program, which aims to collect data from more than 1 million US participants, looks at the intersection of biology, genetics, environment, data science, and computation. If successful, this research will expand the range of conditions that can be treated using precision medicine techniques.

NVIDIA, for its part, is positioned to play an important role in the initial wave of AI application rollouts. The company is a leader in producing high-performance graphics chipsets popular for processor-intensive gaming, and it has recently applied that technology to other processor-intensive workloads such as blockchain. It now hopes its technology will form the core of systems designed to crunch the high volumes of data used in AI projects.

If NVIDIA can provide hardware that makes high-volume number-crunching less expensive and more efficient, it could establish an early lead in what is likely to be a very lucrative market. Given its focus on graphics processing, the hardware giant could be especially well-suited to dominate rapidly-emerging radiology AI applications.

We can certainly expect to see more partnerships like this fall into place over the next year or two. Few if any IT vendors have enough scientific expertise in-house to make important gains in biotech AI, and few providers have enough excess IT talent available to leverage discoveries and data in this arena.

It will be interesting to see what AI applications development approaches emerge from such partnerships. Right now, much AI development and integration is being done on a one-off basis, but it’s likely these projects will become more systematized soon.

Healthcare AI Could Generate $150B In Savings By 2025

Posted on September 27, 2018 | Written By Anne Zieger

Is the buzz around healthcare AI solutions largely hype, or can they deliver measurable benefits? Lest you think it’s too soon to tell, check out the following.

According to a new report from market analyst firm Frost & Sullivan, AI and cognitive computing will generate $150 billion in savings for the healthcare business by 2025.  Frost researchers expect the total AI market to grow to $6.16 billion between 2018 and 2022.

The analyst firm estimates that at present, only 15% to 20% of payers, providers and pharmaceutical companies have been using AI actively to change healthcare delivery. However, its researchers seem to think that this will change rapidly over the next few years.

One of the most interesting applications for healthcare AI that Frost cites is the use of AI in precision medicine, an area which clearly has a tremendous upside potential for both patients and institutions.

In this scenario, the AI integrates a patient’s genomic, clinical, financial and behavioral data, then cross-references the data with the latest academic research evidence and regulatory guidelines. Ultimately, the AI would create personalized treatment pathways for high-risk, high-cost patient populations, according to Koustav Chatterjee, an industry analyst focused on transformational health.
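A heavily simplified sketch of the integration step Chatterjee describes might look like the following. The field names, data values, and the toy risk rule are all invented placeholders for illustration, not anything drawn from the Frost report:

```python
# Invented illustration of merging per-patient data sources before risk
# scoring. Real precision medicine pipelines are vastly more complex.
genomic    = {"patient_id": 1, "high_risk_variant": True}
clinical   = {"patient_id": 1, "prior_admissions": 3}
financial  = {"patient_id": 1, "annual_cost": 48_000}
behavioral = {"patient_id": 1, "smoker": False}

# Cross-reference the sources into one patient profile
profile = {**genomic, **clinical, **financial, **behavioral}

# Toy stand-in for the real model: flag high-risk, high-cost patients
flagged = profile["high_risk_variant"] and profile["annual_cost"] > 25_000
print(flagged)  # True
```

The point of the sketch is only that the integration happens before any pathway recommendation: the model cannot personalize treatment until the separate data streams are joined into one view of the patient.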

In addition, researchers could use AI to expedite the process of clinical trial eligibility assessment and generate prophylaxis plans that suggest evidence-based drugs, Chatterjee suggests.

The report also lists several other AI-enabled solutions that might be worth implementing, including automated disease prediction, intuitive claims management and real-time supply chain management.

Frost predicts that the following will be particularly hot AI markets:

  • Using AI in imaging to drive differential diagnosis
  • Combining patient-generated data with academic research to generate personalized treatment possibilities
  • Performing clinical documentation improvement to reduce clinician and coder stress and reduce claims denials
  • Using AI-powered revenue cycle management platforms that auto-adjust claims content based on payers’ coding and reimbursement criteria

Now, it’s worth noting that it may be a while before any of these potential applications become practical.

As we’ve noted elsewhere, getting rolling with an AI solution is likely to be tougher than it sounds for a number of reasons.

For example, integrating AI-based functions with providers’ clinical processes could be tricky, and what’s more, clinicians certainly won’t be happy if such integration disrupts the EHR workflow already in existence.

Another problem is that you can’t deploy an AI-based solution without “training” it on a cache of existing data. While this shouldn’t be an issue in theory, the reality is that much of the data providers generate is still difficult to filter and mine.

Not only that, while AI might generate interesting and effective solutions to clinical problems, it may not be clear how it arrived at the solution. Physicians are unlikely to trust clinical ideas that come from a black box, i.e., an opaque system that doesn’t explain itself.

Don’t get me wrong, I’m a huge fan of healthcare AI and excited by its power. One can argue over which solutions are the most practical, and whether AI is the best possible tool to solve a given problem, but most health IT pros seem to believe that there’s a lot of potential here.

However, it’s still far from clear how healthcare AI applications will evolve. Let’s see where they turn up next and how that works out.

An Interesting Overview Of Alphabet’s Healthcare Investments

Posted on June 27, 2018 | Written By Anne Zieger

Recently I’ve begun reading a blog called The Medical Futurist which offers some very interesting fare. In addition to some intriguing speculation, it includes some research that I haven’t seen anywhere else. (It is written by a physician named Bertalan Mesko.)

In this case, Mesko has buried a shrewd and well-researched piece on Alphabet’s healthcare investments in an otherwise rambling article. (The rambling part is actually pretty interesting on its own, by the way.)

The piece offers a rather comprehensive update on Alphabet’s investments in and partnerships with healthcare-related companies, suggesting that no other contender in Silicon Valley is investing as heavily in this sector as Alphabet’s GV (formerly Google Ventures). I don’t know if he’s right about this, but it’s probably true.

By Mesko’s count, GV has backed almost 60 health-related enterprises since the fund was first kicked off in 2009. These investments include direct-to-consumer genetic testing firm 23andMe, health insurance company Oscar Health, telemedicine venture Doctor on Demand and Flatiron Health, which is building an oncology-focused data platform.

Mesko also points out that GV has had an admirable track record so far, with five of the companies it first backed going public in the last year. I’m not sure I agree that going public is per se a sign of success–a lot depends on how the IPO is received by Wall Street–but I see his logic.

In addition, he notes that Alphabet is stocking up on intellectual resources. The article cites research by Ernst & Young reporting that Alphabet filed 186 healthcare-related patents between 2013 and 2017.

Most of these patents are related to DeepMind, which Google acquired in 2014, and Verily Life Sciences (formerly Google Life Sciences). While these deals are interesting in and of themselves, on a broader level the patents demonstrate Alphabet’s interest in treating chronic illnesses like diabetes and the use of bioelectronics, he says.

Meanwhile, Verily continues to work on a genetic data-collecting initiative known as the Baseline Study. It plans to leverage this data, using some of the same algorithms behind Google’s search technology, to pinpoint what makes people healthy.

It’s a grand and somewhat intimidating picture.

Obviously, there’s a lot more to discuss here, and even Mesko’s in-depth piece barely scratches the surface of what can come out of Alphabet and Google’s health investments. Regardless, it’s worth keeping track of their activity in the sector even if you find it overwhelming. You may be working for one of those companies someday.

Small Grounds for Celebration and Many Lurking Risks in HIMSS Survey

Posted on March 12, 2018 | Written By Andy Oram

When trying to bypass the breathless enthusiasm of press releases and determine where health IT is really headed, we can benefit from a recent HIMSS survey, released around the time of their main annual conference. They managed to get responses from 224 managers of health care facilities–which range from hospitals and clinics to nursing homes–and 145 high-tech developers that fall into the large categories of “vendors” and “consultants.” What we learn is that vendors are preparing for major advances in health IT, but that clinicians are less ready for them.

On the positive side, both the clinicians and the vendors assign fairly high priority to data analytics and to human factors and design (page 7). In fact, data analytics have come to be much more appreciated by clinicians in the past year (page 9). This may reflect the astonishing successes of deep learning artificial intelligence reported recently in the general press, and herald a willingness to invest in these technologies to improve health care. As for human factors and design, the importance of these disciplines has been repeatedly shown in HxRefactored conferences.

Genomics ranks fairly low for both sides, which I think is reasonable given that there are still relatively few insights we can gain from genetics to change our treatments. Numerous studies have turned up disappointing results: genetic testing doesn’t work very well yet, and tends to lead only to temporary improvements. In fact, both clinicians and vendors show a big drop in interest in precision medicine and genetics (pages 9 and 10). The drop in precision medicine, in particular, may be related to the strong association the term has with Vice President Joe Biden in the previous administration, although NIH seems to still be committed to it. Everybody knows that these research efforts will sprout big payoffs someday–but probably not soon enough for the business models of most companies.

But much more of the HIMSS report is given over to disturbing perception gaps between the clinicians and vendors. For instance, clinicians hold patient safety in higher regard than vendors (page 7). I view this concern cynically. Privacy and safety have often been invoked to hold back data exchange. I cannot believe that vendors in the health care space treat patient safety or privacy carelessly. I think it more likely that clinicians are using it as a shield to hide their refusal to try valuable new technologies.

In turn, vendors are much more interested in data exchange and integration than clinicians (page 7). This may just reflect a different level of appreciation for the effects of technology on outcomes. That is, data exchange and integration may be complex and abstract concepts, so perhaps the vendors are in a better position to understand that they ultimately determine whether a patient gets the treatment her condition demands. But really, how difficult can it be to understand data exchange? It seems like the clinicians are undermining the path to better care through coordination.

I have trouble explaining the big drops in interest in care coordination and public health (pages 9 and 10), which is worrisome because these things will probably do more than anything to produce healthier populations. The problem, I think, is probably that there’s no reimbursement for taking on these big, hairy problems. HIMSS explains the drop as a shift of attention to data analytics, which should ultimately help achieve the broader goals (page 11).

HIMSS found that clinicians expect to decrease their investments in health IT over the upcoming year, or at least to keep the amount steady (page 14). I suspect this is because they realize they’ve been soaked by suppliers and vendors. Since Meaningful Use was instituted in 2009, clinicians have poured billions of dollars and countless staff time into new EHRs, reaping mostly revenue-threatening costs and physician burn-out. However, as HIMSS points out, vendors expect clinicians to increase their investments in health IT–and may be sorely disappointed, especially as they enter a robust hiring phase (page 15).

Reading the report, I come away feeling that the future of health care may be bright–but that the glow you see comes from far over the horizon.

UPMC Sells Oncology Analytics Firm To Elsevier

Posted on January 22, 2018 | Written By Anne Zieger

Using analytics tools to improve cancer treatment can be very hard. That struggle is exemplified by the problems faced by IBM Watson Health, which dove into the oncology analytics field a few years ago but made virtually no progress in improving cancer treatment.

With any luck, however, Via Oncology will be more successful at moving the needle in cancer care. The company, which offers decision support for cancer treatment and best practices in cancer care management, was just acquired by information analytics firm Elsevier, which plans to leverage the company’s technology to support its healthcare business.

Elsevier’s Clinical Solutions group works to improve patient outcomes, reduce clinical errors and optimize cost and reimbursements for providers. Via Oncology, a former subsidiary of the University of Pittsburgh Medical Center, develops and implements clinical pathways for cancer care. Via Oncology spent more than 15 years as part of UPMC prior to the acquisition.

Via Oncology’s Via Pathways tool relies on evidence-based content to create clinical algorithms covering 95% of cancer types treated in the US. The content was developed by oncologists. In addition to serving as a basis for algorithm development, Via Oncology also shares the content with physicians and their staff through its Via Portal, a decision support tool which integrates with provider EMRs.

According to Elsevier, Via Pathways addresses more than 2,000 unique patient presentations which can be addressed by clinical algorithms and recommendations for all major aspects of cancer care. The system can also offer nurse triage and symptom tracking, cost information analytics, quality reporting and medical home tools for cancer centers.

According to the prepared statement issued by Elsevier, UPMC will continue to be a Via Oncology customer, which makes it clear that the healthcare giant wasn’t dumping its subsidiary or selling it for a fire sale price.

That’s probably because in addition to UPMC, more than 1,500 oncology providers in community, hospital, and academic settings hold Via Pathways licenses. What makes this model particularly neat is that these cancer centers are working collaboratively to improve the product as they use it. Too few specialty treatment professionals work together this effectively, so it’s good to see Via Oncology leveraging user knowledge this way.

While most of this seems clear, I was left with the question of what role, if any, genomics plays in Via Oncology’s strategy. While it may be working with such technologies behind the scenes, the company didn’t mention any such initiatives in its publicly-available information.

This approach seems to fly in the face of existing trends and in particular, physician expectations. For example, a recent survey of oncologists by medical publication Medscape found that 71% of respondents felt genomic testing was either very important or extremely important to their field.

However, Via Oncology may have something up its sleeve and be waiting for it to mature before diving into the genomics pool. We’ll just have to see what it does as part of Elsevier.

Are there other areas beyond cancer where a similar approach could be taken?

Doctors, Data, Diagnoses, and Discussions: Achieving Successful and Sustainable Personalized/Precision Medicine

Posted on January 10, 2018 | Written By

The following is a guest blog post by Drew Furst, M.D., Vice President of Clinical Consultants at Elsevier Clinical Solutions.

Personalized/precision medicine is a growing field and that trend shows no sign of slowing down.

In fact, a 2016 Grand View Research report estimated the global personalized medicine market was worth $1,007.88 billion in 2014, with projected growth to reach $2,452.50 billion by 2022.

As these areas of medicine become more commonplace, understanding the interactions of biological factors with a range of personal, environmental, and social influences on health is a vital step toward achieving sustainable success.

A better understanding begins with answering important questions such as whether the focus should be precision population medicine (based on disease) or precision patient-specific medicine (based on the individual).

Specificity in terminology is needed. The traditional term of “personalized medicine” has evolved into the term “precision medicine,” but this new usage requires a more detailed look into the precise science of genetic, environmental and lifestyle factors that influence any approach to treatment.

Comprehending these interactions can provide insights into what succeeds, and we’ve learned that some areas of precision medicine are more effective than others.

Through pharmacogenomics – the study of how a patient’s genetic make-up affects the response to a particular drug – we have identified key enzymes in cancer formation and cancer treatment, which aids in the customization of drugs.

Research shows us that drug-metabolizing enzyme activity is one of many factors that impact a patient’s response to medication. We also know that human cytochrome P450 (CYP) plays an important role in the metabolism of drugs and environmental chemicals.
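As a concrete (and deliberately non-clinical) illustration of the pharmacogenomic idea, the sketch below maps a CYP2D6 genotype to a metabolizer phenotype. The phenotype categories follow standard pharmacogenomics terminology, but the three-entry genotype table is an invented placeholder, not a clinical reference:

```python
# Illustrative only: a tiny genotype-to-phenotype lookup for CYP2D6.
# Real pharmacogenomic dosing relies on curated guidelines (e.g. CPIC),
# not a table like this one.
PHENOTYPE = {
    "*1/*1": "normal metabolizer",
    "*1/*4": "intermediate metabolizer",
    "*4/*4": "poor metabolizer",
}

def metabolizer_status(cyp2d6_genotype: str) -> str:
    """Look up the metabolizer phenotype for a star-allele genotype."""
    return PHENOTYPE.get(cyp2d6_genotype, "unknown")

print(metabolizer_status("*4/*4"))  # poor metabolizer
```

The clinical payoff is downstream of this lookup: a “poor metabolizer” result might prompt a different drug or dose, which is exactly the customization the paragraph above describes.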

Therapies that incorporate drug-specific pharmacogenomics are a boon to oncology treatments and a vast improvement over the “shotgun therapy” approach of the past. Today, treatments can be targeted to enzymes and receptors that vary from person to person.

In traditional chemotherapy, a drug developed to kill rapidly growing cancer cells will indiscriminately target other rapidly growing cells such as hair cells, hence the often-observed hair loss. However, a targeted drug and delivery method aimed at only the receptive cells can be a much more effective approach and treatment, while minimizing collateral damage.

Recently, the journal Nature published a study showing the promise this method holds. In the pilot study, scientists led by Dr. Catherine Wu of Dana-Farber Cancer Institute in Boston gave six melanoma patients an experimental, custom-made vaccine; two years after treatment, all were tumor-free.

Looking Beyond Genetics

Precision medicine needs to include more than just genetics.

Factors such as environment and socio-economic status must also be considered when approaching disease states. We must undertake a comprehensive review of a patient's situation, including, but not limited to, family history.

Cultural dietary traditions can play into disease susceptibility. As an example, frequent consumption of smoked fish in some Asian cultures increases the risk of gastric (stomach) cancers. Lower socioeconomic status can force acceptance of substandard and overcrowded housing, with increased risk of illnesses ranging from lead toxicity and asbestosis to hantavirus infection, to name just a few.

A patient with a genetic propensity for lung cancer who also smokes cigarettes and has high radon levels at home increases the odds of developing the disease through these combined genetic, behavioral, and environmental factors.
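To make the "combined odds" idea concrete, here is a toy calculation that multiplies a baseline risk by independent relative risks. The numbers are invented for illustration, and the independence assumption is a simplification; real epidemiological models are more nuanced:

```python
# Toy multiplicative risk model: combine a hypothetical baseline risk with
# relative risks for genetic, behavioral, and environmental factors.
# All numbers are invented for illustration; this is not an epidemiological model.
baseline_risk = 0.01          # hypothetical lifetime baseline risk (1%)
relative_risks = {
    "genetic predisposition": 2.0,
    "smoking": 3.0,
    "high home radon": 1.5,
}

combined = baseline_risk
for factor, rr in relative_risks.items():
    combined *= rr

print(f"Illustrative combined risk: {combined:.1%}")  # 0.01 * 2 * 3 * 1.5 = 9.0%
```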

Patient-derived Data and the Diagnosis

In addition to the information now available through state-of-the-art medical testing, patient-derived information from wearables, biometrics, and direct-to-consumer health testing kits presents patients and physicians alike with new opportunities and challenges.

Armed with newly discovered health data, patients may present it to their doctors with a request that it be included in their health record. Many patients expect an interpretation of that data when they visit their doctor and an explanation of what it means for their present (and future) healthcare.

Doctors can be overwhelmed when unfiltered information is thrown at them. They are often unprepared for it, and research has yet to offer definitive guidance on interpreting patient-derived data.

Studying hereditary traits can offer some insights from generation to generation. By delving into the genomics of individual patients, we get a clearer picture of a person's risk for a certain disease, but often this information provides no immediate solutions. Discovering a genetic indicator for Alzheimer's may reflect a higher propensity for the disease, but symptoms may be decades away, if they appear at all.

Pitfalls and Possibilities

There are many concerns about genomic data collection, one of which is whether policy can keep pace with the patient-privacy and ethical questions that inevitably ensue. These questions surface constantly, and there is no clear direction on the best course of action.

Clearer policies are needed to delineate who has access to a patient’s genetic records and whether third parties, such as health or life insurance companies, can deny coverage or care based on genomics.

In addition, one cannot ignore the psychological burden of knowing your "potential" for a disease, based solely on genetic testing, when it may never come to fruition, not to mention its effect on decisions about career, residence, and relationship commitments.

Even some physicians are reluctant to undergo genetic testing for fear of who might gain access to the information and what the consequences might be.

Physicians face an additional conundrum in dealing with patient-supplied information: how to counsel patients when, in some cases, that task may belong to a community resources representative. In addition, patients who request that certain information not be included in their personal health record present a problem for a physician justifying a test or procedure to a payer.

The consumerization of healthcare and the patient engagement strategies employed to deliver better outcomes are driving the healthcare industry to open conversations that elevate the level of care delivered to patients. Physicians, in turn, need to demand more direction and initiate more discussions on how to handle the opportunities and challenges of the era of patient-derived and pharmacogenomic data.

While improving patient-physician communication should always be a priority, discussing how and when to use genetic and patient-derived information is still a work in progress.

Dr. Furst is Vice President Clinical Consultants at Elsevier Clinical Solutions.

Key Articles in Health IT from 2017 (Part 2 of 2)

Posted on January 4, 2018 I Written By

Andy Oram is an editor at O'Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space. Andy also writes often for O'Reilly's Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The first part of this article set a general context for health IT in 2017 and started through the year with a review of interesting articles and studies. We’ll finish the review here.

A thoughtful article suggests a positive approach toward health care quality. The author stresses the value of organic change, although using data for accountability has value too.

An article extolling digital payments actually said more about the out-of-control complexity of the US reimbursement system. It may or may not be coincidental that the article appeared one day after the CommonWell Health Alliance announced an API whose main purpose seems to be to facilitate payment and other data exchanges related to law and regulation.

A survey by KLAS asked health care providers what they want in connected apps. Most apps currently just display data from a health record.

A controlled study revived the concept of Health Information Exchanges as stand-alone institutions, examining the effects of emergency departments using one HIE in New York State.

In contrast to many leaders in the new Administration, Dr. Donald Rucker received positive comments upon acceding to the position of National Coordinator. More alarm was raised about the appointment of Scott Gottlieb as head of the FDA, but a later assessment gave him high marks for his first few months.

Before Dr. Gottlieb got there, the FDA was already loosening up. The 21st Century Cures Act instructed it to keep its hands off many health-related digital technologies. After kneecapping consumer access to genetic testing and then allowing it back into the ring in 2015, the FDA advanced consumer genetics another step this year with approval for 23andMe tests about risks for seven diseases. A close look at another DNA site’s privacy policy, meanwhile, warns that their use of data exploits loopholes in the laws and could end up hurting consumers. Another critique of the Genetic Information Nondiscrimination Act has been written by Dr. Deborah Peel of Patient Privacy Rights.

Little noticed was a bill authorizing the FDA to be more flexible in its regulation of digital apps. Shortly after, the FDA announced its principles for approving digital apps, stressing good software development practices over clinical trials.

No improvement has been seen in the regard clinicians have for electronic records. Subjective reports condemned the notorious number of clicks required. A study showed they spend as much time on computer work as they do seeing patients. Another study found the ratio to be even worse. Shoving the job onto scribes may introduce inaccuracies.

The time spent might actually pay off if the resulting data could generate new treatments, increase personalized care, and lower costs. But the analytics that are critical to these advances have stumbled in health care institutions, in large part because of the perennial barrier of interoperability. But analytics are showing scattered successes, being used to:

Deloitte published a guide to implementing health care analytics. And finally, a clarion signal that analytics in health care has arrived: WIRED covers it.

A government cybersecurity report warns that health technology will likely soon contribute to the stream of breaches in health care.

Dr. Joseph Kvedar identified fruitful areas for applying digital technology to clinical research.

The Government Accountability Office, terror of many US bureaucracies, came out with a report criticizing the sloppiness of quality measures at the VA.

A report by leaders of the SMART platform listed barriers to interoperability and the use of analytics to change health care.

To improve the lower outcomes seen by marginalized communities, the NIH is recruiting people from those populations to trust the government with their health data. A policy analyst calls on digital health companies to diversify their staff as well. Google’s parent company, Alphabet, is also getting into the act.

Specific technologies

Digital apps are part of most modern health efforts, of course. A few articles focused on the apps themselves. One study found that digital apps can improve depression. Another found that an app can improve ADHD.

Lots of intriguing devices are being developed:

Remote monitoring and telehealth have also been in the news.

Natural language processing and voice interfaces are becoming a critical part of spreading health care:

Facial recognition is another potentially useful technology. It can replace passwords or devices to enable quick access to medical records.

Virtual reality and augmented reality seem to have some limited applications to health care. They are useful foremost in education, but also for pain management, physical therapy, and relaxation.

A number of articles hold out the tantalizing promise that interoperability headaches can be cured through blockchain, the newest hot application of cryptography. But one analysis warned that blockchain will be difficult and expensive to adopt.

3D printing can be used to produce models for training purposes as well as surgical tools and implants customized to the patient.

A number of other interesting companies in digital health can be found in a Fortune article.

We’ll end the year with a news item similar to one that began the article: serious good news about the ability of Accountable Care Organizations (ACOs) to save money. I would also like to mention three major articles of my own:

I hope this review of the year’s articles and studies in health IT has helped you recall key advances or challenges, and perhaps flagged some valuable topics for you to follow. 2018 will continue to be a year of adjustment to new reimbursement realities touched off by the tax bill, so health IT may once again languish somewhat.

MindCrowd Memory Test

Posted on April 19, 2017 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This week I was the moderator at the DellEMC #TransformHIT Healthcare Think Tank event. It was a great event, and if you missed it, you can search #TransformHIT on Twitter or find the recording in the video embedded at the bottom of this post.

One of the highlights of the event for me was meeting Dr. Jeff Trent from TGen, a nonprofit institute focused on translating genomic research into life-changing results. The work they're doing is really quite incredible, and Dr. Trent offered some great insights at the Think Tank.

One of the research projects at TGen is called MindCrowd. This research looks at memory and other brain-related diseases. As part of the study, they're trying to get 1 million people to take a fun but simple mind test on their site. The test takes about 10 minutes; try it out and see how you do.

What’s fascinating about the results they’ve already seen from the 74k+ people who have taken the test to date is that women of all ages actually have better memory than men. There are outliers, but across the data it’s very clear that in this test women remember things better than men.

To add to these findings, something interesting happens as women approach the age of menopause: their memory performance actually seems to increase. It's not clear why this is the case, but the data shows an uptick in memory around the time most women hit menopause.

TGen is also working with the outliers to study why their memory is so much better or worse (e.g., an older person with an incredible memory or a younger person with a poor memory). I'm interested to see what comes from these studies.

If you want to contribute to their research, take 10 minutes and go and participate in their Mind Test.
Read more..