
More About Artificial Intelligence in Healthcare – #HITsm Chat Topic

Posted on August 8, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 8/11 at Noon ET (9 AM PT). This week’s chat will be hosted by Prashant Natarajan (@natarpr) on the topic of “More About Artificial Intelligence in Healthcare.” Be sure to also check out Prashant’s best-selling HIMSS book Demystifying Big Data and Machine Learning for Healthcare to learn about his perspectives and insights into the topic.

Healthcare transformation requires us to continually look at new and better ways to manage insights – both within and outside the organization. Increasingly, the ability to glean and operationalize new insights efficiently, as a byproduct of an organization’s day-to-day operations, is becoming vital to hospitals’ and health systems’ ability to survive and prosper. One of the long-standing challenges in healthcare informatics has been dealing with the sheer variety and volume of disparate healthcare data, and the growing need to derive veracity and value from it.

The potential for big data in healthcare – especially given the trends discussed earlier – is as bright as in any other industry. The benefits that big data analytics, AI, and machine learning can provide for healthier patients, happier providers, and cost-effective care are real. The future of precision medicine, population health management, clinical research, and financial performance will include an increased role for machine-analyzed insights, discoveries, and all-encompassing analytics.

This chat explores participants’ thoughts and feelings about the future of artificial intelligence in the healthcare industry and how healthcare organizations might leverage artificial intelligence to discover new business value, use cases, and knowledge.

Note: For the purposes of this chat, “artificial intelligence” can mean predictive analytics, machine learning, big data analytics, natural language processing, and contextually intelligent agents.

Questions we will explore in this week’s #HITsm chat include:
T1: What words or short phrases convey your current thoughts & feelings about ‘artificial intelligence’ in the healthcare space? #HITsm #AI

T2: What are big & small steps healthcare can take to leverage big data & machine learning for population health & personalized care? #HITsm

T3: Which areas of healthcare might be most positively impacted by artificial intelligence? #HITsm #AI

T4: What are some areas within healthcare that will likely NOT be improved or replaced by artificial intelligence? #HITsm #AI

T5: What lessons learned from early days of ‘advanced analytics’ must not be forgotten as use of artificial intelligence expands? #HITsm #AI

Bonus: How is your organization preparing for the application and use of artificial intelligence in healthcare? #HITsm #AI

Upcoming #HITsm Chat Schedule
8/18 – Diversity in HIT
Hosted by Jeanmarie Loria (@JeanmarieLoria) from @advizehealth

8/25 – Consumer Data Liquidity – The Road So Far, The Road Ahead
Hosted by Greg Meyer (@Greg_Meyer93)

We look forward to learning from the #HITsm community! As always, let us know if you’d like to host a future #HITsm chat or if you know someone you think we should invite to host.

If you’re searching for the latest #HITsm chat, you can always find it, along with the full schedule of upcoming chats, here.

Scenarios for Health Care Reform (Part 2 of 2)

Posted on May 18, 2017 | Written By

Andy Oram is an editor at O’Reilly Media, a highly respected book publisher and technology information provider. An employee of the company since 1992, Andy currently specializes in open source, software engineering, and health IT, but his editorial output has ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. His articles have appeared often on EMR & EHR and other blogs in the health IT space.

Andy also writes often for O’Reilly’s Radar site (http://oreilly.com/) and other publications on policy issues related to the Internet and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O’Reilly’s Open Source Convention, FISL (Brazil), FOSDEM, and DebConf.

The first part of this article suggested two scenarios that could promote health care reform. We’ll finish off the scenarios in this part of the article.

Capitalism Disrupts Health Care

In the third scenario, reform is stimulated by an intrepid data science firm that takes on health care with greater success than most of its predecessors. After assembling an impressive analytics toolkit from open source software components–thus simplifying licensing–it approaches health care providers and offers them a deal they can’t refuse: analytics demonstrated to save them money and support their growth, all delivered for free. The data science firm asks in return only that they let it use deidentified data from their patients and practices to build an enhanced service that it will offer paying customers.
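The deal hinges on sharing deidentified data, and the basic mechanics of that step are easy to sketch. Below is a minimal, hypothetical illustration in Python (the field names, the salt, and the identifier list are my assumptions; real HIPAA-grade de-identification involves far more than this):

```python
import hashlib

# A hypothetical, deliberately incomplete list of direct identifiers
DIRECT_IDENTIFIERS = {"name", "address", "phone", "ssn"}

def deidentify(record, salt="site-specific-secret"):
    """Drop direct identifiers and replace the patient ID with a salted hash."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    raw_id = str(record["patient_id"])
    # Salted hash keeps records linkable within one site without exposing the ID
    cleaned["patient_id"] = hashlib.sha256((salt + raw_id).encode()).hexdigest()[:16]
    return cleaned

record = {"patient_id": 42, "name": "Jane Doe", "ssn": "000-00-0000",
          "diagnosis": "E11.9", "age": 57}
print(deidentify(record))
```

The salted hash preserves within-site linkability (the same patient always maps to the same token) while keeping the raw identifier out of the shared data set.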

Some health care providers balk at the requirement to share data, but their legal and marketing teams explain that they have been doing it for years already with companies whose motives are less commendable. Increasingly, the providers are won over. The analytics service appeals particularly to small, rural, and safety-net providers. Hammered by payment cuts and growing needs among their populations, they are on the edge of going out of business and grasp the service as their last chance to stay in the black.

Participating in the program requires the extraction of data from electronic health records, and some EHR vendors try to stand in the way in order to protect their own monopoly on the data. Some even point to clauses in their licenses that prohibit the sharing. But they get a rude message in return: so valuable are the analytics that the providers are ready to jettison the vendors in a minute. The vendors ultimately go along and even compete on the basis of their ability to connect to the analytics.

Once stability and survival are established, the providers can use the analytics for more and more sophisticated benefits. Unlike the inadequate quality measures currently in use, the analytics provide a robust framework for assessing risk, stratifying populations, and determining how much a provider should be rewarded for treating each patient. Fee-for-outcome becomes standard.

Providers make deals to sign up patients for long-term relationships. Unlike the weak Medicare ACO model, which punishes a provider for things their patients do outside their relationship, the emerging system requires a commitment from the patient to stick with a provider. However, if the patient can demonstrate that she was neglected or failed to receive standard of care, she can switch to another provider and even require the misbehaving provider to cover costs. To hold up their end of this deal, providers find it necessary to reveal their practices and prices. Physician organizations develop quality-measurement platforms such as the recent PRIME registry in family medicine. A race to the top ensues.

What If Nothing Changes?

I’ll finish this upbeat article with a fourth scenario in which we muddle along as we have for years.

The ONC and Centers for Medicare & Medicaid Services continue to swat at waste in the health care system by pushing accountable care. But their ratings penalize safety-net providers, and payments fail to correlate with costs as hoped.

Fee-for-outcome flounders, so health care costs continue to rise to intolerable levels. Already, in Massachusetts, the US state that leads in universal health coverage, 40% of the state budget goes to Medicaid, where likely federal cuts will make it impossible to keep up coverage. Many other states and countries are witnessing the same pattern of rising costs.

The same pressures ride like a tidal wave through the rest of the health care system. Private insurers continue to withdraw from markets or lose money by staying. So either explicitly or through complex and inscrutable regulatory changes, the government allows insurers to cut sick people from their rolls and raise the cost burdens on patients and their employers. As patient rolls shrink, more hospitals close. Political rancor grows as the public watches employer money go into health insurance instead of wages, more of their own stagnant incomes go to health care costs, and government budgets get tied up in health care instead of education and other social benefits.

Chronic diseases creep through the population, mocking crippled efforts at public health. Rampant obesity among children leads to more and earlier diabetes. Dementia also rises as the population ages, and climate change scatters its effects across all demographics.

Furthermore, when patients realize the costs they must take on to seek health care, they delay doctor visits until their symptoms are unbearable. More people become disabled or perish, with negative impacts that spread through the economy. Output declines and more families become trapped in poverty. Self-medication for pain and mental illness becomes more popular, with predictable impacts on the opiate addiction crisis. Even our security is affected: the military finds it hard to recruit healthy soldiers, and our foreign policy depends increasingly on drone strikes that kill civilians and inflame negative attitudes toward the US.

I think that, after considering this scenario, most of us would prefer one of the previous three I laid out in this article. If health care continues to be a major political issue for the next election, experts should try to direct discussion away from the current unproductive rhetoric toward advocacy for solutions. Some who read this article will hopefully feel impelled to apply themselves to one of the positive scenarios and bring it to fruition.

Precision Health 101: Understanding the Keys to Value – #HITsm Chat Topic

Posted on May 2, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

We’re excited to share the topic and questions for this week’s #HITsm chat happening Friday, 5/5 at Noon ET (9 AM PT). This week’s chat will be hosted by Bob Rogers (@ScientistBob) from @IntelHealth on the topic of “Precision Health 101: Understanding the Keys to Value”.

Precision health starts with personalized diagnosis and precision treatment planning. That’s why leading-edge health systems are making health more precise at the individual patient level. This approach is a dramatic shift from the “one-size-fits-all” model of treatment and has shifted the conversation to the uniqueness of each person, down to the “DNA fingerprint” that is fueling an individual’s disease.

In this Twitter chat, join the discussion with Bob Rogers, chief data scientist at Intel Corporation, about the definition of precision health, where it’s being utilized today, and what healthcare CIOs should be thinking about now to make personalized care a reality.

Join us on Friday May 5th at 12:00pm ET as we discuss the following questions on #HITsm:

The Questions
T1: What is your definition of precision health? #HITsm

T2: Where are you seeing precision health thriving? #HITsm

T3: What’s holding back our efforts in precision health? What changes need to be made? #HITsm

T4: What should an organization be doing to prepare for and participate in precision health?  #HITsm

T5: What benefits would a patient see from precision health? #HITsm

Bonus: How will data and analytics impact precision health? #HITsm

Upcoming #HITsm Chat Schedule
5/12 – Accelerating Decision-Making in Healthcare: How Health Systems Choose Innovative Decisions
Hosted by Bruce Brandes from Lucro Solutions

5/19 – Patient Education Using Healthcare Social Media
Hosted by Anne Zieger (@annezieger)

5/26 – TBD
Hosted by Chad Johnson (@OchoTex)

We look forward to learning from the #HITsm community! As always let us know if you have ideas for how to make #HITsm better.

If you’re searching for the latest #HITsm chat, you can always find it, along with the full schedule of upcoming chats, here.

#TransformHIT Think Tank Hosted by DellEMC

Posted on April 5, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.


DellEMC has once again invited me back to participate in the 6th annual #TransformHIT Healthcare Think Tank event happening Tuesday, April 18, 2017 from Noon ET (9 AM PT) – 3 PM ET (Noon PT). I think I’ve been lucky enough to participate in 5 of the 6 years, and I’ve really enjoyed every one of them. DellEMC does a great job bringing together really smart, interesting people and encourages a sincere, open discussion of major healthcare IT topics. Plus, they do a great job making it so everyone can participate, watch, and share virtually as well.

This year they asked me to moderate the Think Tank which will be a fun new adventure for me, but my job will be made easy by this exceptional list of people that will be participating:

  • John Lynn (@techguy)
  • Paul Sonnier (@Paul_Sonnier)
  • Linda Stotsky (@EMRAnswers)
  • Joe Babaian (@JoeBabaian)
  • Dr. Joe Kim (@DrJosephKim)
  • Andy DeLaO (@cancergeek)
  • Dan Munro (@danmunro)
  • Dr. Jeff Trent (@TGen)
  • Shahid Shah (@ShahidNShah)
  • Dave Dimond (@NextGenHIT)
  • Mike Feibus (@MikeFeibus)

This panel is going to take on three hot topics in the healthcare industry today:

  • Consumerism in Healthcare
  • Precision Medicine
  • Big Data and AI in Healthcare

The great thing is that you can watch the whole #TransformHIT Think Tank event remotely on Livestream (recording will be available after as well). We’ll be watching the #TransformHIT tweet stream and messages to @DellEMCHealth during the event as well if you want to ask any questions or share any insights. We’ll do our best to add outside people’s comments and questions into the discussion. The Think Tank is being held in Phoenix, AZ, so if you’re local there are a few audience seats available if you’d like to come watch live and meet any of the panelists in person. Just let me know in the comments or on our contact us page and I can give you more details.

If you have an interest in healthcare consumerism, precision medicine, or big data and AI in healthcare, then please join us on Tuesday, April 18, 2017 from Noon ET (9 AM PT) – 3 PM ET (Noon PT) for the live stream. It’s sure to be a lively and interesting discussion.

Healthcare CIOs Focus On Optimizing EMRs

Posted on March 30, 2017 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she’s served as editor in chief of several healthcare B2B sites.

Few technical managers struggle with more competing priorities than healthcare CIOs. But according to a recent survey, they’re pretty clear what they have to accomplish over the next few years, and optimizing EMRs has leapt to the top of the to-do list.

The survey, which was conducted by consulting firm KPMG in collaboration with CHIME, found that 38 percent of CHIME members surveyed saw EMR optimization as their #1 priority for capital investment over the next three years.  To gather results, KPMG surveyed 122 CHIME members about their IT investment plans.

In addition to EMR optimization, top investment priorities identified by the respondents included accountable care/population health technology (21 percent), consumer/clinical and operational analytics (16 percent), virtual/telehealth technology enhancements (13 percent), revenue cycle systems/replacement (7 percent) and ERP systems/replacement (6 percent).

Meanwhile, respondents said that improving business and clinical processes was their biggest challenge, followed by improving operating efficiency and providing business intelligence and analytics.

It looks like at least some of the CIOs might have the money to invest, as well. Thirty-six percent said they expected to see an increase in their operating budget over the next two years, and 18 percent of respondents reported that they expect higher spending over the next 12 months. On the other hand, 63 percent of respondents said that spending was likely to be flat over the next 12 months, and 44 percent expected flat spending over the next two years. So we have to assume that many of them will have a harder time meeting their goals.

When it came to infrastructure, about one-quarter of respondents said that their organizations were implementing or investing in cloud computing-related technology, including servers, storage and data centers, while 18 percent were spending on ERP solutions. In addition, 10 percent of respondents planned to implement cloud-based EMRs, 10 percent enterprise systems, and 8 percent disaster recovery.

The respondents cited data loss/privacy, poorly-optimized applications and integration with existing architecture as their biggest challenges and concerns when it came to leveraging the cloud.

What’s interesting about this data is that none of the respondents mentioned improved security as a priority for their organization, despite the many vulnerabilities healthcare organizations have faced in recent times.  Their responses are especially curious given that a survey published only a few months ago put security at the top of CIOs’ list of business goals for near future.

The study, which was sponsored by clinical communications vendor Spok, surveyed more than 100 CIOs who were CHIME members — in other words, the same population the KPMG research tapped. The survey found that 81 percent of respondents named strengthening data security as their top business goal for the next 18 months.

Of course, people tend to respond to surveys in the manner prescribed by the questions, and the Spok questions were presumably worded differently than the KPMG questions. Nonetheless, it’s surprising to me that data security concerns didn’t emerge in the KPMG research. Bottom line, if CIOs aren’t thinking about security alongside their other priorities, it could be a problem.

What Do You Think Of Data Lakes?

Posted on October 4, 2016 | Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years.

Since I am not a high-end technologist, I’m not always up on the latest trends in database management – so the following may not be news to everyone who reads this. As for me, though, the notion of a “data lake” is a new one, and I think it’s a valuable idea that could hold a lot of promise for managing unruly healthcare data.

The following is a definition of the term appearing on a site called KDnuggets which focuses on data mining, analytics, big data and data science:

A data lake is a storage repository that holds a vast amount of raw data in its native format, including structured, semi-structured and unstructured data. The data structure and requirements are not defined until the data is needed.

According to article author Tamara Dull, a data warehouse contains data that is structured and processed, is expensive to store, relies on a fixed configuration, and is used by business professionals. A data lake, by contrast, contains everything from raw to structured data, is designed for low-cost storage (made possible largely because it relies on the open source software Hadoop, which can be installed on cheaper commodity hardware), can be configured and reconfigured as needed, and is typically used by data scientists. It’s no secret where she comes down as to which model is more exciting.
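Dull’s “structure is not defined until the data is needed” point (often called schema on read) can be illustrated in a few lines. This is a toy sketch, not how Hadoop-based lakes are actually built; the record shapes and field names are invented:

```python
import json

# A "data lake": raw records of mixed shape, stored exactly as they arrived
lake = [
    json.dumps({"type": "vitals", "patient": "a1", "hr": 72}),
    json.dumps({"type": "note", "patient": "a1", "text": "doing well"}),
    json.dumps({"type": "vitals", "patient": "b2", "hr": 88}),
]

# Structure is imposed only at query time: pull heart rates for analysis,
# ignoring records whose shape doesn't fit the question being asked
def heart_rates(raw_records):
    rows = (json.loads(r) for r in raw_records)
    return [(r["patient"], r["hr"]) for r in rows if r.get("type") == "vitals"]

print(heart_rates(lake))  # [('a1', 72), ('b2', 88)]
```

A warehouse would have forced a schema on all three records at load time; the lake defers that decision, which is exactly why it can absorb wearables data or telemedicine video whose eventual use is unknown.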

Perhaps the only downside she identifies with data lakes is that security may still be a concern, at least when compared to data warehouses. “Data warehouse technologies have been around for decades,” Dull notes. “Thus, the ability to secure data in a data warehouse is much more mature than securing data in a data lake.” But this issue is likely to recede in the near future, as the big data industry has focused tightly on security of late; to her it’s not a question of if security will mature but when.

It doesn’t take much to envision how the data lake model might benefit healthcare organizations. After all, it may make sense to collect data for which we don’t yet have a well-developed idea of its use. Wearables data comes to mind, as does video from telemedicine consults, but there are probably many other examples you could supply.

On the other hand, one could always counter that there’s not much value in storing data for which you don’t have an immediate use, and which isn’t structured for handy analysis by business analysts on the fly. So even if data lake technology is less costly than data warehousing, it may or may not be worth the investment.

For what it’s worth, I’d come down on the side of the data-lake boosters. Given the growing volume of heterogeneous data being generated by healthcare organizations, it’s worth asking whether deploying a healthcare data lake makes sense. With a data lake in place, healthcare leaders can at least catalog and store large volumes of un-normalized data, and that’s probably a good thing. After all, it seems inevitable that we will have to wring value out of such data at some point.

The Value of Machine Learning in Value-based Care

Posted on August 4, 2016 | Written By

The following is a guest blog post by Mary Hardy, Vice President of Healthcare for Ayasdi.

Variation is a natural element in most healthcare delivery. After all, every patient is unique. But unwarranted clinical variation—the kind that results from a lack of systems and collaboration or the inappropriate use of care and services—is another issue altogether.

Healthcare industry thought leaders have called for the reduction of such unwarranted variation as the key to improving the quality and decreasing the cost of care. They have declared, quite rightly, that the quality of care an individual receives should not depend on geography. In response, hospitals throughout the United States are taking on the significant challenge of understanding and managing this variation.

Most hospitals recognize that the ability to distill the right insights from patient data is the catalyst for eliminating unwarranted clinical variation and is essential to implementing care models based on value. However, the complexity of patient data—a complexity that will only increase with the impending onslaught of data from biometric and personal fitness devices—can be overwhelming to even the most advanced organizations. There aren’t enough data scientists or analysts to make sense of the exponentially growing data sets within each organization.

Enter machine learning. Machine learning applications combine algorithms from computational biology and other disciplines to find patterns within billions of data points. The power of these algorithms enables organizations to uncover the evidence-based insights required for success in the value-based care environment.

Machine Learning and the Evolutionary Leap in Clinical Pathway Development
Since the 1990s, provider organizations have attempted to curb unwarranted variation by developing clinical pathways. A multi-disciplinary team of providers uses peer-reviewed literature and patient population data to develop and validate best-practice protocols and guidance for specific conditions, treatments, and outcomes.

However, the process is burdened by significant limitations. Pathways often require months or years to research, build, and validate. Additionally, today’s clinical pathways are typically one-size-fits-all. Health systems that have the resources to do so often employ their own experts, who review research, pull data, run tables and come to a consensus on the ideal clinical pathway, but are still constrained by the experts’ inability to make sense of billions of data points.

Additionally, once the clinical pathway has been established, hospitals have few resources for tracking the care team’s adherence to the agreed-upon protocol. This alone is enough to derail years of efforts to reduce unwarranted variation.

Machine learning is the evolutionary leap in clinical pathway development and adherence. Acceleration is certainly a positive: high-performance machines and algorithms can examine complex, continuously growing data sets far faster, and capture insights more comprehensively, than traditional or homegrown analytics tools. (Imagine reducing the development of a clinical pathway from months or years to weeks or days.)

But the true value of machine learning is enabling provider organizations to leverage patient population data from their own systems of record to develop clinical pathways that are customized to the organization’s processes, demographics, and clinicians.

Additionally, machine learning applications empower organizations to precisely track care team adherence, improving communication and organization effectiveness. By guiding clinicians to follow best practices through each step of care delivery, clinical pathways that are rooted in machine learning ensure that all patients receive the same level of high-quality care at the lowest possible cost.
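The adherence tracking described above can be approximated very simply: compare the sequence of recorded care events against the agreed-upon pathway steps. A minimal sketch follows (the step names are hypothetical, and real adherence logic would account for timing, exclusions, and clinical judgment):

```python
# Hypothetical pathway for a surgical episode, in the order steps should occur
PATHWAY = ["pre-op assessment", "antibiotic prophylaxis", "surgery",
           "day-1 ambulation", "discharge education"]

def adherence(events, pathway=PATHWAY):
    """Return (fraction of pathway steps observed, whether they occurred in order)."""
    done = [step for step in pathway if step in events]
    positions = [events.index(step) for step in done]
    in_order = positions == sorted(positions)
    return len(done) / len(pathway), in_order

# A patient record missing one step (antibiotic prophylaxis was never logged)
events = ["pre-op assessment", "surgery", "day-1 ambulation"]
print(adherence(events))  # (0.6, True): 3 of 5 steps done, in the right order
```

Even something this crude flags the two failure modes that matter: skipped steps and steps delivered out of sequence, which is the signal a care team needs to keep years of variation-reduction work from derailing.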

Machine Learning Proves its Value
St. Louis-based Mercy, one of the most innovative health systems in the world, used a machine-learning application to recreate and improve upon a clinical pathway for total knee replacement surgery.

Drawing from Mercy’s integrated electronic medical record (EMR), the application grouped data from a highly complex series of events related to the procedure and segmented it. It was then possible to adapt other methods from biology and signals processing to the problem of determining the optimal way to perform the procedure—which drugs, tests, implants and other processes contribute to that optimal outcome. It also was possible to link predictive machine learning methods like regression or classification to perform real-time pathway editing.

The application revealed that Mercy’s patients naturally divided into clusters or groups with similar outcomes. The primary metric of interest to Mercy as an indicator of high quality was length of stay (LOS). The system highlighted clusters of patients with the shortest LOS and quickly discerned what distinguished this cluster from patients with the longest LOS.

What this analysis revealed was an unforeseen and groundbreaking care pathway for high-quality total knee replacement. The common denominator between all patients with the shortest LOS and best outcomes was administration of pregabalin—a drug generally prescribed for shingles. A group of four physicians had seen something in the medical literature that led them to believe that administering the drug prior to surgery would inhibit postoperative pain, reduce opiate usage and produce faster ambulation. It did.

This innovation was happening in Mercy’s own backyard, and it was undeniably a best practice—the data revealed that each of the best outcomes included administration of this drug. Using traditional approaches, it is highly unlikely that Mercy would have asked the question, “What if we use a shingles drug to improve total knee replacement?” The superior outcomes of four physicians would have remained hidden in a sea of complex data.

This single procedure was worth over $1 million per year for Mercy in direct costs.
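The cluster comparison Mercy’s application performed can be caricatured in a few lines: split cases by length of stay and compare how often each treatment appears in the short-stay versus long-stay group. The data below is synthetic, and the method is far cruder than the machine-learning approach actually used; it only illustrates the shape of the question being asked:

```python
# Synthetic total-knee-replacement cases: length of stay plus treatments given
cases = [
    {"los": 2, "treatments": {"pregabalin", "opioid"}},
    {"los": 2, "treatments": {"pregabalin"}},
    {"los": 3, "treatments": {"pregabalin", "opioid"}},
    {"los": 6, "treatments": {"opioid"}},
    {"los": 7, "treatments": {"opioid"}},
    {"los": 8, "treatments": {"opioid", "antiemetic"}},
]

def prevalence(group, treatment):
    """Fraction of cases in the group that received the treatment."""
    return sum(treatment in c["treatments"] for c in group) / len(group)

# Split cases into short- and long-stay groups around the median LOS
median_los = sorted(c["los"] for c in cases)[len(cases) // 2]
short = [c for c in cases if c["los"] < median_los]
long_ = [c for c in cases if c["los"] >= median_los]

for t in ["pregabalin", "opioid"]:
    print(t, prevalence(short, t), prevalence(long_, t))
```

In this toy data, pregabalin shows up in every short-stay case and no long-stay case, which is exactly the kind of “common denominator” signal that surfaced the four physicians’ practice at Mercy.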

What Mercy’s experience demonstrates is that the most difficult, persistent and complex problems in healthcare can resolve themselves through data. The key lies in having the right tools to navigate that data’s complexity. The ability to determine at a glance what differentiates good outcomes from bad outcomes is incredibly powerful—and will transform care delivery.

Mary Hardy is the Vice President of Healthcare for Ayasdi, a developer of machine intelligent applications for health systems and payer organizations.

Applying Geospatial Analysis to Population Health

Posted on June 28, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

This post is sponsored by Samsung Business. All thoughts and opinions are my own.

Megan Williams wrote a very interesting piece called “Geospatial Analysis: The Next Era of Population Health” in which she highlighted Kaiser’s efforts to use geospatial analysis as part of their population health efforts. Here’s her description of their project:

This means using data to inform policy adjustments and create intervention programs that lead to meaningful change. One of the best examples of this lies with healthcare giant Kaiser Permanente. In April, they launched a database that gave researchers the ability to examine patient DNA and bump it against behavioral and environmental health factors. The goal of the project is to pull information from half a million patients and use it to build one of the most “diverse repositories of environmental, genetic and health data in the world,” which could then be used to inform research around conditions including diabetes and cancer and their relationships to issues including localized violence, pollution, access to quality food and other factors.

This type of effort from Kaiser is quite incredible and I believe will truly be part of the way we bend the healthcare cost curve. One challenge to this effort is that Kaiser has a very different business model than the rest of the healthcare system. They’re in a unique position where their business benefits from these types of population health efforts. Plus, Kaiser is very geographically oriented.

While Kaiser’s business model is currently very different, one could argue that the rest of healthcare is moving towards the Kaiser model. The shift to value based care and accountable care organizations is going to require the same geospatial analysis that Kaiser is building out today. Plus, hospital consolidation is providing real geographic dominance that wasn’t previously available. Will these shifting reimbursement models motivate all healthcare systems to care about the 99% of the time patients spend outside of our care? I think they will, and large healthcare organizations won’t have any choice in the matter.

There are a number of publicly and privately available data stores that are going to help in the geospatial analysis of a population’s health, but I don’t believe that’s going to be enough. In order to discover the real golden insights into a population, we’re going to have to look at the crossroads of data stores (behavioral, environmental, genomic, etc.) combined with personal health data. Some of that personal health data will come from things like EHR software, but I believe that the most powerful geospatial personal health data is going to come from an individual’s cell phone.

This isn’t a hard vision to see. Most of us now carry around a cell phone that knows a lot more about our health than we realize. Plus, it has a GPS where all of those actions can be plotted geospatially. Combine this personally collected health data with these large data stores and we’re likely to get a dramatically different understanding of an individual’s health.
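As a rough sketch of what combining phone GPS data with an environmental data store might look like, here’s a minimal example. The monitor locations, air-quality values, and patient coordinates are all made up for illustration; a real system would pull from actual sensor networks and consented patient data.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical environmental monitors: (lat, lon, air-quality index)
env_readings = [
    (37.7749, -122.4194, 42),   # downtown monitor
    (37.8044, -122.2712, 88),   # monitor near a freeway
]

def nearest_aqi(patient_lat, patient_lon):
    """Tag a phone-logged GPS fix with the closest monitor's AQI."""
    return min(env_readings,
               key=lambda r: haversine_km(patient_lat, patient_lon, r[0], r[1]))[2]

# One (fictional) patient location, joined to the environmental data store
print(nearest_aqi(37.78, -122.41))  # → 42, the downtown monitor's reading
```

The same nearest-neighbor join scales up with spatial indexes (e.g., geohashing or R-trees) once the data stores grow beyond toy size.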

While this is an exciting area of healthcare, I think we’d be wise to take a lesson from “big data” in healthcare. Far too many health systems spent millions of dollars building up these massive data warehouses of enterprise health data. Once they were built, they had no idea how to get value from them. Since then, we’ve seen a shift to “skinny data” as one vendor called it. Minimum viable data sets with specific action items tied to that data.

We should likely do the same with geospatial data and population health and focus on the minimum set of data that will provide actual results. We should start with the skinny data that delivers an improvement in health. Over time, those skinny data sets will combine into a population health platform that truly leverages big data in healthcare.
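To make the “skinny data” idea concrete, here’s a minimal sketch: one data element, one action rule tied directly to it. The field names and thresholds are illustrative assumptions, not a clinical protocol.

```python
from dataclasses import dataclass

@dataclass
class A1cReading:
    """A 'skinny' record: only the fields needed to drive one action."""
    patient_id: str
    a1c_percent: float  # most recent hemoglobin A1c result

def action_for(reading: A1cReading) -> str:
    """Map a single data point to a specific, concrete intervention."""
    if reading.a1c_percent >= 9.0:
        return "schedule diabetes-educator visit"
    if reading.a1c_percent >= 6.5:
        return "flag for PCP follow-up"
    return "no action"

print(action_for(A1cReading("p001", 9.4)))  # → schedule diabetes-educator visit
```

The point of the sketch is the shape, not the rule: a minimum viable data set is small enough that every field it carries is tied to an action someone will actually take.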

Where do you see geospatial data being used in healthcare? Where would you like to see it being used? What are the untapped opportunities that are waiting for us?

For more content like this, follow Samsung on Insights, Twitter, LinkedIn, YouTube and SlideShare.

Time To Leverage EHR Data Analytics

Posted on May 5, 2016 I Written By

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she’s served as editor in chief of several healthcare B2B sites.

For many healthcare organizations, implementing an EHR has been one of the largest IT projects they’ve ever undertaken. And during that implementation, most have decided to focus on meeting Meaningful Use requirements, while keeping their projects on time and on budget.

But it’s not good to stay in emergency mode forever. So at least for providers that have finished the bulk of their initial implementation, it may be time to pay attention to issues that were left behind in the rush to complete the EHR rollout.

According to a recent report by PricewaterhouseCoopers’ Advanced Risk & Compliance Analytics practice, it’s time for healthcare organizations to focus on a new set of EHR data analytics approaches. PwC argues that there is significant opportunity to boost the value of EHR implementations by using advanced analytics for pre-live testing and post-live monitoring. Steps it suggests include the following:

  • Go beyond sample testing: While typical EHR implementation testing strategies look at the underlying system build and all records, that may not be enough, as build efforts may remain incomplete. Also, end-user workflow-specific testing may be occurring simultaneously. Consider using new data mining and visualization analytics tools to conduct more thorough tests and spot trends.
  • Conduct real-time surveillance: Use data analytics programs to review upstream and downstream EHR workflows to find gaps, inefficiencies and other issues. This allows providers to design analytic programs using existing technology architecture.
  • Find RCM inefficiencies: Rather than relying on static EHR revenue cycle reports, which make it hard to identify root causes of trends and concerns, conduct interactive assessment of RCM issues. By creating dashboards with drill-down capabilities, providers can increase collections by scoring patient invoices, prioritizing patient invoices with the highest scores and calculating the bottom-line impact of missing payments.
  • Build a continuously-monitored compliance program: Use a risk-based approach to data sampling and drill-down testing. Analytics tools can allow providers to review multiple data sources under one dashboard and identify high-risk patterns in critical areas such as billing.
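The invoice-scoring idea above can be sketched in a few lines. The scoring formula and the sample invoices here are invented for illustration; a real RCM dashboard would tune its weights against actual collection outcomes.

```python
# Hypothetical open patient invoices for a small provider
invoices = [
    {"id": "INV-1", "balance": 1200.0, "days_outstanding": 95},
    {"id": "INV-2", "balance": 300.0,  "days_outstanding": 20},
    {"id": "INV-3", "balance": 800.0,  "days_outstanding": 60},
]

def score(inv):
    """Illustrative priority score: larger, older balances rank higher."""
    return inv["balance"] * (1 + inv["days_outstanding"] / 30.0)

# Prioritize collection effort: highest-scoring invoices first
ranked = sorted(invoices, key=score, reverse=True)

# Bottom-line impact of payments at risk (here: anything 90+ days out)
bottom_line_at_risk = sum(i["balance"] for i in invoices
                          if i["days_outstanding"] > 90)

print([i["id"] for i in ranked])   # → ['INV-1', 'INV-3', 'INV-2']
print(bottom_line_at_risk)         # → 1200.0
```

A drill-down dashboard is essentially this ranking plus the ability to click through from a score to the underlying claim history.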

It’s worth noting, at this point, that while these goals seem worthy, only a small percentage of providers have the resources to create and manage such programs. Sure, vendors will probably tell you that they can pop a solution in place that will get all the work done, but that’s seldom the case in reality. Not only that, a surprising number of providers are still unhappy with their existing EHR, and are now investing in replacing those systems despite the cost. So we’re hardly at the “stop and take a breath” stage in most cases.

That being said, it’s certainly time for providers to get out of whatever defensive crouch they’ve been in and get proactive. For example, it certainly would be great to leverage EHRs as tools for revenue cycle enhancement, rather than the absolute revenue drain they’ve been in the past. PwC’s suggestions certainly offer a useful look at where to go from here. That is, if providers’ efforts don’t get hijacked by MACRA.

Healthcare Data Quality and The Complexity of Healthcare Analytics

Posted on March 2, 2015 I Written By

John Lynn is the Founder of the HealthcareScene.com blog network.

The other day I had a really great chat with Khaled El Emam, PhD, CEO and Founder of Privacy Analytics. We had a wide-ranging discussion about healthcare data analytics and healthcare data privacy. These are two of the most important topics in the healthcare industry right now and no doubt will be extremely important topics at healthcare conferences happening all through the year.

In our discussion, Khaled talked about what I think are the three most important challenges with healthcare data:

  1. Data Integrity
  2. Data Security
  3. Data Quality

I thought this was a fantastic way to frame the discussion around data, and I think healthcare is lacking in all three areas. If we don’t get our heads around all three pillars of good data, we’ll never realize the benefits associated with healthcare data.

Khaled also commented to me that 80% of healthcare analytics today is simple analytics. That means that only 20% of our current analysis requires complex analytics. I’m sure he was just giving a ballpark number to illustrate the point that we’re still extremely early on in the application of analytics to healthcare.

One side of me says that maybe we’re lacking a bit of ambition when it comes to leveraging the very best analytics to benefit healthcare. However, I also realize that it means there’s still a lot of low hanging fruit out there that can benefit healthcare with even just simple analytics. Why should we go after the complex analytics when there’s still so much value to healthcare in simple analytics?
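To give a sense of what “simple analytics” means in practice, here’s a sketch of one classic low-hanging-fruit metric: a 30-day readmission rate computed straight from admission records. The patient records are fabricated, and the definition (fraction of admissions followed by another admission within 30 days) is simplified compared to formal CMS measures.

```python
from datetime import date

# Hypothetical admission records: (patient_id, admit_date)
admissions = [
    ("p1", date(2015, 1, 5)),
    ("p1", date(2015, 1, 25)),   # readmitted within 30 days
    ("p2", date(2015, 2, 1)),
    ("p3", date(2015, 3, 1)),
    ("p3", date(2015, 5, 1)),    # readmitted, but after 30 days
]

def readmission_rate_30d(records):
    """Share of admissions followed by a readmission within 30 days."""
    by_patient = {}
    for pid, d in records:
        by_patient.setdefault(pid, []).append(d)
    followed = 0
    for dates in by_patient.values():
        dates.sort()
        followed += sum(1 for a, b in zip(dates, dates[1:])
                        if (b - a).days <= 30)
    return followed / len(records)

print(readmission_rate_30d(admissions))  # → 0.2
```

Nothing here requires machine learning or a data warehouse, which is exactly the point: a sort, a group-by, and a date subtraction already yield an actionable quality metric.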

All of this is more of a framework for discussion around analytics. I’m sure I’ll be evaluating every healthcare analytics effort I see against the challenges of data integrity, security, and quality.