Two weeks ago, a study led by Harvey J. Murff and colleagues was published in JAMA that validates natural language processing (NLP) technologies as a powerful tool to unlock meaning from data in electronic health records (EHRs). However, the use of language understanding technologies such as NLP in healthcare is not limited to the information queries examined in the study. In fact, the results of the Murff/JAMA study set the stage for a broad spectrum of healthcare scenarios in which we can apply understanding and intelligence technologies to improve patient care, reimbursement and efficiency.
We Can't Understand What We Don't Know
Things like tracking trends in patient care or shaping treatment decisions through better, real-time information cannot happen without meaningful data to analyze and drive informed decisions. A fundamental component of NLP's success in healthcare is therefore high-quality documentation. Today, 80 percent of clinical documentation is unstructured - free-form text buried within EHRs. Locked within this free-form text is an extraordinary amount of key clinical data - valuable information that can and should be leveraged to make better clinical decisions. To date, unfortunately, the healthcare industry has struggled to unlock meaning from it without intensive, manual analysis.
The Challenge of EHR Documentation
Documentation within an EHR is not as simple as one might think. As I referenced in my last blog, many doctors work to "limit the pain" of the EHR. Doctors can document easily and quickly by speaking, but speaking alone creates "a text blob" that traps information within the EHR, as referenced above. As an alternative, EHR systems include documentation templates: through point-and-click and pull-down menu options, doctors can create structured documentation. This is easier to analyze and pull facts from, but it has proven to be an unnatural means of documentation for doctors and does not capture the nuances of each unique patient story. In fact, in 2009, 96 percent of 1,000 surveyed physicians said they were "concerned" about "losing the unique patient story with the transition to point-and-click (template-driven) EHRs."
How NLP Can Help
NLP applied to the medical domain is called Clinical Language Understanding or CLU. The difference between NLP and CLU is that CLU works off of a complete, highly granular medical ontology, which has been tuned to relate and identify all kinds of medical facts so that the underlying NLP engine can "understand" what the caregiver is saying. For example, CLU knows that "cancer" is a "disease."
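To make the ontology idea concrete, here is a minimal sketch of an "is-a" lookup. The hierarchy below is a tiny illustrative example, not a real medical ontology (production systems use comprehensive vocabularies), and the function names are my own:

```python
# Hypothetical sketch of an is-a ontology lookup. The hierarchy below is a
# simplified illustration, not an actual clinical vocabulary.

# Each term maps to its parent concept.
IS_A = {
    "amoxicillin": "antibiotic",
    "antibiotic": "medication",
    "carcinoma": "cancer",
    "cancer": "disease",
}

def is_a(term: str, concept: str) -> bool:
    """Walk the parent chain to test whether `term` is a kind of `concept`."""
    while term in IS_A:
        term = IS_A[term]
        if term == concept:
            return True
    return False

print(is_a("cancer", "disease"))     # True
print(is_a("carcinoma", "disease"))  # True, via the "cancer" parent
```

Chaining parents this way is what lets an engine "understand" that a specific finding belongs to a broader clinical category.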
In order for physicians to qualify for government incentive payments associated with adopting and using EHRs, they must capture specified facts, including things such as allergies and vital signs. These facts are often easy for a physician to capture through a narrative description (via voice), but can prove difficult and time consuming to capture via an EHR system template. For example, saying a patient is taking a certain medicine is simpler than finding the associated prescription pull-down menu and selecting the corresponding drug, dosage, route and frequency with several clicks of a mouse. The EHR documentation conundrum thereby becomes a double-edged sword: doctors can document easily and quickly by speaking, but "the text blob" that speaking creates traps information, rendering it unusable because data outside of a structured format is not actionable. Beyond the mechanics of entering data within structured EHR formats, a purely structured representation of the patient story falls short of what a care team requires to deliver optimal care. Natural speech-driven documentation, combined with CLU, gives physicians a means to tell a complete patient story with all its subtleties while making available all of the clinical facts the EHR needs to operate in an optimal way. Based on what a doctor says, the CLU engine can understand and auto-populate the EHR with that information; if a doctor says "amoxicillin," CLU knows "amoxicillin" is an "antibiotic" and would input that information into the proper place within the EHR. CLU allows doctors to be efficient with documentation, helps ensure patients' medical records are comprehensive rather than reduced to purely structured content created by point-and-click templates, and supports healthcare organizations in complying with government regulations, including the HITECH Act, so that care can be optimized and reimbursement maximized. CLU delivers the best of both worlds.
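As a rough sketch of the auto-population idea - not Nuance's actual engine, and with field names, drug classes and the phrase pattern all assumed for illustration - a dictated medication mention could be turned into a structured EHR entry like this:

```python
import re

# Hypothetical sketch: extract a medication fact from free-text dictation
# and route it to a structured EHR field. The drug-class table, the field
# name, and the dose pattern are illustrative assumptions, not a real API.

DRUG_CLASS = {"amoxicillin": "antibiotic", "lisinopril": "ace inhibitor"}

def extract_medication(narrative: str) -> dict:
    """Return a structured medication entry if a known drug is mentioned."""
    pattern = r"\b([a-z]+)\s+(\d+)\s*mg\b"  # e.g. "amoxicillin 500 mg"
    for name, dose in re.findall(pattern, narrative.lower()):
        if name in DRUG_CLASS:
            return {"field": "medications",      # target EHR section
                    "drug": name,
                    "class": DRUG_CLASS[name],
                    "dose_mg": int(dose)}
    return {}

note = "Started the patient on amoxicillin 500 mg three times daily."
print(extract_medication(note))
```

The point of the sketch is the direction of flow: the physician speaks naturally, and the structured fields are derived from the narrative rather than clicked into place.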
To get a better idea of how CLU works, you can watch this video:
CLU's Direct Impact on Patient Care
With the JAMA study, NLP was applied retrospectively and used to query data for broad patient analysis. In this scenario, it is far more difficult to exploit real-time opportunities to impact patient outcomes because analysis occurs after the patient has left. With advancements taking place today, CLU solutions will move toward decision support that provides immediate feedback to physicians at the point of dictation. For example, if a doctor is documenting a prescription for a patient within the EHR and CLU technology is running in the background, the system might notify the doctor that the patient could have an adverse reaction to that drug and recommend an alternative. This is one of many examples.
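One way to picture that background check - purely a hypothetical sketch, with the allergy cross-reference data invented for illustration - is a rule that compares a newly dictated drug against the patient's recorded allergies:

```python
# Hypothetical sketch of point-of-dictation decision support: check a newly
# dictated drug against recorded allergies. The cross-reference table is
# illustrative only and not clinical guidance.

ALLERGY_CLASS = {"penicillin": {"amoxicillin", "ampicillin"}}

def check_prescription(drug: str, patient_allergies: set) -> str:
    """Return an alert if the drug conflicts with a recorded allergy."""
    for allergy in patient_allergies:
        if drug in ALLERGY_CLASS.get(allergy, set()):
            return (f"ALERT: {drug} may trigger the patient's {allergy} "
                    "allergy; consider an alternative.")
    return "No interaction found."

print(check_prescription("amoxicillin", {"penicillin"}))
```

Because the check runs while the doctor is still dictating, the feedback arrives in time to change the prescription rather than weeks later in a chart review.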
Removing Pain from Billing
Today, if a doctor is vague with documentation, they might get a phone call three weeks later from a medical coder who is trying to code their documentation for billing purposes. Chances are the doctor won't fully remember the extra detail that should have initially been captured, and the exchange will be burdensome and ineffective. By applying NLP to the documentation process, CLU can scan and understand what the doctor is saying and ask for added specificity or severity when necessary. For example, if a doctor says a patient had a "fracture of forearm," did they mean the lower forearm, the right or left forearm, and what was the severity? By prompting the physician while the details are fresh in his/her mind, the end document will be more complete, which results in improved care, better cross-care communication and more accurate billing - and eliminates that phone call three weeks down the road. Likewise, for the medical coder, CLU can be used to scan and understand electronic medical records and help auto-code information based on what is documented. For example, what was once dictated as "fracture of forearm," appropriately elaborated to become "torus fracture of lower end of right radius," would be coded "S52.521" under the ICD-10 standard.
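The prompt-or-code decision above can be sketched in a few lines. Only the S52.521 code and the two diagnosis phrases come from the example in this post; the prompt wording and lookup structure are my own illustrative assumptions:

```python
# Hypothetical sketch: flag an under-specified diagnosis for more detail,
# or auto-code a fully specified one. Only the "S52.521" mapping mirrors
# the example in the post; everything else is illustrative.

ICD10 = {"torus fracture of lower end of right radius": "S52.521"}
NEEDS_DETAIL = {"fracture of forearm": ["Which bone (radius/ulna)?",
                                        "Which side (right/left)?",
                                        "Upper or lower end?",
                                        "Fracture type (e.g. torus)?"]}

def code_or_prompt(diagnosis: str):
    """Return an ICD-10 code if possible; otherwise, questions to ask."""
    d = diagnosis.lower()
    if d in ICD10:
        return ("code", ICD10[d])
    if d in NEEDS_DETAIL:
        return ("prompt", NEEDS_DETAIL[d])
    return ("unknown", None)

print(code_or_prompt("fracture of forearm"))
print(code_or_prompt("Torus fracture of lower end of right radius"))
```

The vague phrase triggers clarifying questions at dictation time; once the physician answers them, the same lookup yields the billable code directly.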
NLP to Evolve Beyond a Query Tool
Clinical language understanding innovation has the ability to analyze the meaning and context of human language and quickly process information to find precise answers that help decision makers, such as physicians, unlock important knowledge and facts buried within huge volumes of information. In time, CLU will become a well-known, primary component of the point-of-care process, providing doctors with real-time information about the patient whose record they are documenting and guiding them to include the most thorough and accurate patient information as they dictate their notes.
i Murff HJ, FitzHenry F, Matheny ME, et al. Automated identification of postoperative complications within an electronic medical record using natural language processing. JAMA. 2011;306(8):848-855.
ii Results of Physician Study, http://www.nuance.com/healthcare/physician-study/, 2009.
Follow Janet Dillione on Twitter: www.twitter.com/NuanceHealth