By AI Trends Staff
AI is being increasingly incorporated by doctors to transcribe, read, analyze, and make predictions based on notes and conversations between physicians and their patients. This opens up new possibilities for care and new concerns about privacy, according to a recent account from Axios.
A big and largely invisible contribution AI can make is to capture a physician’s written or spoken notes automatically. Spending hours manually entering data into electronic health records (EHRs) adds to the burden on medical professionals already close to burnout.
A recent study from researchers at the University of New Mexico, outlined in EHR Intelligence, found that 13% of the stress and burnout self-reported by physicians was directly attributable to EHRs. Philip Kroth, MD, director of Biomedical Informatics Research at UNM, found that 40% of clinician stress is related to clinical process design and structure, both of which are closely tied to EHRs.
“We are losing the equivalent of seven graduating classes of physicians yearly to burnout and, as they leave the profession, they point their finger at the time now required for them to document their work and how it has led to the loss of quality time spent with patients and families,” Kroth stated. “In many ways, physicians are finding that the goals of a traditional medical record have been hijacked.”
The UNM researchers worked with others from Stanford University, University of Minnesota, Hennepin County Medical Center, and the Centura Health System in Colorado and Texas to survey 282 clinicians on the impact of EHR completion on stress and burnout.
Ten Years into EHRs, Time Spent on Medical Record-Keeping Has Doubled
The promise of the Health Information Technology for Economic and Clinical Health (HITECH) Act, signed in 2009, was to make paperwork easier. Ten years later, the survey shows that the time devoted to medical record-keeping has doubled. Physicians now spend two minutes at the computer for every one minute spent with patients, and the workday has extended into physicians’ home lives.
“It often takes a 60-hour week just to keep up with documentation, and that is tough on personal relationships and families,” Kroth stated. That opens the door for AI solutions that bring more automation to the process.
Real value comes from data captured from the conversations doctors have with patients or from their case notes. Some AI products can “extract information and then contextualize it” in ways that doctors can act on, stated Duane A. Mitchell, director of the University of Florida Clinical and Translational Science Institute, in the Axios account.
One example of value is the time needed to identify the right set of patients to enroll in clinical trials. Manually extracting that information from databases takes weeks, while AI models can do the work “within minutes,” stated Mona Flores, Global Head of Medical AI at Nvidia.
GatorTron Model Produced at U of Florida in Partnership with Nvidia
Researchers with the University of Florida’s academic health center announced on April 8 that they have collaborated with Nvidia researchers to create GatorTron, an AI natural language processing model that can extract insights from massive volumes of clinical data—very quickly.
The GatorTron language model is the first step in the $100 million public-private partnership announced last year by UF and Nvidia to pursue research in AI and supercomputing, according to a press release on the UF Health website.
For the creation of GatorTron, UF Health supplied 10 years of anonymized data covering more than two million patients and 50 million patient interactions across an array of medical specialties, including oncology, internal medicine, and critical care. Security controls to protect the privacy of patients’ data during the development of GatorTron were approved by the UF Institutional Review Board and UF Health Information Technology.
“GatorTron is an example of the discoveries that happen when experts in academia and industry collaborate using leading-edge AI and world-class computing resources,” stated David R. Nelson, M.D., senior vice president for health affairs at UF and president of UF Health. “Our partnership with Nvidia is crucial to UF emerging as a destination for AI expertise and development in health research.”
In other recent AI and healthcare news, Microsoft has announced the acquisition of Nuance Communications, specialists in voice recognition software technology, with products that can transcribe and analyze voice conversations between doctors and patients.
Nuance last year announced Dragon Ambient eXperience (DAX), a product that works in tandem with EHR systems to capture and put in context physician-patient conversations, according to a press release from Nuance.
Nuance DAX is said to leverage Dragon Medical, used by over 500,000 physicians worldwide, to create a voice-enabled exam room environment. Nuance worked with Microsoft to incorporate Microsoft’s cloud capabilities in the offering.
SSM Health, a Catholic non-profit integrated health system serving communities throughout the Midwest, planned to pilot this technology in some of its specialty clinics in St. Louis, Mo., Oklahoma, and Wisconsin. “With the Nuance Dragon Ambient eXperience solution, our providers can spend more time with their patients and less time on administrative tasks,” stated Ann Cappellari, MD, Vice President and Chief Medical Information Officer, SSM Health. “This helps providers and patients communicate more clearly and build stronger relationships.”
Mayo Clinic’s RDMP Aims to Connect Remote Health Devices to AI Support
The nonprofit Mayo Clinic in Rochester, Minn. on April 14 announced the Remote Diagnostic and Management Platform (RDMP), aimed at helping healthcare providers improve their use of connected health devices for programs including remote patient monitoring.
RDMP connects devices to AI resources that would help providers with clinical decision support and diagnoses in what the Minnesota-based health system calls “event-driven medicine,” according to an account in mHealth Intelligence.
“The dramatically increased use of remote patient telemetry devices coupled with the rapidly accelerating development of AI and machine learning algorithms has the potential to revolutionize diagnostic medicine,” stated John Halamka, MD, president of the Mayo Clinic Platform. “With RDMP, clinicians will have access to best-in-class algorithms and care protocols and will be able to serve more patients effectively in remote care settings. The platform will also enable patients to take more control of their health and make better decisions based on insights delivered directly to them.”
Change is also coming as AI is applied to the delivery of mental health services. “Clinical psychiatry occurs in very much the same way as it did 100 years ago, where a clinician will sit down and talk to a patient and based on that conversation, develop a treatment plan,” wrote the psychiatrist Daniel Barron in his forthcoming book, “Reading Our Minds: The Rise of Big Data Psychiatry,” as described in the Axios account.
Barron envisions a future instead in which those conversations can be recorded by AI models that can analyze patient speech and even facial expressions for clues about mental illness and how to treat it. This of course would require a patient comfortable with the idea of the AI listening in and analyzing the conversation with the therapist—which could be a stretch for many.
Read the source articles and information from Axios, from EHR Intelligence, from a press release on the UF Health website, from a press release from Nuance, and from an account in mHealth Intelligence.