Imagine entering the front door of your hospital to start your busy workday as a hospitalist. The smart hospital, with its smart technology, recognizes you by “reading” your name badge via radio frequency identification.
Your smartphone sends you a real-time list of the patients who were admitted overnight—prioritizing those who are sickest, most urgent, and located closest to where you are now, with directions for how to get to their rooms. You don’t need to stop at a computer screen, because your phone offers to help you review essential data in the prioritized patient’s chart.
When you walk into the patient’s room, the system again recognizes you, flashing your name, picture, job title, and role on the “foot wall,” a 75-inch screen mounted on the wall at the foot of the patient’s bed. This is the same smart TV screen used to show the patient high-resolution test results such as X-rays, as well as offering them email, television, video games, and room environmental controls.
Are any of your patients trending toward sepsis or a CPR code? Have automated triggers identified those whose blood sugar levels are trending unsafely? Who is most at risk for placement in intensive care, or for readmission? Which patients in the emergency department are most likely to be admitted for a worsening COVID-19 infection? Subtle changes, which can be picked up by the machines faster than any doctor, could help to prioritize your interventions.
For Subha Airan-Javia, MD, FAMIA, a hospitalist with the Penn Health System in Philadelphia, these kinds of connections are not far in the future—at least in terms of the technology, much of which already exists. She is also the founder and CEO of CareAlign, a company that is helping to build the technological infrastructure to facilitate interactions between hospital clinicians and their smart technology.
There’s a lot of excitement about what artificial intelligence (AI) can do to transform health care, improving outcomes and the patient experience. “But a lot of legwork needs to come before AI can reach its promise,” Dr. Airan-Javia said. How soon—and how—will these transformations trickle down to hospitalists on the floor?
What is AI?
AI in medical settings is commonly defined as computer applications of cognitive technologies that can mimic human cognition—the ability to learn, reason, and make decisions in ways that resemble humans. AI is not one technology but a collection of technologies, and its medical targets include virtual assistants, decision support, imaging interpretation, remote monitoring, and predictive analytics.
Often people blur the distinctions between AI, machine learning—which refers to algorithms that can learn without being explicitly programmed—and pattern recognition, Dr. Airan-Javia said. “A lot of what we see now is really more machine learning and pattern recognition, less AI. To me, AI refers to actual computer intelligence, where it takes large amounts of inputs and turns them into new insights beyond what it was programmed to do.” She added that such distinctions may not be important to the working hospitalist.
Meanwhile, virtually every hospitalist in the country still carries a piece of paper on the job with a list of patients and tasks for the day, she said. “It’s funny, we want AI to come in and do all these different things. But we’re still analog in the ways we do much of our work.”
A key to overcoming the hospitalist’s disconnect from the computerized patient record is to digitize and integrate work that is now done largely on paper. Natural language processing can turn charting notes and other texts into data that could be combined with other data in the medical record, she said.
“But that’s just the beginning. What about electronic and paper sticky notes, faxes, emails, text messages, in-basket messages, Excel files—all the different ways clinicians try to keep track of what they’re doing, things that are not part of the medical record, and none of them are integrated?” Dr. Airan-Javia said. How can structured data be created out of these unstructured data, with the clinician only having to type it once—or not at all?
“Then let’s broaden our scope to think about all the ways this data can reach the clinician beyond the scope of the electronic health record (EHR)—all the ways we can bring that information and knowledge to the clinician’s workflow in real-time.” Hospitalists now go to a desk with a computer screen to interface with the EHR, but what about technology like the Google Glass brand of smart glasses, virtual reality devices, the Apple Watch, or the frequent advances in iPhones?
“Not that I see doctors walking around the hospital with virtual reality headsets anytime soon, but I think we’ll see innovations in the ways we interface with the various technologies,” she said. “We shouldn’t have to stop at the computer station for this information.”
Dr. Airan-Javia’s company, CareAlign, is focused on how to design technology to make these interfaces work better and correct inefficient workflows. It’s not technically AI, she said, but a step in that direction. The product includes a collaborative team task-management platform, letting everyone on the team know what needs to get done for the patient, and by whom, and tools focused on clinical workflow, including data visualization.
Penn Medicine already has AI-based innovation in its new 504-bed, 17-story pavilion, a “hospital of the future” with the latest technology, which opened in November 2021 after six years of intensive planning. It includes the interactive, foot-wall TV screen and care system mounted in patient rooms, along with other advanced technologies such as hybrid operating rooms equipped with MRIs and extensive use of health care robots.
Sara Murray, MD, MAS, associate professor of clinical medicine in the division of hospital medicine at the University of California San Francisco and associate chief medical information officer for inpatient care and data science at UCSF Health, said hospitalists may not even see or realize all the ways AI influences their work.
“A lot of effort at UCSF is now going into optimizing clinical operations, leveraging AI to improve patient flow through the hospital, optimize room turnover, and the like,” she said. The goal is to improve the experience for patients and reduce the burden on providers. In an ideal world, doctors can focus on what they do best, which is to understand the complex nuances specific to the individual patient and communicate that to the patient to achieve shared decision-making. The technology is meant to augment humans, not make decisions for them, she said.
UCSF, like many health care organizations, is setting up a command center to bring together all the staff involved in transfer management, bed control, patient flow, capacity management, length of stay, and throughput, she said. Members of this team will all be in the same place, working from the same AI-embedded information, including predictions for when beds might be opening up.
Many health systems have algorithms to predict the clinical deterioration of hospitalized patients, Dr. Murray said. A lot of those algorithms aren’t particularly sensitive or specific. The problem is when the system flags numerous patients who aren’t deteriorating and misses some who are. “We have to be smart about who we are alerting, what we are asking them to do with that information, and whether the algorithm is going to identify enough patients correctly while avoiding too many false positives.”
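The tradeoff Dr. Murray describes is, at bottom, a threshold choice. A minimal Python sketch (using made-up toy risk scores, not any real hospital's model) shows how raising the alert threshold reduces false alarms at the cost of sensitivity—missing some patients who do deteriorate:

```python
def alert_metrics(scores, deteriorated, threshold):
    """Count alert outcomes at a given risk-score threshold."""
    tp = sum(1 for s, y in zip(scores, deteriorated) if s >= threshold and y)
    fp = sum(1 for s, y in zip(scores, deteriorated) if s >= threshold and not y)
    fn = sum(1 for s, y in zip(scores, deteriorated) if s < threshold and y)
    tn = sum(1 for s, y in zip(scores, deteriorated) if s < threshold and not y)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity, fp  # fp = alerts that were false alarms

# Toy risk scores for 8 patients; True = the patient actually deteriorated.
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
labels = [True, False, True, False, False, True, False, False]

for t in (0.3, 0.6):
    sens, spec, fps = alert_metrics(scores, labels, t)
    print(f"threshold={t}: sensitivity={sens:.2f}, "
          f"specificity={spec:.2f}, false alarms={fps}")
```

At the low threshold every deteriorating patient is caught but half the alerts are false alarms; at the high threshold false alarms drop but one deteriorating patient is missed—exactly the balancing act Dr. Murray says health systems must be smart about.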
How a particular clinician delivers care—how they interact with their patients—can provide the basis for new tools like intelligent scribes. These scribes capture the doctor-patient conversation in natural language and know which part of the conversation belongs to the clinical note versus the part that’s just about connecting with the patient, said Nasim Afsar, MD, MBA, MHM, chief health officer at Oracle Health in Los Angeles. “As AI becomes more sophisticated, it learns how you work, rather than asking you to learn to speak the computer’s language.”
Machine learning can augment the clinician’s diagnosis by looking at tens of thousands of other patients and millions of data points, Dr. Afsar said. She believes the EHR and AI will be seamlessly combined in the future. “Machine learning will unlock the potential of big data and make it work for us and be more predictive.”
Dr. Murray agreed that AI has the potential to assist workflow—streamlining chart review so physicians can more quickly have the most important information in front of them. “But I’m never going to give up my single piece of paper. I just think there’s something nice about rounding with a pen in hand.”
The clinician’s participation
For Jonathan Chen, MD, PhD, a hospitalist and physician data scientist at Stanford Health Care in Stanford, Calif., AI has great potential to assist human cognition while reducing the demands on human diagnosticians—although current AI tools have not yet realized this potential. “AI can be beneficial when treating 20 hospitalized patients. Who needs my attention the most? What if that prioritizing were more objective?” Dr. Chen co-authored a recent Journal of the American Medical Association article about how AI-trained systems learning from experience can become more important tools in diagnosing patients’ health problems.1
Too often today, he said, the EHR casts the physician as the highest-paid data entry clerk in the health system, spending hours entering—and re-entering—information into the record. If frontline clinicians don’t want to get beaten up by the new applications of AI, as many feel they were by the rollout of the EHR, they will want to get involved in planning for its implementation.
His colleague Ron Li, MD, a clinical assistant professor of hospital medicine and medical informatics director for digital health at Stanford, described using AI to employ collaborative team workflows for improving advance care planning and for decreasing unplanned care escalations in a 2022 New England Journal of Medicine Catalyst article.2 Who, in the next 6 to 18 hours, is most likely to go to the ICU? If AI concludes that the patient is at risk, then the system nudges the patient’s nurse and physician to talk to each other. “We’ve aligned the system to send the alert to the doctor and the nurse at the same time.”
Dr. Li, who co-founded and directs the Stanford Emerging Applications Lab, said he works alongside more technologically minded people, trying to represent the clinician’s voice in the design of technology products—not just for AI but for digital health more broadly.
EHR systems too often were developed without clinician participation, and without enough attention to the clinician’s functionality needs, he said. “That is a risk for AI, too, although it has the capability to become ever more useful. I want to turn the tables on this process so that the clinicians are driving the technology, rather than the other way around. How can we elevate the clinician’s voice so AI becomes their tool?”
Larry Beresford is an Oakland, Calif.-based freelance medical journalist, specializing in hospice and palliative care, and a long-time contributor to The Hospitalist.
- Chen JH, et al. Decoding artificial intelligence to achieve diagnostic excellence: Learning from experts, examples, and experience. JAMA. 2022;328(8):709-10.
- Li RC, et al. Using AI to empower collaborative team workflows: Two implementations for advance care planning and care escalation. NEJM Catalyst. 2022;3(4). https://doi.org/10.1056/CAT.21.0457. Published online March 16, 2022. Accessed December 2, 2022.
This article was originally published by The Hospitalist.