Michael Pfeffer, MD, FACP, offered Hospital Medicine 2018 attendees an example of how advances in health care technology may soon change their clinical conversations:
“EHR, I need a CT scan of the head with contrast.”
“Thank you, Doctor, but do you really need contrast with this? We don't think so.”
“Oh, OK, you're right. I'll get a CT scan of the head without contrast.”
“I think you're going to see a lot more of that in the next five years,” said Dr. Pfeffer, an associate clinical professor at the University of California, Los Angeles (UCLA), and assistant vice chancellor and chief information officer for UCLA Health Sciences, during his lecture, “The Future of IT for Hospitalists (and How to Leverage It).”
The paradigm change he predicts may sound worrisome to some, but he also offered reassurance. “One of the advantages about being a hospitalist is that tech companies are not focusing on using artificial intelligence [AI] to replace what we do. There is certainly a focus on using AI in the interpretation of radiology films and pathology slides. So that's, I think, a good thing,” Dr. Pfeffer said.
Hospitalist leader Robert M. Wachter, MD, FACP, offered a similar vision in his annual closing talk at the conference. “I think our jobs are safe,” he said, noting that his stepdaughter will soon be choosing a medical specialty. “If she says radiology, dermatology, or pathology, I'm going to give her a little talk about the future. Do you really think there will be human beings reading X-rays 20 years from now?”
Dr. Wachter's guess is no, but he does expect that “there will be people like us taking care of sick people in buildings called hospitals” for at least the next couple of decades. However, computers will change hospitalists' work in a number of significant ways, the experts agreed, describing what they see as the innovations most likely to enter mainstream practice.
Dr. Pfeffer's imagined conversation with a future version of a device similar to Amazon's Alexa, Microsoft's Cortana, or the Google Home captures two concepts the experts both expect to see more of: voice recognition and diagnosis guided by artificial intelligence.
Dr. Wachter, a professor and chair of the department of medicine at the University of California, San Francisco, offered a similar scene. “You and the patient have a conversation and [the computer] is not just audiotaping and transcribing the conversation—that's not helpful—it is taking it in and then contextualizing it, putting it into the right parts of the chart.”
The computer could then suggest diagnoses and comb the electronic health record (EHR) to provide the physician with relevant data. For example, if the patient's symptoms were consistent with heart failure, “the system is going back into the patient's medical record and pulling up a prior echocardiogram or a prior cath,” said Dr. Wachter. “It won't just be going through your [facility's] own medical record. It will be going through the medical record of the patient over time if they've been at other places.”
The idea of anyone—human or machine—being able to instantly review a complete medical record may sound incredible, but the long, slow road to interoperability is nearing its end, thanks to Fast Healthcare Interoperability Resources (FHIR, pronounced “fire”), the experts said.
“FHIR is a health interface standard that, when paired with application programming interfaces, can speak with electronic health records to allow patients to download an app that uses this technology and extract information out of electronic health records,” Dr. Pfeffer said. “As apps start to do this, which will be allowed due to government regulation, whether or not you have Open Notes, patients are going to be able to get information from your EHR in real time.”
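To make the mechanics concrete, here is a minimal sketch in Python of how an app might parse medication data returned by an EHR's FHIR endpoint. The patient bundle below is invented for illustration, but the resource shapes follow the FHIR MedicationRequest structure.

```python
# A tiny, hypothetical FHIR Bundle of the kind an app might receive from an
# EHR's API (e.g., a search like GET /MedicationRequest?patient=123).
bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {
            "resourceType": "MedicationRequest",
            "status": "active",
            "medicationCodeableConcept": {"text": "metoprolol 25 mg"},
        }},
        {"resource": {
            "resourceType": "MedicationRequest",
            "status": "stopped",
            "medicationCodeableConcept": {"text": "lisinopril 10 mg"},
        }},
    ],
}

def active_medications(bundle):
    """Pull the display text of each active MedicationRequest in a Bundle."""
    meds = []
    for entry in bundle.get("entry", []):
        res = entry["resource"]
        if res["resourceType"] == "MedicationRequest" and res["status"] == "active":
            meds.append(res["medicationCodeableConcept"]["text"])
    return meds

print(active_medications(bundle))  # → ['metoprolol 25 mg']
```

In practice the bundle would arrive as JSON over HTTPS, with the app authorized by the patient; the point is that a standard resource format lets one small function read data from any conforming EHR.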
HealthKit, an app available on iPhones, can already gather data from some systems, and drawbacks have emerged, he noted. “If a patient downloads their record from, say, three hospitals, how many medication lists are they going to have if they have been to three hospitals? Yeah, like four,” Dr. Pfeffer said. “I've already heard stories where such apps have extracted data from two records and one was for metoprolol 25 [mg] and one was for metoprolol 50 [mg], and the patient ended up taking 75 [mg] and landed in the emergency room.”
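The hazard Dr. Pfeffer describes can be sketched in a few lines of Python. The hospital lists and drug names below are made up; the point is that a naive merge presents both doses as current orders, while a simple reconciliation step would instead flag the conflict for human review.

```python
from collections import defaultdict

# Hypothetical medication lists downloaded from two hospitals' records.
hospital_a = [{"drug": "metoprolol", "dose_mg": 25}]
hospital_b = [{"drug": "metoprolol", "dose_mg": 50}]

# A naive merge simply concatenates the lists, so the same drug appears
# twice at different doses -- the scenario that sent the patient to the ER.
naive = hospital_a + hospital_b

def flag_conflicts(med_list):
    """Group by drug name; return drugs that appear with more than one dose."""
    doses = defaultdict(set)
    for med in med_list:
        doses[med["drug"]].add(med["dose_mg"])
    return {drug: sorted(d) for drug, d in doses.items() if len(d) > 1}

print(flag_conflicts(naive))  # → {'metoprolol': [25, 50]}
```

A real reconciliation system would also have to handle brand versus generic names, formulations, and dosing schedules, which is why merging records across institutions remains an unsolved problem.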
The health care system will have to figure out how to avoid such mistakes, as technology gradually gives patients more access to and control over the chart, he noted. “We're going to have to learn how to deal with this,” said Dr. Pfeffer. “Eventually the patient will grant us access to the record and there will be only one medical record for the patient.”
EHRs will likely also include different forms of data collected with technologies like consumer device integration and vendor-neutral archives. “This is the idea that you can have anything related to the patient that's not text all in one place: endoscopy videos, dermatologic photos, echocardiograms, exercise data, food purchases,” Dr. Pfeffer explained.
Computer systems may gather data from even farther afield in their efforts to assist with diagnosis, according to Dr. Wachter. Until the recent news about Facebook data being used for political purposes, he would have expected social media to be a source. “I think we've seen that take a pretty big step back in the last week or two, probably appropriately until we sort all this stuff out,” he said.
However, artificial intelligence systems will likely base their findings on both the conclusions of existing medical research and the raw data of an entire electronic health record. Dr. Wachter described how the latter would work: “We're sifting through the 10,000 patients who have been seen at your institution in the past year, and those of them that had these symptoms, signs, and lab tests, turned out to have this diagnosis.”
Computers may also take on some hands-on medicine. Dr. Pfeffer predicted a continued rise in hospitalists' use of handheld ultrasounds. “But what's going to happen I think in five years is that these things will automatically tell you what the answers are....Similar to the automated external defibrillator, it'll say put the probe here, and then using artificial intelligence and validated algorithms, it'll tell you what the ejection fraction is, how distended the inferior vena cava is, and other things about the patient, transforming the physical exam.”
Computers will also tell you when you're wrong, perhaps with a polite statement like “It's very unusual to see a patient with a diagnosis of GI bleeding still on an anticoagulant,” Dr. Wachter said.
They'll also watch for cases where the diagnosis itself may be incorrect, he added. “The computer will be able to say here's what the expected trajectory looks like and tell you the patient is veering off that trajectory: ‘You thought the patient had cellulitis and pneumonia. Patients like this are usually afebrile by now.’”
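The trajectory check Dr. Wachter imagines amounts to a rule comparing a patient's course against the expected one. Here is a toy sketch in Python; the diagnoses, day counts, and fever threshold are invented for the example, and a real system would learn expected trajectories from data rather than a lookup table.

```python
# Illustrative only: a toy rule of the kind described in the talk, checking
# whether a patient is following the expected course for a working diagnosis.
def off_trajectory(diagnosis, hospital_day, temp_c):
    """Return an alert string if the patient deviates from the expected course,
    or None if the course looks typical."""
    # Hypothetical expectation: day by which fever usually resolves.
    expected_afebrile_by = {"cellulitis": 3, "pneumonia": 3}
    day_limit = expected_afebrile_by.get(diagnosis)
    if day_limit is not None and hospital_day > day_limit and temp_c >= 38.0:
        return (f"Patients with {diagnosis} are usually afebrile by day "
                f"{day_limit}; consider revisiting the diagnosis.")
    return None

print(off_trajectory("cellulitis", hospital_day=4, temp_c=38.6))
```

Even this crude version captures the shape of the alert: it fires not on a single abnormal value but on a mismatch between the diagnosis and the observed course.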
Things to go away
Although that prospect may sound like yet more unwanted alerts on the way, both speakers had optimistic predictions on that front. The secure, EHR-integrated texting programs that are replacing pagers will include alerts, but they will be “smart,” said Dr. Pfeffer. “So you don't just get alerts for every lab that comes back, but you get alerts for the lab that you want.”
Patients will have their alarm fatigue reduced as well, when medical devices join the “internet of things,” becoming as internet-connected as new household appliances. “The number one patient complaint in our hospital: pumps beeping in the middle of the night. There's no reason why pumps have to beep; they could simply alert someone to come into the room,” Dr. Pfeffer said.
Dr. Wachter envisions the creation of a whole new job dealing with the rising number of alerts, alarms, and predictions: a “care traffic controller.” These controllers—who could be doctors, nurses, or some other type of clinician—might watch over as many as 100 patients at the same time.
He described how this might work: “It looks like for patient 14, the risk of a bad outcome just increased, and I think I'll investigate that. Shall I call the rapid response team, or should I speak to the primary team, or do a little chart review?”
Both speakers also identified some medical careers, besides radiologist and pathologist, that they expect to be replaced by technology in the future. Dr. Wachter predicted that medical scribe, despite recently being a rapidly growing occupation, is set to become a shrinking one. “I think scribes are a transitional and very expensive solution,” he said. “I think this function gets replaced by artificial intelligence.”
Patient throughput and capacity management were on Dr. Pfeffer's radar for significant technology advances. “There's no reason why we need a whole army of people to do this,” he said. “I think this is going to be the first use of artificial intelligence that we're going to see in full play in the hospital...You're just going to have a computer system that basically puts the patients in the proper location, which as hospitalists I think will be great.”
Whether all these predicted technological changes seem great or terrible to you, Dr. Pfeffer also offered a reason not to be too certain about their imminent arrival—the Hype Cycle for Healthcare Providers, developed annually by Gartner, a research and advisory company. “It's like the five stages of grief for technology in health care,” he said. The first stage is an innovation trigger (in other words, a great idea for an invention to fix a health care problem).
“Then you have the peak of inflated expectations—not only is it going to do all that, but it's going to cure cancer,” said Dr. Pfeffer. The next step is the trough of disillusionment, followed by the slope of enlightenment, and finally the plateau of productivity. “This is where it actually works. Not all the dots make it all the way to here,” he said.