At first it was a novelty: Hospitals began using voice assistants to allow patients to order lunch, check medication regimens, and get on-demand medical advice at home.
But these devices, manufactured by Amazon, Google, Apple, Microsoft and others, are now making deeper inroads into patient care. Hospitals are exploring new uses in intensive care units and surgical recovery rooms, and contemplating a future in which Alexa, or another voice avatar, becomes a virtual member of the medical team — monitoring doctor-patient interactions, suggesting treatment approaches, or even alerting caregivers to voice changes that could be an early warning of a health emergency.
“Why not have a connected speaker in the room listening to conversations?” asked John Brownstein, chief innovation officer at Boston Children’s Hospital, which is piloting dozens of voice applications. Voice technology remains at the edges of patient care, he added, but the hospital is already using it to improve the efficiency of ICU care and help prepare doctors for transplant surgeries.
In New York, Northwell Health is preparing to put Alexa in private rooms next month to allow patients to tap into their medical records. And Mayo Clinic is using voice to deliver wound care instructions to some surgical patients and is studying the technology’s ability to diagnose cardiovascular disease and other conditions.
Underlying that work is an increasingly fierce competition for health care dollars among giant technology companies and scores of startups that are investing heavily in voice-enabled products and services. Clinicians are waiting to see which of the largest companies will be the first to introduce a smart speaker that fully complies with health care privacy laws, a step that would allow them to delve even deeper into patient care.
With artificially intelligent voice assistants rapidly becoming pervasive in the consumer marketplace, many people are familiar with the devices and may soon expect the same convenience in health care settings.
More than 100 million Alexa-enabled devices have been sold, while Google Assistant, which gets a big boost from its use in Android smartphones, was expected to be in 1 billion devices by the end of January, according to a company blog post. A recent study by Nielsen reported that nearly 25 percent of U.S. households own a smart speaker, a rapid uptick during the past year.
“We believe that the technology that exists in patients’ homes will be a demand that patients will have sooner than later,” said Dr. Vishwanath Anantraman, chief innovation architect at Northwell Health, adding that the hospital is planning to introduce several new uses for voice technology and bots, which run automated tasks over the internet, during the next few months.
“Voice tech can help improve service requests and deliver real-time analytics to the staff to ensure patient satisfaction and patient safety,” he said. For clinicians, the hospital system is focusing on retrieving complex information instantly on voice and mobile devices.
Nathan Treloar, president and co-founder of Orbita, a Boston-based company working to create voice-enabled services in health care, said he is starting to see a shift in the way hospitals are using voice. Most began with simple consumer-facing services, such as voice programs that educate patients on the symptoms of diseases, he said. But now they are embedding them into the process of delivering care.
One of the emerging uses, Treloar said, is deploying a voice assistant as an alternative to the nurse’s call button, so that a patient can ask for help with specific problems and get more timely service.
“The virtual assistant can collect enough context about the patient’s need so it can be properly prioritized,” Treloar said, adding that such uses are likely to expand rapidly with the introduction of devices designed to comply with U.S. patient privacy laws.
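As a rough illustration of the prioritization idea Treloar describes, the sketch below (in Python) tags each voice request with an intent and sorts the queue by urgency. The intents, priority tiers, and class names are hypothetical examples for the sketch, not Orbita's actual taxonomy or software.

    # Minimal sketch: a voice request carries an intent, and the intent
    # determines how urgently staff are paged. Purely illustrative.
    from dataclasses import dataclass

    PRIORITY_BY_INTENT = {
        "chest_pain": 1,        # route immediately to nursing staff
        "pain_medication": 2,   # time-sensitive assistance
        "help_to_bathroom": 2,
        "extra_blanket": 3,     # comfort request, lower urgency
    }

    @dataclass
    class PatientRequest:
        room: str
        intent: str
        transcript: str

        @property
        def priority(self) -> int:
            # Unknown intents default to the middle tier.
            return PRIORITY_BY_INTENT.get(self.intent, 2)

    def triage(requests: list[PatientRequest]) -> list[PatientRequest]:
        """Order the queue so the most urgent requests reach staff first."""
        return sorted(requests, key=lambda r: r.priority)

    queue = triage([
        PatientRequest("412", "extra_blanket", "Can I get another blanket?"),
        PatientRequest("407", "chest_pain", "My chest hurts."),
    ])
    print([(r.room, r.intent, r.priority) for r in queue])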
Several startups have already created HIPAA-compliant voice software for use with electronic medical records systems. Sopris Health, a Denver-based company, developed a product designed to automatically convert a doctor-patient conversation into text that is then loaded into a doctor’s note. Other competitors in the field include Suki, Notable, Nuance, and Seattle-based SayKara, which is led by former Amazon engineers.
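As a rough illustration of the general pattern behind these ambient-documentation tools, the sketch below transcribes a recorded visit and drops the text into a draft note for physician review. The open-source speech_recognition library and Google's free web speech API stand in for the vendors' proprietary pipelines, and the file name and note template are invented for the example.

    # Minimal sketch of the ambient-documentation pattern: transcribe a
    # recorded visit, then place the transcript into a draft clinical note
    # that a physician reviews and edits. Illustrative only.
    import speech_recognition as sr

    NOTE_TEMPLATE = """DRAFT NOTE (requires physician review)
    Visit transcript:
    {transcript}
    """

    def draft_note(audio_path: str) -> str:
        recognizer = sr.Recognizer()
        with sr.AudioFile(audio_path) as source:   # expects a WAV/FLAC recording
            audio = recognizer.record(source)
        transcript = recognizer.recognize_google(audio)  # cloud speech-to-text
        return NOTE_TEMPLATE.format(transcript=transcript)

    if __name__ == "__main__":
        print(draft_note("visit_recording.wav"))  # hypothetical file name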
Health care technology specialists said these types of products could make electronic health records easier to use and may help ease documentation burdens. But they are unlikely to result in a revolution in record-keeping any time soon.
“I do not think in the near, medium, or long future the EMR is going to be replaced with a voice-enabled application,” said Darren Dworkin, chief information officer at Cedars-Sinai, a Los Angeles-based health system. He added: “Like many technologies before this, the important part will be that we don’t get too far ahead with the hype. Voice is a wonderfully empowering technology, but we have to figure out how it finds its rightful place.”
In 2017, Cedars-Sinai invested in a voice company called Aiva Inc. and took a small equity stake in the firm as part of an accelerator program the hospital created to explore new technologies. Dworkin said Cedars-Sinai is particularly interested in using the technology to help deliver care instructions and notifications to patients at home.
Meanwhile, Aiva has also attracted capital from the Google Assistant Investment program and Amazon’s Alexa Fund — vehicles the companies use to invest in early-stage companies.
Mayo Clinic, one of the pioneers of voice in health care, built an Alexa-enabled program to deliver first aid instructions to consumers. More recently, it has begun piloting the use of the technology to deliver post-discharge instructions to patients recovering from surgeries to remove skin lesions.
Dr. Sandhya Pruthi, medical director of global business solutions at Mayo, said its study on the use of voice to diagnose cardiovascular disease — it found analyzing speech signals such as tone and intensity could help detect coronary artery disease — points to an exciting future for the technology. Several companies, such as Sonde Health, are developing diagnostic tools based on changes in a person’s voice. The hope is that analyzing subtle shifts in tone, clarity, and cadence will help predict the onset of psychotic episodes, stroke, and other health emergencies.
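As a rough illustration of the kinds of signals such research examines, the sketch below uses the open-source librosa library to pull simple pitch (tone) and loudness (intensity) statistics from a voice recording. It shows generic feature extraction only; it is not Mayo's study method or any company's diagnostic model, and the file name is hypothetical.

    # Minimal sketch: extract frame-level pitch and loudness from a voice
    # recording, then summarize them as simple statistics of the kind a
    # downstream model might use. Illustrative only.
    import numpy as np
    import librosa

    def voice_features(audio_path: str) -> dict:
        y, sr = librosa.load(audio_path, sr=16000)        # mono waveform
        f0, voiced_flag, _ = librosa.pyin(                # frame-level pitch
            y,
            fmin=librosa.note_to_hz("C2"),
            fmax=librosa.note_to_hz("C6"),
            sr=sr,
        )
        rms = librosa.feature.rms(y=y)[0]                 # frame-level energy
        return {
            "mean_pitch_hz": float(np.nanmean(f0)),       # NaN frames are unvoiced
            "pitch_variability_hz": float(np.nanstd(f0)),
            "mean_intensity_rms": float(rms.mean()),
            "voiced_fraction": float(np.mean(voiced_flag)),
        }

    print(voice_features("speech_sample.wav"))  # hypothetical recording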
“It opens possibilities to deliver care at a distance,” Pruthi said. “Think about people living in small towns who aren’t always getting access to care and knowing when to get health care. Could this be an opportunity if someone had symptoms to say, ‘It’s time for this to get checked out?’”
She said she attended the recent Alexa voice conference in Chattanooga, Tenn., where other attendees were talking about the use of voice to detect symptoms of Parkinson’s disease and autism. “It’s still very early,” Pruthi said. “You do need to do the work. There has to be evidence that this makes sense.”
Date: February 20, 2019
Source: STAT