As politicians debate how to improve the nation’s expensive — and some would say broken — health care system, Americans are eagerly turning to the latest tech devices in hopes of preventing and detecting medical problems early and avoiding costly trips to the doctor or emergency room.
“Technology every day is playing a more important role in preventing and even diagnosing illness,” said Gary Shapiro, chief executive of the Consumer Technology Association and author of “Ninja Future: Secrets to Success in the New World of Innovation.” “We are just beginning this journey of revolutionizing health care and reducing trips to the doctor.”
Among the new inventions are many that use artificial intelligence, sensors or so-called internet of things connectivity to do a host of groundbreaking tasks, from detecting Alzheimer's from the sound of your voice to telling breast cancer patients, in real time, whether their chemotherapy treatment is working.
Some patients seek out new devices as if their lives depend on it. And for some, they do.
Diagnostic Technology
Take Jeff Brue, a tech guru for Hollywood films. Mr. Brue was on his honeymoon in Mexico in 2016 when his spleen ruptured and he was rushed to a hospital in the town of Zihuatanejo for a splenectomy. After he returned home to Los Angeles, doctors wrongly gave him a diagnosis of angiosarcoma, a rare form of cancer, and began an aggressive chemotherapy regimen to treat it.
“They told me I had a year to live,” Mr. Brue, who was then 34, said. Refusing to blindly accept the prognosis, he frantically searched the internet for data on his symptoms — which included a 105-degree temperature — and rushed the pathology reports to other major hospitals in the United States for additional opinions.
A month later, doctors confirmed an error had been made and said he actually had non-Hodgkin’s lymphoma, which was treatable. But by then, his immune system and liver had been damaged from the wrong chemotherapy treatments and from seven biopsies. He’s now awaiting a liver transplant in Pittsburgh.
Mr. Brue attributes much of the misdiagnosis to outdated imaging tools and poor communication between radiologists and oncologists. “It’s a systemic problem,” he said.
So between biopsies and positron emission tomography, or PET, scans, he set out to try to change the system. Mr. Brue took his OpenDrives digital storage system, which had made him a tech rock star in the film and entertainment industry, where he worked on productions like "Gone Girl," "House of Cards" and "Deadpool," and brought it to the health care sector.
His system allows hospitals to store high-resolution M.R.I. and C.T. scans, and other 3-D images, on its network without having to compress them. Most hospital networks compress stored images, which can make the images fuzzy and potentially cause doctors to miss information, said Chad Knowles, chief executive of OpenDrives.
And it's fast: A PET scan that might take four minutes to retrieve from a typical network now takes five seconds, Mr. Brue said. All of this makes it quick and easy to pull multiple high-resolution images, past and present, from the network and share them with oncologists and radiologists in different offices, he said. He's also adding A.I. tools to help with diagnosis.
The Steadman Clinic in Vail, Colo., a leading orthopedic surgical hospital, became the first health facility to bring in the OpenDrives system, and talks are underway with several others, according to Mr. Brue.
Voice Analysis
Another promising area of advancement is voice analysis, technology that can detect mental and physical health conditions, including coronary artery disease, Alzheimer's and even sleep apnea, from the sound of someone's voice.
The technology uses A.I. to assess hundreds of metrics — like pitch, tone, pauses, word choices, breathing and how a person describes a photo — to detect problems.
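To make that concrete, the sketch below (in Python, using the open-source librosa audio library) shows how two of those metrics, pitch and pausing, might be pulled from a recording. It is purely illustrative: the thresholds are arbitrary assumptions, and it is not any company's actual pipeline.

```python
# Illustrative only: extracts two of the metrics described above (pitch and
# pausing) from a voice recording. Thresholds are arbitrary assumptions,
# not clinical values.
import librosa
import numpy as np

def basic_voice_metrics(path: str) -> dict:
    y, sr = librosa.load(path, sr=16000)            # mono audio at 16 kHz

    # Fundamental frequency (pitch) track via the pYIN estimator.
    f0, voiced, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    pitch = f0[voiced]                              # keep voiced frames only

    # Pauses: silent gaps between regions louder than (peak - 30 dB).
    speech = librosa.effects.split(y, top_db=30)    # [[start, end], ...] samples
    gaps = (speech[1:, 0] - speech[:-1, 1]) / sr    # silence durations, seconds
    pauses = gaps[gaps > 0.25]                      # gaps longer than 250 ms

    return {
        "mean_pitch_hz": float(np.nanmean(pitch)) if pitch.size else 0.0,
        "pitch_variability_hz": float(np.nanstd(pitch)) if pitch.size else 0.0,
        "num_pauses": int(pauses.size),
        "mean_pause_s": float(pauses.mean()) if pauses.size else 0.0,
    }
```

A production system would compute hundreds of such features, including linguistic ones like word choice, and feed them to a trained model.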
“The manner in which we speak and the word choices we make can be evaluated to accurately detect a growing list of clinical conditions,” said Rich Ross, health care research director at Gartner, a research and advisory firm.
Sonde Health, Winterlight Labs and Beyond Verbal are among the companies developing this technology.
“We look at about 540 different metrics,” said Liam Kaufman, chief executive and co-founder of Winterlight Labs, whose firm focuses on detecting cognitive medical conditions. “We published a number of studies in 2015 and 2016 that found we could predict Alzheimer’s at the time with 82 percent accuracy — and today we’re at about 93 percent,” he said.
Pharmaceutical companies like Johnson & Johnson, Cortexyme and Alector are using Winterlight Labs' voice technology for drug trials, according to Mr. Kaufman, but considerably more research and testing is needed before the technology is available to the masses.
Radiology Advances
An increasing number of researchers are using A.I. to help radiologists make more accurate diagnostic decisions, especially in the area of breast cancer.
About 41,000 women died from breast cancer in the United States in 2018, and women have a 1 in 8 chance of developing breast cancer in their lifetime, according to the American Cancer Society. Early detection is seen as key to survival.
Regina Barzilay, a professor at M.I.T. and a member of M.I.T.’s Computer Science & Artificial Intelligence Laboratory, and Constance Lehman, chief of breast imaging at the department of radiology at Massachusetts General Hospital in Boston, created an A.I. system to improve the detection and diagnosis of lesions seen on mammograms.
Current diagnostic tools make it tough to know definitively whether a suspicious lesion is high risk, benign or malignant, especially if the patient has dense breast tissue, said Dr. Barzilay, who is a breast cancer survivor herself. (She received the diagnosis and was treated in 2014.) This can produce false positives that lead to unnecessary biopsies and surgeries.
Following a needle biopsy, roughly 70 percent of lesions turn out to be benign, 20 percent malignant and 10 percent high risk, and 90 percent of those "high-risk" lesions are found to be benign after surgery, she said. In other words, about 9 percent of all biopsied lesions lead to surgery that turns out to have been unnecessary.
“This means that every year, thousands of women go through painful, expensive, scar-inducing surgeries that weren’t even necessary,” Dr. Barzilay said.
Her team's system uses machine learning to detect similarities between a patient's mammogram and a database of 70,000 images for which the malignant or benign outcome was known.
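As a rough illustration of that general idea, the Python sketch below scores a new case by how many of its most similar known cases were malignant. The features, data and choice of k are placeholders; this is not the M.I.T. team's actual model.

```python
# Sketch only: score a new lesion by comparing its image features with a
# bank of past images whose outcomes are known. Placeholder technique,
# not the M.I.T. system.
import numpy as np

def knn_malignancy_score(query_feats: np.ndarray,
                         bank_feats: np.ndarray,
                         bank_labels: np.ndarray,
                         k: int = 25) -> float:
    """Fraction of the k most similar known cases that were malignant.

    query_feats : (d,)   feature vector for the new mammogram
    bank_feats  : (n, d) features for n past images (e.g., n = 70,000)
    bank_labels : (n,)   1 = malignant, 0 = benign
    """
    # Cosine similarity between the query and every image in the bank.
    q = query_feats / np.linalg.norm(query_feats)
    b = bank_feats / np.linalg.norm(bank_feats, axis=1, keepdims=True)
    sims = b @ q
    nearest = np.argsort(sims)[-k:]      # indices of the k most similar cases
    return float(bank_labels[nearest].mean())
```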
She expects that her early-detection technology, which is being used at Massachusetts General, will be tested in 10 to 15 more hospitals by the end of the year.
At Boston University's Biomedical Optical Technologies Lab, or BOTLab, researchers have created a wearable probe that can monitor, in real time, whether chemotherapy is working on a breast cancer patient. Patients typically wait weeks or even months to see if treatment is working, because M.R.I. scans are too costly to do every day or week and aren't particularly good for tracking some treatments, among other factors, according to Darren Roblyer, assistant professor of biomedical engineering at Boston University and the head of BOTLab.
“So there are patients who have been treated for three to six months of chemotherapy with absolutely no benefit, who are suffering the toxic side effects like hair falling out,” Dr. Roblyer said.
The device uses near-infrared, or NIR, light to noninvasively measure the tumor's hemoglobin, water and fat levels, as well as its metabolism, to determine whether the chemo is working. If not, treatments can be adjusted or stopped.
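The core arithmetic behind such a measurement is a version of the Beer-Lambert relation: light absorption at each wavelength is a weighted sum of each tissue component's contribution, so concentrations fall out of a least-squares solve. The sketch below illustrates this with made-up extinction values and a simulated measurement; a real instrument uses tabulated spectra and a more careful optical model.

```python
# Sketch of the Beer-Lambert arithmetic behind NIR spectroscopy.
# All numbers are made-up placeholders for illustration.
import numpy as np

# E[i, j]: how strongly component j (HbO2, Hb, water, fat) absorbs at
# wavelength i (690, 750, 800, 850 nm). Placeholder values.
E = np.array([
    [0.30, 1.20, 0.05, 0.08],
    [0.45, 0.90, 0.07, 0.06],
    [0.80, 0.80, 0.09, 0.05],
    [1.00, 0.70, 0.11, 0.06],
])

def concentrations(mu_a: np.ndarray) -> np.ndarray:
    """Solve mu_a = E @ c for the component concentrations c."""
    c, *_ = np.linalg.lstsq(E, mu_a, rcond=None)
    return c

# Simulated absorption measurements, one per wavelength.
mu_a = np.array([0.115, 0.116, 0.140, 0.161])
hbo2, hb, water, fat = concentrations(mu_a)
print(f"total hemoglobin: {hbo2 + hb:.3f}, "
      f"oxygen saturation: {hbo2 / (hbo2 + hb):.1%}")
```

Tracking how values like total hemoglobin trend over days of treatment is what lets the probe flag a tumor that isn't responding.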
The American Cancer Society recently teamed up with the Global Center for Medical Innovation’s T3 Labs to invest $100,000 in the project to help bring the technology from the lab to the commercial market. Dr. Roblyer said he expects to begin testing the device on breast cancer patients at Boston Medical Center in the next few months, but commercialization is likely several years away.
X-Ray Vision
Another promising development is M.I.T.'s so-called X-ray vision. Yes, X-ray vision, a power usually reserved for comic book characters or mad scientists in science-fiction movies, but in this case it uses Wi-Fi and radio waves to see through walls and monitor patients with movement disorders or those prone to falling.

Dina Katabi, a professor at M.I.T. and member of M.I.T.'s Computer Science & Artificial Intelligence Laboratory, is testing a wireless smart-home system, called Emerald, that uses A.I., sensors and radio signals to track a person's movements, sleep stages, heartbeat, breathing, gait and other metrics, even through the walls of a home, as long as Wi-Fi is present.

Basically, radio signals bounce off the person's body, and the reflection appears on the device's screen as a stick figure that walks, sits, stops and moves its limbs just as the person does.
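One small piece of the idea can be shown with a toy calculation: a chest rising and falling with breathing modulates the reflected signal periodically, so the breathing rate shows up as the dominant low-frequency peak in the signal's spectrum. The Python sketch below simulates such a reflection; the Emerald system itself does far more (localization, gait analysis, sleep staging), and its actual methods are not shown here.

```python
# Toy sketch: recover a breathing rate from a simulated radio reflection
# by finding the strongest low-frequency peak in its spectrum.
import numpy as np

fs = 20.0                                   # samples per second
t = np.arange(0, 60, 1 / fs)                # one minute of data
breaths_per_min = 14
signal = np.sin(2 * np.pi * (breaths_per_min / 60) * t)   # chest motion
signal += 0.3 * np.random.randn(t.size)     # measurement noise

# FFT, then pick the strongest peak in the plausible breathing band.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
band = (freqs > 0.1) & (freqs < 0.7)        # 6 to 42 breaths per minute
estimate = freqs[band][np.argmax(spectrum[band])] * 60

print(f"estimated breathing rate: {estimate:.1f} breaths/min")
```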
The technology is aimed at making it easy to collect health and motion data in a nonintrusive way to monitor conditions like Parkinson’s, multiple sclerosis or even sleep apnea so that doctors can adjust medications as needed. It can detect side effects or wrong drug doses through changes in heartbeat, breathing or other metrics, and monitor falls by older people who live alone.
“Today, if you have a sleep issue and want to know how much deep sleep you’re getting, you go to a sleep lab, and they put E.E.G. electrodes on your head, other sensors on your body, and then ask you to sleep like that,” Dr. Katabi said. With a “smart” Wi-Fi box, all of this data would be collected without the need for body sensors.
Emerald is used by some pharmaceutical companies to measure safety and efficacy during drug trials, but it’s not yet available to the general public.
Although many of these new technologies have shown promising early results, it could be years before some of them reach the public, if they ever do.
It has been an uphill battle just to convince medical journals to publish studies on A.I., and agencies to fund them, Dr. Barzilay said. The reason? Many don't have computer scientists on their boards who understand the technology.
“They’re not really equipped to read machine-learning proposals because the vast majority work in other areas like biology or something else,” she said. “I feel I constantly have to fight.”
Dr. Barzilay says the winds are slowly shifting, though, and she believes the technology will ultimately prevail. "It's absolutely the future; it's even the present," she said. "The question is how fast do we adopt it?"
Date: March 6, 2019
Source: The New York Times