Home devices using AI may be able to pick up signs of distress and call for help.
Amazon Alexa and Google Home could be lifeguards – quite literally. A team at the University of Washington (UW) created a new tool that monitors people for cardiac arrest through a home smart speaker.
When someone experiences cardiac arrest, they become unresponsive and can stop breathing or gasp for air. The tool can detect that gasp, known as agonal breathing, and then call for help.
About 475,000 Americans die each year from cardiac arrest, which happens when the heart suddenly stops beating. Receiving CPR can greatly improve a person's chance of survival, but only if someone is present to perform it.
"Just as smart speakers can listen for 'Alexa,' what we show is that they can also passively listen for agonal breathing sounds and either raise an audible alarm or call emergency services when they detect one," Shyam Gollakota, PhD, a professor in UW's Paul G. Allen School of Computer Science & Engineering, told Healthline. Gollakota was one of the authors of a new study looking at whether Alexa can detect signs of cardiac arrest.
The new tool created by the UW researchers is a "skill," the voice-assistant equivalent of an app, which adds a capability to voice devices powered by artificial intelligence (AI). These devices include the Amazon Echo, whose virtual assistant is dubbed Alexa. Skills can be added to an existing Amazon Echo, Google Home, or smartphone.
The researchers developed the tool using real instances of agonal breathing recorded during 911 calls. They collected data from 162 calls between 2009 and 2017 and created 236 clips from them. The recordings were captured on Alexa-enabled devices, an iPhone 5s, and a Samsung Galaxy S4. Using machine learning techniques, the team expanded these into 7,316 positive clips.
The clips were played back at different distances to simulate the varied locations where a person might collapse. The team also added sounds that can interfere, such as air conditioning or the noise of a pet.
The team then used a negative dataset of 7,305 sound samples of other noises people make in their sleep, such as snoring.
The technology was able to detect agonal breathing 97 percent of the time from up to 6 meters (about 20 feet) away. The results were published in npj Digital Medicine.
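The study's actual model is far more sophisticated, but the general approach of classifying audio clips as positive (agonal breathing) or negative (other sleep sounds) can be illustrated with a toy example. Everything here, including the synthetic "features" and the nearest-centroid classifier, is a hypothetical stand-in, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in features: in practice, audio clips would first be
# converted into spectrogram-style feature vectors before classification.
positives = rng.normal(loc=1.0, scale=0.5, size=(200, 8))   # "agonal breathing" clips
negatives = rng.normal(loc=-1.0, scale=0.5, size=(200, 8))  # "snoring / other" clips

X = np.vstack([positives, negatives])
y = np.array([1] * 200 + [0] * 200)

# Nearest-centroid classifier: label a clip by whichever class mean it is closer to.
centroid_pos = positives.mean(axis=0)
centroid_neg = negatives.mean(axis=0)

def classify(clip):
    """Return 1 (agonal breathing) or 0 (other sound) for a feature vector."""
    d_pos = np.linalg.norm(clip - centroid_pos)
    d_neg = np.linalg.norm(clip - centroid_neg)
    return 1 if d_pos < d_neg else 0

accuracy = np.mean([classify(clip) == label for clip, label in zip(X, y)])
```

On this cleanly separated synthetic data the toy classifier is nearly perfect; the hard part of the real work is that genuine agonal gasps and ordinary sleep sounds overlap far more, which is why the team needed thousands of augmented clips.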
"We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there's no response, the device can automatically call 911," Gollakota said in a statement.
He told Healthline that the technology is licensed and could be commercially available in a year or so.
"We also believe that our system will give users a warning before it contacts emergency medical services or other forms of support, giving them a chance to cancel false alarms," Justin Chan, a PhD student and fellow researcher who also worked on the study, told Healthline.
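The alert flow Gollakota and Chan describe – raise a local alarm, allow a window to cancel false alarms, then call 911 only if nobody responds – can be sketched as a small piece of control logic. All names and the window length here are hypothetical, not the team's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class AlarmController:
    """Sketch of the escalation logic: alarm first, call 911 only if nobody cancels."""
    cancel_window_s: float = 30.0          # assumed cancellation window
    actions: list = field(default_factory=list)

    def on_agonal_breathing_detected(self, cancelled_within_window: bool):
        # Step 1: raise a local audible alarm so anyone nearby can start CPR.
        self.actions.append("raise_alarm")
        if cancelled_within_window:
            # Step 2a: a user cancelled the false alarm in time; stop here.
            self.actions.append("cancelled_by_user")
        else:
            # Step 2b: no response within the window, so escalate automatically.
            self.actions.append("call_911")

controller = AlarmController()
controller.on_agonal_breathing_detected(cancelled_within_window=False)
```

The design point is that escalation is the default: doing nothing leads to a 911 call, while cancelling requires an explicit action from a person who is awake and responsive.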
The researchers are planning to commercialize this technology through Sound Life Sciences, Inc., and to test it on more 911 calls from other parts of the country and from other countries.
Chan noted that the technology preserves privacy. It runs entirely on the smart device and does not send data to the cloud or to any third party. Audio is stored locally for only the few seconds required for processing and is then discarded.
Despite any potential privacy concerns, many AI experts consider the cardiac arrest detection quite a breakthrough.
"What these scientists have done is brilliant and a glimpse of an important development for smart speakers and voice assistants," Bradley Metrock, an AI and voice expert and chief executive of tech-focused Score Publishing, told Healthline.
Medical experts say the results are interesting, but by the time Alexa detects something is wrong, it may be too late.
Dr. Robert Glatter, an emergency physician at Lenox Hill Hospital in New York City, says agonal breathing may only begin after brain damage has already occurred.
"Lack of sufficient blood flow to the brain for more than three to five minutes – the time it takes to result in irreversible brain damage – will typically occur well before the onset of agonal breathing," he explained.
He says further studies could help uncover other biomarkers that flag warning signs much earlier. "I think the concept of monitoring in a contactless way to detect cardiac arrest is a work in progress," Glatter said. "Since early intervention is crucial in the attempt to save lives, we may need to evaluate biomarkers other than agonal breathing, as its appearance is a sign that death is imminent."
It may seem odd to rely on a robotic device as a means of summoning help, but AI is increasingly being used to improve people's health and to detect when something is wrong.
The reason is that AI is particularly good at recognizing patterns, explained Marius Kierski, a partner at Sigmoidal, a company that specializes in machine learning and AI.
"I think AI progress has a long way to go before it makes its way into everyday devices, because of the legal requirements," he told Healthline.
Alexa also has a home security capability that can listen for sounds like glass breaking when you are not at home. You put your Amazon Echo (or Echo Dot) in security mode and tell it when you're leaving. Then it listens.
In addition, Alexa-enabled devices already have a wide range of health-related skills. Express Scripts, for example, lets patients check the status of a home delivery prescription and receive messages when orders ship. Atrium Health, a healthcare system in the Southeast, lets people find emergency care near them and schedule a same-day appointment.
"You have no privacy concerns there, since you're not home to be overheard," says Freddie Feldman, CEO of VocoLabs, who creates Alexa skills and other conversational interfaces that can help patients. "It's a little different [than the cardiac arrest skill technology] but much the same, because they use AI to detect a particular pattern of sounds and then act on it."
"I think the progress is really good and interesting," he added. "Having a device in the home that is connected and 'always listening' is actually an advantage in a case like this."
Henry O'Connell, CEO of Canary Speech, believes these applications will primarily be used in hospitals and clinical trials rather than in the home.
O'Connell's company creates technology that integrates AI and speech analysis. It is working to develop disease classification tools for Parkinson's and Alzheimer's disease, as well as for conditions such as anxiety and depression.
O'Connell said hospitals and clinical trials may be better suited to AI voice applications, partly because clinicians in those settings must be very clear about how data will be used in order to obtain informed consent from patients. Only then can patients' speech data be used to evaluate them for disease.
Other AI health applications can be found in the pharmaceutical industry, which is using AI in drug research.
There is even a chance dermatologists could soon enlist AI's help. A 2018 study found AI could beat dermatologists at detecting some signs of skin cancer.