“Alexa, monitor my heart.”

Amazon Echo Dot. Image by Niran Kasri from Pixabay

From University of Washington press release

Researchers at the University of Washington have developed a software application that could enable smart speakers — like Google Home and Amazon Alexa — and smartphones in people’s bedrooms to listen for signs of cardiac arrest and call for help.

Almost 500,000 Americans die each year from cardiac arrest, when the heart suddenly stops beating. Immediate CPR can double or triple someone’s chance of survival, but that requires a bystander to be present, and many cardiac arrests occur at home where the individual is alone and no one is nearby to help.

The new application is designed to identify a particular pattern of breathing, called agonal breathing, that occurs in about half of people experiencing a cardiac arrest.

Agonal breathing is caused by low blood-oxygen levels, said Dr. Jacob Sunshine, an assistant professor of anesthesiology and pain medicine at the UW School of Medicine and a co-corresponding author on the new study. “It’s sort of a guttural gasping noise, and its uniqueness makes it a good audio biomarker to use to identify if someone is experiencing a cardiac arrest,” Sunshine said.

To develop the new app, the researchers gathered sounds of agonal breathing recorded when bystanders put their smartphones to the mouth of someone who had had a cardiac arrest so that the 911 dispatcher could determine whether the patient needed immediate CPR. 

The researchers also collected sound recorded during sleep studies to obtain typical sounds people make in their sleep, such as snoring and irregular breathing due to obstructive sleep apnea.

From these recordings, the team used machine learning to create an application that could detect agonal breathing 97% of the time when the smart device was placed up to 6 meters away from a speaker generating the sounds.
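At its core, this is a binary audio-classification task: label each short clip as agonal breathing or not. The study's actual model and features are not described here, so the following is only a minimal sketch of the idea, using synthetic stand-in feature vectors and a plain logistic-regression classifier rather than the researchers' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in features: in the real system these would be
# acoustic features extracted from short audio clips. Here we synthesize
# two separable clusters just to illustrate the training loop.
agonal = rng.normal(loc=1.0, scale=0.5, size=(200, 8))   # "agonal breathing"
normal = rng.normal(loc=-1.0, scale=0.5, size=(200, 8))  # "ordinary sleep sounds"
X = np.vstack([agonal, normal])
y = np.concatenate([np.ones(200), np.zeros(200)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain logistic regression trained by gradient descent.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y) / len(y))
    b -= 0.5 * np.mean(p - y)

preds = sigmoid(X @ w + b) > 0.5
accuracy = np.mean(preds == y)
```

On cleanly separable synthetic data like this, the classifier is near-perfect; the hard part in practice is exactly what the team describes next — keeping accuracy high against realistic interference and lookalike sounds.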

Next the team tuned the algorithm so that the application wouldn’t accidentally classify a different type of breathing, like snoring, as agonal breathing.

“We played these examples at different distances to simulate what it would sound like if the patient was at different places in the bedroom,” said first author Justin Chan, a doctoral student in the Allen School. “We also added different interfering sounds such as sounds of cats and dogs, cars honking, air conditioning, things that you might normally hear in a home.”
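This kind of data augmentation can be approximated in software as well as by physical playback. The sketch below is an assumption about how one might simulate it, not the team's method: it attenuates a clip with a simple free-field 1/r falloff for distance and mixes in an interfering sound at a chosen signal-to-noise ratio.

```python
import numpy as np

rng = np.random.default_rng(1)
sample_rate = 16000

# One second of a hypothetical breathing clip, stood in for here by
# random noise; in the study these were recordings from real 911 calls.
clip = rng.normal(size=sample_rate)

def simulate_playback(signal, distance_m, interference, snr_db):
    """Attenuate a clip as if heard `distance_m` meters from the
    microphone (free-field 1/r falloff), then mix in an interfering
    sound scaled to the requested signal-to-noise ratio."""
    attenuated = signal / max(distance_m, 1.0)
    sig_power = np.mean(attenuated ** 2)
    noise_power = np.mean(interference ** 2)
    # Scale the interference so the mix has the requested SNR.
    scale = np.sqrt(sig_power / (noise_power * 10 ** (snr_db / 10)))
    return attenuated + scale * interference

# Example: the clip heard from 6 meters away, with a 60 Hz
# "air conditioner" hum mixed in at 10 dB SNR.
hum = 0.1 * np.sin(2 * np.pi * 60 * np.arange(sample_rate) / sample_rate)
augmented = simulate_playback(clip, 6.0, hum, snr_db=10.0)
```

Training on many such variants — different distances, different interferers — is a standard way to make an audio classifier robust to conditions like the bedroom scenarios the researchers describe.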

In their paper, the researchers report that the application was indeed able to identify agonal breathing 97% of the time while almost never incorrectly classifying snoring and other common breathing sounds as agonal.

“Right now, this is a good proof of concept using the 911 calls in the Seattle metropolitan area,” said Shyam Gollakota, an associate professor in the UW’s Paul G. Allen School of Computer Science & Engineering and another co-corresponding author on the paper. “But we need to get access to more 911 calls related to cardiac arrest so that we can improve the accuracy of the algorithm further and ensure that it generalizes across a larger population.”

“Cardiac arrests are a very common way for people to die, and right now many of them can go unwitnessed,” Sunshine said. “Part of what makes this technology so compelling is that it could help us catch more patients in time for them to be treated.”

The researchers plan to commercialize this technology through a UW spinout, Sound Life Sciences Inc.

Dr. Thomas Rea, a professor of general internal medicine at the UW School of Medicine and medical director of King County Medic One, was also a co-author on this paper. This research was funded by the National Science Foundation.