
Experimental brain-controlled hearing aid can pick out voices in a crowd



Buzz buzz secret hmmm hmmm don't tell anyone garble garble layoffs.

The brain is unsurpassed in its ability to pick out juicy tidbits and attention-grabbing voices against a cacophony of background noise. Hearing aids, however, stink at this "cocktail party effect": Rather than amplifying a particular voice by selective attention, they amplify every sound equally.

On Wednesday, researchers unveiled a possible solution – an experimental hearing aid that reads the mind. It uses artificial intelligence to separate the sounds of different speakers, detects brain activity that makes one of those voices stand out from the others, and amplifies only that voice before delivering the sound to the listener, they explained in Science Advances.

If the technology proves practical – and for that it probably cannot require implanted electrodes on the surface of the brain, as the current version does – it could serve as the basis for a brain-controlled hearing aid that would let people with hearing loss function better in social settings as well as in the noisy world.

The project, led by electrical engineer Nima Mesgarani of Columbia University's Zuckerman Mind Brain Behavior Institute, is one of many trying to make hearing aids more like normal hearing. The $500 Bose Hearphones, controlled by a smartphone app, have directional mics so users can hear one person better than another, plus controls to filter out, say, traffic noise. But no current device can amplify selected conversations from multiple sources in a crowd, as the normally hearing brain can.

"Even the most advanced digital hearing aids don't know which voices they should suppress and which they should amplify," Mesgarani said

If they did, it would make a major difference to people with impaired hearing, said Roger Miller, who directs the neural prosthetics program at the National Institute on Deafness and Other Communication Disorders, which funded the study. "There is real gold to be mined in that hill," he said.

Mesgarani started his mining in the brain. He and his graduate students announced in 2012 that when people converse, the listener's brain waves echo the acoustic features of the speaker's voice, turning up its perceived volume and filtering out extraneous voices.
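To make that finding concrete, here is a minimal, hypothetical sketch (not the researchers' code) of how such neural tracking can be quantified: a linear decoder, assumed to have been fit on training data beforehand, reconstructs a rough loudness envelope from the neural recording, and the attended speaker is taken to be the one whose actual envelope correlates best with the reconstruction. The function names, frame size, and array shapes are illustrative assumptions.

import numpy as np

def envelope(audio, frame=160):
    """Coarse loudness envelope: RMS of non-overlapping frames (10 ms at 16 kHz)."""
    n = len(audio) // frame
    return np.sqrt((audio[: n * frame].reshape(n, frame) ** 2).mean(axis=1))

def reconstruct_envelope(neural, decoder):
    """Linear stimulus reconstruction: neural is (time, channels), decoder is a
    (channels,) weight vector assumed to have been fit beforehand on training data."""
    return neural @ decoder

def attended_speaker(neural, decoder, speaker_audios, frame=160):
    """Return the index of the speaker whose loudness envelope correlates best
    with the envelope reconstructed from the listener's brain activity."""
    recon = reconstruct_envelope(neural, decoder)
    scores = []
    for audio in speaker_audios:
        env = envelope(audio, frame)
        m = min(len(env), len(recon))
        scores.append(np.corrcoef(recon[:m], env[:m])[0, 1])
    return int(np.argmax(scores)), scores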

That ability comes from the brain's secondary auditory cortexes, one behind each ear. They amplify one voice over others by the simple means of paying attention, in a process called top-down control. ("Top" means an executive function such as conscious attention; "down" means a sensory function, in this case hearing.) The sound of a familiar voice, a familiar word (one's name), an emotionally resonant one ("divorce"), or some other attention grabber causes this region to turn up the volume of whatever grabs its attention.

The brain-controlled hearing aid first separates the audio signals from different speakers. It then determines the spectrogram, or voiceprint, of each, meaning how a voice's volume and frequency vary with time. Next, it detects the brain waves in a listener's auditory cortex (via an implanted 16-by-16 electrode array), which indicate which voice the listener is paying attention to. Finally, the system searches for that particular voice and amplifies it, and only it. When the listener's attention turns to a different voice, the system dials down the first one and dials up the volume of the new one.
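That final "dial down one voice, dial up another" step can be sketched in a few lines. Assuming the separated speaker streams and the decoded attended index are already available, the hypothetical fragment below (not code from the paper) ramps per-speaker gains smoothly so that switches in attention fade rather than click; the gain values, block size, and smoothing constant are made-up illustrations.

import numpy as np

class AttentionMixer:
    """Re-mixes already-separated speaker streams, boosting whichever one the
    brain decoder says the listener is attending to."""

    def __init__(self, n_speakers, boost_db=12.0, cut_db=-9.0, smooth=0.9):
        self.boost = 10 ** (boost_db / 20)          # linear gain for the attended voice
        self.cut = 10 ** (cut_db / 20)              # linear gain for everyone else
        self.gains = np.full(n_speakers, self.cut)  # current per-speaker gains
        self.smooth = smooth                        # closer to 1.0 = slower, click-free fades

    def mix_block(self, streams, attended):
        """streams: (n_speakers, block_len) separated audio; attended: decoded speaker index."""
        targets = np.where(np.arange(len(streams)) == attended, self.boost, self.cut)
        # Glide toward the target gains so attention switches fade rather than jump.
        self.gains = self.smooth * self.gains + (1 - self.smooth) * targets
        return (self.gains[:, None] * streams).sum(axis=0)

# Example: two talkers; the listener's attention shifts from speaker 0 to speaker 1.
rng = np.random.default_rng(0)
mixer = AttentionMixer(n_speakers=2)
streams = rng.standard_normal((2, 1024))        # stand-ins for two separated voices
block_a = mixer.mix_block(streams, attended=0)  # speaker 0 amplified
block_b = mixer.mix_block(streams, attended=1)  # gains begin gliding toward speaker 1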

Three patients with epilepsy who were undergoing brain surgery volunteered to let Dr. Ashesh Mehta of the Northwell Health Institute for Neurology and Neurosurgery on New York's Long Island implant electrode arrays in their brains. The electrodes detected the brain activity that occurred when the participants listened to two speakers talking at once, focusing first on one and then on the other, as directed by the scientists. The scientists detected the unique brain activity corresponding to paying attention to each voice.

"The brain waves of listeners only tracked the voice of the speaker they're focusing on," Mesgarani said.

This research is another in a growing list of studies that tap the brain's activity in order to produce output that the body cannot otherwise manage, such as a paralyzed person moving a mechanical arm or someone with ALS turning thoughts into speech. To be practical, a mind-reading hearing aid would have to work via electrodes on the scalp rather than inside the brain. The Columbia team is working on that scalp version, as well as one with electrodes around the ear.

Their earlier mind-reading hearing aid had been trained to recognize only specific voices, such as those of family members. It could detect and amplify those voices but not unknown ones. The next-gen device "can recognize and decode a voice – any voice – right off the bat," Mesgarani said.

if ("object" === typeof mc4wp && mc4wp.forms) { mc4wp.forms.on ('subscribed', function () { // Successful MC4WP newsletter signup AJAX form submission. fbq ('track', 'lead'); }); } });
Source link