For some patients, the “inner voice” may soon be heard

by Andrea

For decades, neuroengineers have dreamed of helping people who have been cut off from the world of language.

Diseases such as amyotrophic lateral sclerosis (ALS) weaken the muscles that control speech and breathing. A stroke can destroy the neurons that normally carry commands to the speech muscles. By implanting electrodes, scientists hoped, they could record the brain’s electrical activity and translate it into spoken words.

Now a team of researchers has taken an important step toward this goal. Previously, they had managed to decode the signals produced when people tried to speak. In the new study, published this Thursday in the journal Cell, the computer often succeeded when participants merely imagined saying words.



Christian Herff, a neuroscientist at Maastricht University in the Netherlands who did not participate in the research, said the result goes beyond the technological achievement and sheds light on the mystery of language. “It’s a fantastic advance,” said Herff.

The new study is the latest result of a long-term clinical trial called BrainGate2, which has already had remarkable successes. One of the participants, Casey Harrell, now uses his brain-computer interface to talk to family and friends.

In 2023, after ALS had made his speech unintelligible, Harrell agreed to have electrodes implanted in his brain. Surgeons placed four arrays of tiny needles on the left side, in a region called the motor cortex, which becomes active when the brain creates commands for the speech muscles.


A computer recorded the electrical activity from the implants while Harrell tried to say different words. Over time, and with the help of artificial intelligence, the computer came to predict almost 6,000 words with 97.5% accuracy. It then synthesized these words in Harrell’s own voice, based on recordings made before his illness progressed.

But this success raised a worrying question: could the computer decode more than the patients actually wanted to say? Could it “hear” their inner voice?

“We wanted to investigate whether there was a risk of the system decoding words that were not meant to be said aloud,” said Erin Kunz, a neuroscientist at Stanford University and an author of the study. She and her colleagues also wondered whether patients might prefer to use inner speech, since Harrell and other participants became tired when trying to talk; imagining a sentence could be easier and allow the system to work faster.


“If we could decode this, it could avoid physical effort,” said Kunz. “It would be less tiring, allowing prolonged use of the system.”

But it was unclear whether the researchers could decode inner speech. In fact, scientists don’t even agree on what “inner speech” is.

Harrell reads a screen connected to his brain-computer interface. Neuroscientists now wonder whether patients might prefer to use “inner speech” instead of trying to speak aloud. Credit: Ian C. Bates for The New York Times

Our brains produce language, choosing words and organizing them in sentences, using a network of regions the size of a large strawberry.


We can use the signals from this network to drive the muscles that produce speech, to use sign language or to type messages. But many people feel they also use language to think: they “hear” their thoughts as an inner voice.

Some researchers argue that language is essential to thought. Others, drawing on recent studies, claim that much of thought does not involve language and that the inner voice is just sporadic commentary in the mind.

“A lot of people don’t understand when you say you have an inner voice,” said Evelina Fedorenko, a cognitive neuroscientist at MIT. “They think, ‘Maybe you should see a doctor if you are hearing words in your head.’” (Fedorenko said she has an inner voice, but her husband doesn’t.)


Kunz and her colleagues decided to investigate the mystery. They gave the participants seven words, such as “kite” and “day,” and compared the brain signals when they tried to say the words and when they merely imagined them.

Imagining a word produced a pattern of activity similar to that of trying to say it, but with a weaker signal. The computer was able to predict which of the seven words the participants were thinking. For Harrell, the result was not much better than a random guess, but for another participant, accuracy exceeded 70%.

The researchers then trained the computer specifically on inner speech, improving its performance, including for Harrell. When participants imagined entire sentences, such as “I don’t know how long you’ve been here,” the computer decoded most or all of the words correctly.
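To make the idea concrete, here is a minimal sketch of how a small-vocabulary decoder of this kind might work, assuming binned neural firing rates as features and an off-the-shelf classifier. The seven-word list, feature dimensions and synthetic data are illustrative assumptions, not the study’s actual pipeline.

```python
# Illustrative sketch only -- not the BrainGate2 decoder. It shows the general
# idea of classifying a small vocabulary from neural activity features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
WORDS = ["kite", "day", "up", "you", "do", "not", "how"]  # hypothetical word set

# Synthetic data standing in for recorded neural activity: one 256-dim
# feature vector per trial (e.g., binned spike counts across electrodes).
n_trials_per_word = 40
X = rng.normal(size=(len(WORDS) * n_trials_per_word, 256))
y = np.repeat(np.arange(len(WORDS)), n_trials_per_word)
# Give each word a faint class-specific signature, mimicking the weaker
# signal reported for imagined speech.
X += 0.3 * rng.normal(size=(len(WORDS), 256))[y]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"7-word accuracy: {clf.score(X_test, y_test):.1%}")  # chance is ~14%
```

The real system uses far richer models and continuous neural recordings, but the core task is the same: mapping a noisy activity pattern to the most likely intended word.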


Herff, who also studies inner speech, was surprised by the success. Previously, he had believed that inner speech was fundamentally different from the motor cortex signals that produce real speech. “But in this study, they show that for some people it is not that different,” he said.

Kunz stressed that the computer’s current performance with inner speech is not yet good enough for conversations. “The results are another proof of concept,” she said.

But she is optimistic that decoding inner speech will become the standard for brain-computer interfaces. In more recent tests, not yet published, accuracy and speed have improved. “We haven’t reached the limit yet,” she said.

As for mental privacy, Kunz and her colleagues found reasons for concern: sometimes the computer detected words that participants had not intended to say at all.

In one test, participants mentally counted colored shapes on a screen. The computer sometimes decoded a number word, “listening in” on the silent count.

“These experiments are the most exciting for me,” said Herff, because they suggest that language may be involved in many types of thought beyond communication. “Some people really do think this way.”

Kunz and her team have proposed ways to prevent the computer from “listening in” on private thoughts. One would be to decode only attempted speech while blocking inner speech. The study suggests this is possible: the two kinds of signals are similar, but different enough for the computer to distinguish them.

For those who prefer to use inner speech to communicate, the researchers suggested an “inner password” to switch decoding on and off. The password would be a long, unusual phrase; they chose “Chitty Chitty Bang Bang,” the title of a book and movie.

A 68-year-old participant with ALS imagined saying “Chitty Chitty Bang Bang” along with other words. The computer learned to recognize the password with 98.75% accuracy and decoded inner speech only after detecting it.
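As a rough illustration of how such a gate might work, the sketch below shows a decoder that discards everything until a wake phrase is detected with sufficient confidence. The stream interface, confidence scores and threshold are hypothetical assumptions, not details from the study.

```python
# Illustrative sketch of password-gated decoding -- not the study's implementation.
# The (phrase, confidence) stream and the threshold are hypothetical placeholders.

PASSWORD = "chitty chitty bang bang"  # long, unusual wake phrase
CONFIDENCE_THRESHOLD = 0.9            # assumed cutoff for accepting the password

def gated_decoder(decoded_stream):
    """Yield decoded inner speech only after the password is detected.

    `decoded_stream` yields (phrase, confidence) pairs from an upstream
    neural decoder (hypothetical interface).
    """
    unlocked = False
    for phrase, confidence in decoded_stream:
        if phrase.lower() == PASSWORD and confidence >= CONFIDENCE_THRESHOLD:
            unlocked = not unlocked  # the same phrase toggles decoding on and off
            continue
        if unlocked:
            yield phrase  # share only what the user chose to share

# Example: only "hello there" passes through, since it follows the password.
stream = [("what a day", 0.80),
          ("chitty chitty bang bang", 0.97),
          ("hello there", 0.85)]
print(list(gated_decoder(stream)))  # -> ['hello there']
```

The design choice mirrors the one described in the study: everything decoded before the password is simply dropped, so stray inner speech never leaves the device.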

“This study is an important ethical advance,” said Cohen Marcus Lionel Brown, a bioethicist at the University of Wollongong in Australia. “If implemented correctly, it will give patients greater control over what they share and when.”

Fedorenko, who did not participate in the study, called it a “methodological tour de force,” but questioned whether an implant could really “listen in” on many of our thoughts. She pointed out that the computer performed well when decoding words participants were asked to imagine, but worse on open-ended prompts, such as thinking about a favorite hobby.

“What they record is mostly garbage,” she said. “I think a lot of spontaneous thought is not formulated in linguistic phrases.”
