AI interface transforms words only imagined into speech

by Andrea

Emory BrainGate Team

A patient with paralysis using the brain-computer interface developed in the study

A brain-computer interface allowed people with paralysis to transform thoughts directly into words, with less effort than previous techniques, which required a physical attempt to speak. But we are still very far from reading thoughts.

People with paralysis can now see their thoughts converted into speech just by imagining speaking in their heads.

Although brain-computer interfaces can already decode the neural activity of people with paralysis when they physically attempt to speak, this process can require considerable effort.

In fact, scientists have been attempting this with the help of AI since at least 2018, with different approaches and varying degrees of success.

In 2023, a team of researchers from the University of Texas created a decoder that converts human language into text, non-invasively, using magnetic resonance imaging readings and an AI interpreter similar to ChatGPT.

In a new study, Benyamin Meschede-Krasa, a researcher at Stanford University, and his team sought a simpler approach, less demanding in terms of brain effort.

“We wanted to see whether there were similar patterns when someone was simply imagining speaking in their head,” the researcher explains.

“We found that this could be an alternative, and in fact a more comfortable way, for people with paralysis to use this type of system to restore their communication.”

The results of the study were presented in an article published this Thursday in the journal Cell.

Meschede-Krasa and his colleagues recruited four people with severe paralysis, caused by amyotrophic lateral sclerosis (ALS) or a brainstem stroke. All participants already had, for research purposes, microelectrodes implanted in the motor cortex, a brain region involved in speech.

The researchers asked each participant to try to pronounce a list of words and phrases, and also to simply imagine pronouncing them. They found that brain activity was similar in both cases, although the activation signals were usually weaker for imagined speech.

The team trained an AI model to recognize and decode these signals, using a vocabulary database of up to 125,000 words.

To protect the privacy of inner speech, they programmed the AI to unlock only when participants thought of the password "Chitty Chitty Bang Bang", which was detected with 98% accuracy.

Over a series of experiments, the team found that imagining the pronunciation of a word allowed the model to decode it correctly in 74% of cases.

This demonstrates a solid proof of concept for the approach, but it is not yet as robust as the interfaces that decode attempted speech, says Frank Willett, also a researcher at Stanford University and co-author of the study.

“Continued improvements, in both sensors and AI, could increase accuracy in the coming years,” adds the researcher, a senior member of the project team.

Participants expressed a clear preference for this system, which is faster and less laborious than those based on attempted speech, says Meschede-Krasa.

The concept follows “an interesting direction” for the future of brain-computer interfaces, says Mariska Vansteensel, a researcher at UMC Utrecht, in the Netherlands, who was not involved in the study.

“However, it lacks the ability to differentiate between attempted speech, what we really want to say, and the thoughts we intend to keep private. I am not sure that everyone can distinguish these different concepts of imagined and attempted speech,” notes Vansteensel.

According to the researcher, the password would have to be activated and deactivated according to the user’s decision to express, or not, a given thought in the middle of a conversation.

“We have to truly guarantee that the sentences emitted through a brain-computer interface are those the person wants to share with the world, and not the ones they want to keep to themselves, whatever happens,” she says.

Benjamin Alderson-Day, a researcher at the University of Durham, in the United Kingdom, says there is no reason to consider this system a “mind reader”.

“It works only with very simple examples of language,” he says. “If your thoughts were limited to isolated words like ‘tree’ or ‘bird’, then you might worry, but we are still very far from being able to capture free thoughts and more intimate ideas.”
