A paralyzed man managed to grasp, move, and drop objects with a robotic arm connected to a device that relays signals from his brain to a computer, performing all of these actions simply by imagining them.
A paralyzed man fitted with a robotic arm managed to move the limb using thought alone.
The device, a brain-computer interface (BCI) developed by researchers at the University of California, San Francisco (UCSF), worked for seven months without any adjustments.
Until now, such devices had worked for only a day or two.
The BCI uses an artificial intelligence (AI) model that adapts to the small changes that occur in the brain as a person repeats a movement (in this case, an imagined movement), learning to perform it in an increasingly refined way.
“This combination of human and AI learning is the next phase for these brain-computer interfaces, and it is what we need to achieve sophisticated, lifelike function,” said Karunesh Ganguly, a neuroscientist at the UCSF Weill Institute for Neurosciences.
The details of the device, funded by the US National Institutes of Health (NIH), were published this Thursday in the journal Cell.
The key to this innovation was learning how brain activity changes from day to day as a patient repeatedly imagines specific movements. Once the AI was programmed to take these changes into account, it worked for months on end, a first for devices of this kind.
For months, Ganguly had studied how patterns of brain activity in animals represent specific movements, and he found that these representations changed from day to day as the animals learned.
Ganguly suspected the same was happening in humans, and that this was why his BCIs so quickly lost the ability to recognize those patterns.
With this insight, his team worked with a patient who had been paralyzed by a stroke and was unable to speak or move. The man had small sensors implanted on the surface of his brain that could capture brain activity when he imagined moving.
To see whether his brain patterns changed over time, Ganguly asked him to imagine moving different parts of his body; although he could not move, his brain generated signals that the BCI recorded through the implanted sensors.
The team found that the shape of the representations in the brain remained the same, but their location shifted slightly from day to day.
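The article does not include the team's actual model, but the finding can be illustrated with a toy sketch. In the minimal Python simulation below (every name, dimension, and update rule is an assumption for illustration, not the UCSF implementation), each imagined movement has a tuning profile whose shape never changes while its location drifts a little each day, and a simple decoder stays accurate by tracking that drift with a running offset instead of being retrained:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 32   # hypothetical number of sensor channels (illustrative)
N_MOVES = 3       # imagined movements to decode, e.g. finger, hand, thumb

# Fixed "shape": one tuning profile per imagined movement.
profiles = rng.normal(size=(N_MOVES, N_CHANNELS))

def day_activity(move, drift, n=50):
    """Simulate one day's neural samples for an imagined movement.

    The profile (shape) never changes; `drift` shifts its location in
    neural state space, mimicking the day-to-day change the team observed.
    """
    noise = rng.normal(scale=0.5, size=(n, N_CHANNELS))
    return profiles[move] + drift + noise

def decode(sample, offset):
    """Nearest-profile decoder after subtracting the current drift estimate."""
    return int(np.argmin(np.linalg.norm(profiles - (sample - offset), axis=1)))

offset = np.zeros(N_CHANNELS)  # decoder's running estimate of the drift
drift = np.zeros(N_CHANNELS)   # true (hidden) drift of the representations

for day in range(1, 8):
    drift += rng.normal(scale=1.0, size=N_CHANNELS)  # representations move
    correct = total = 0
    for move in range(N_MOVES):
        for sample in day_activity(move, drift):
            pred = decode(sample, offset)
            correct += pred == move
            total += 1
            # Adapt: nudge the offset toward the residual of the decoded
            # class, so the decoder follows the shifting location instead
            # of being retrained from scratch each day.
            offset += 0.05 * (sample - profiles[pred] - offset)
    print(f"day {day}: accuracy {correct / total:.0%}")
```

In this toy setup, disabling the offset update lets the accumulated drift eventually push samples across the decoder's decision boundaries, which is the failure mode that limited earlier BCIs to a day or two of use.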
Until a door opened…
Ganguly asked the participant to imagine making simple movements with his fingers, hands, or thumbs over two weeks, while the sensors recorded his brain activity to train the AI.
Afterwards, the patient tried to control a robotic arm and hand, but the movements were still imprecise.
So Ganguly had him practice with a virtual robotic arm that gave him feedback on the accuracy of his imagined movements, until he finally got the virtual arm to do what he wanted.
When the patient began to practice with the real robotic arm, he needed only a few sessions to transfer his skills to the real world and get the robotic arm to pick up blocks, turn them, and move them to new places.
As UCSF shows in a video released on YouTube, the patient managed to open a cabinet, take out a cup, and hold it up to a water dispenser.
Months later, the patient could still control his robotic arm and needed only minimal “fine-tuning” to adjust for how his movement representations had shifted since he started using the device.
The team is now improving the AI models so that the robotic arm moves faster and more smoothly, with the aim of testing it in a home environment.