
Brain implant converts thoughts into words to help a paralyzed man ‘speak’ again

UCSF’s brain-computer interface is surgically placed directly on the patient’s motor cortex to enable communication.

Ken Probst, UCSF

Facebook’s work on neural input technology for AR and VR seems to be moving in a more wrist-oriented direction, but the company continues to fund research on brain-computer interfaces. The latest phase of the Facebook-funded UCSF study, called Project Steno, translates a speech-impaired, paralyzed patient’s attempts at conversation into words on a screen.

Dr. David Moses, lead author of a study published Wednesday in the New England Journal of Medicine, said that words someone is naturally trying to say can be decoded from brain activity. “Hopefully, this is proof of principle for direct speech control of a communication device,” he said.

Brain-computer interfaces (BCIs) have been behind many recent breakthroughs, including Stanford research that can turn imagined handwriting into on-screen text. The UCSF study takes a different approach, analyzing actual attempts at speech and acting almost like a translator.

In this study, led by UCSF neurosurgeon Dr. Edward Chang, a man who was left paralyzed by a brainstem stroke at age 20 was fitted with a “neuroprosthesis” of electrodes placed over his speech motor cortex. The man tried to respond to questions displayed on a screen, and UCSF’s machine learning algorithms can recognize up to 50 words and convert them into sentences in real time. For example, when the patient was asked “How are you today?”, the response “I am very good” appeared on screen.
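To make that kind of closed-vocabulary decoding concrete, here is a deliberately simplified Python sketch: a classifier scores short windows of neural features against a fixed word list and strings the best-scoring words into a sentence. The word list, feature shapes, and nearest-centroid scoring are illustrative assumptions only, not the actual UCSF models or vocabulary.

```python
import numpy as np

# Hypothetical subset of a 50-word closed vocabulary (not UCSF's actual word list).
VOCAB = ["i", "am", "very", "good", "how", "are", "you", "today",
         "hello", "thirsty", "family", "nurse", "yes", "no", "help"]

class ToyWordDecoder:
    """Nearest-centroid classifier over windows of neural features.

    Each attempted word is assumed to yield one fixed-length feature vector
    (e.g., averaged activity across electrodes). This is a sketch of the idea,
    not the study's actual neural-network decoder.
    """

    def __init__(self, vocab):
        self.vocab = vocab
        self.centroids = {}  # word -> mean feature vector from training trials

    def fit(self, features, labels):
        # features: (n_trials, n_features); labels: index into vocab per trial.
        for idx, word in enumerate(self.vocab):
            trials = features[labels == idx]
            if len(trials):
                self.centroids[word] = trials.mean(axis=0)

    def decode_window(self, window):
        # Pick the vocabulary word whose training centroid is closest.
        return min(self.centroids,
                   key=lambda w: np.linalg.norm(window - self.centroids[w]))

    def decode_sentence(self, windows):
        # Decode each attempted-word window and join the results.
        return " ".join(self.decode_window(w) for w in windows)
```

In the real system, each decoded word would appear on screen as the patient attempts to say it; the sketch's `decode_sentence` stands in for that streaming step.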

Moses clarified that the work is meant to continue beyond Facebook’s funding phase and that there is still a great deal left to do. It is currently unclear how much of the speech recognition comes from recorded patterns of brain activity, from attempted vocalizations, or from a combination of the two.

Moses also clarified that the work is not mind reading in the way some other BCI efforts aim to be: it relies on sensing brain activity that occurs specifically when someone attempts a particular behavior, such as speaking. The UCSF team’s work does not yet translate to noninvasive neural interfaces. Elon Musk’s Neuralink promises wireless transmission of data from implanted electrodes for future research and assistive use, but so far the technology has only been demonstrated in a monkey.

Facebook Reality Labs’ head-worn BCI research prototype, which has no implanted electrodes and which the company plans to make available for open-source research.

Facebook

Meanwhile, Facebook Reality Labs Research is moving away from head-worn brain-computer interfaces for future VR/AR headsets and will focus in the near term on wrist-worn devices based on technology from CTRL-Labs. Facebook Reality Labs has its own noninvasive head-worn research prototypes for studying brain activity, and the company has announced that it will make them available to open-source research projects focused on head-mounted neural hardware. (UCSF is receiving funding from Facebook, but no hardware.)

“Aspects of the optical head-mounted work apply to our EMG research at the wrist. We use optical BCI as a research tool to develop better wrist-based sensor models and algorithms. That’s one reason we’re sharing our head-mounted hardware prototypes with other researchers, so they can apply our findings to other use cases,” a Facebook spokesperson confirmed via email.

However, consumer-facing neural input technology is still in its infancy. Although there are consumer devices that use noninvasive head- or wrist-worn sensors, they are far less accurate than implanted electrodes.
