Brain-Machine Interface Device Predicts Internal Speech

Summary: A new brain-machine interface is the most accurate to date at predicting a person’s internal monologue. The technology could help people with disorders that affect speech to communicate effectively.

Source: Caltech

New Caltech research is showing how devices implanted into people’s brains, called brain-machine interfaces (BMIs), could one day help patients who have lost their ability to speak.

In a new study presented at the 2022 Society for Neuroscience conference in San Diego, the researchers demonstrated that they could use a BMI to accurately predict which words a tetraplegic participant was merely thinking, not speaking or miming.

“You may already have seen videos of people with tetraplegia using BMIs to control robotic arms and hands, for example to grab a bottle and to drink from it or to eat a piece of chocolate,” says Sarah Wandelt, a Caltech graduate student in the lab of Richard Andersen, James G. Boswell Professor of Neuroscience and director of the Tianqiao and Chrissy Chen Brain-Machine Interface Center at Caltech.

“These new results are promising in the areas of language and communication. We used a BMI to reconstruct speech,” says Wandelt, who presented the results at the conference on November 13.

Previous studies have had some success at predicting participants’ speech by analyzing brain signals recorded from motor areas when a participant whispered or mimed words. But predicting internal speech, what somebody is thinking, is much more difficult because it does not involve any movement, Wandelt explains.

“In the past, algorithms that tried to predict internal speech have only been able to predict three or four words and with low accuracy or not in real time,” Wandelt says.

The new research is the most accurate yet at predicting internal words. In this case, brain signals were recorded from single neurons in a brain area called the supramarginal gyrus, located in the posterior parietal cortex. The researchers had found in a previous study that this brain area represents spoken words.

Now, the team has extended its findings to internal speech. In the study, the researchers first trained the BMI device to recognize the brain patterns produced when certain words were spoken internally, or thought, by the tetraplegic participant. This training period took about 15 minutes.
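
The article does not describe the decoding algorithm itself, so the following is only a minimal, hypothetical sketch of what such a training step could look like, assuming firing-rate features from the implanted arrays and an off-the-shelf linear classifier (scikit-learn). The neuron count, trial counts, and word labels are placeholders, not the study’s.

```python
# Hypothetical sketch of the ~15-minute training step (not the study's actual
# pipeline): learn a mapping from single-neuron firing rates recorded while a
# word is thought to the identity of that word.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

N_NEURONS = 64                              # number of recorded units (illustrative)
TRIALS_PER_WORD = 8                         # repetitions collected during training (illustrative)
WORDS = [f"word_{i}" for i in range(8)]     # stand-ins for the six words + two pseudowords

# One firing-rate vector per trial; real features would come from the
# microelectrode arrays in the supramarginal gyrus.
X_train = rng.normal(size=(len(WORDS) * TRIALS_PER_WORD, N_NEURONS))
y_train = np.repeat(np.arange(len(WORDS)), TRIALS_PER_WORD)

decoder = LinearDiscriminantAnalysis()
decoder.fit(X_train, y_train)               # the BMI's "training period"
```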

They then flashed a word on a screen and asked the participant to say the word internally. The results showed that the BMI algorithms were able to predict eight words with an accuracy of up to 91 percent.
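
Continuing the hypothetical sketch above, the online test phase could be framed as predicting the cued word from each trial’s firing rates and comparing overall accuracy with the 12.5 percent chance level for an eight-word vocabulary; all numbers here are illustrative, not the study’s results.

```python
# Continuing the sketch: in the online phase, each cued word is predicted from
# that trial's firing rates, and overall accuracy is compared with chance.
def decode_trial(firing_rates: np.ndarray) -> str:
    """Return the predicted word for a single trial's firing-rate vector."""
    idx = int(decoder.predict(firing_rates.reshape(1, -1))[0])
    return WORDS[idx]

# Illustrative held-out trials (random here, so accuracy will sit near chance).
X_test = rng.normal(size=(40, N_NEURONS))
y_test = rng.integers(0, len(WORDS), size=40)

accuracy = float(np.mean(decoder.predict(X_test) == y_test))
chance = 1.0 / len(WORDS)                   # 1/8 = 12.5% for eight items
print(f"accuracy = {accuracy:.1%}  (chance = {chance:.1%})")
```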

The work is still preliminary but could help patients with brain injuries, paralysis, or diseases such as amyotrophic lateral sclerosis (ALS) that affect speech.

“Neurological disorders can lead to complete paralysis of voluntary muscles, resulting in patients being unable to speak or move, but they are still able to think and reason. For that population, an internal speech BMI would be incredibly helpful,” Wandelt says.

“We have previously shown that we can decode imagined hand shapes for grasping from the human supramarginal gyrus,” says Andersen. “Being able to also decode speech from this area suggests that one implant can recover two important human abilities: grasping and speech.”

The researchers also point out that BMIs cannot be used to read people’s minds: the device would need to be trained separately on each person’s brain, and it only works when the person focuses on the word.

Other Caltech study authors besides Wandelt and Andersen include David Bjanes, Kelsie Pejsa, Brian Lee, and Charles Liu. Lee and Liu are Caltech visiting associates who are on the faculty of the Keck School of Medicine at USC.

About this neurotech research news

Author: Whitney Clavin
Source: Caltech
Contact: Whitney Clavin – Caltech
Image: The image is in the public domain

Original Research: Closed access.
“Online internal speech decoding from single neurons in a human participant” by Sarah Wandelt et al. medRxiv


Abstract

Online internal speech decoding from single neurons in a human participant

Speech brain-machine interfaces (BMIs) translate brain signals into words or audio outputs, enabling communication for people who have lost their speech abilities due to disease or injury.

While important advances in vocalized, attempted, and mimed speech decoding have been achieved, results for internal speech decoding are sparse, and have yet to achieve high functionality. Notably, it is still unclear from which brain areas internal speech can be decoded.

In this work, a tetraplegic participant with implanted microelectrode arrays located in the supramarginal gyrus (SMG) and primary somatosensory cortex (S1) performed internal and vocalized speech of six words and two pseudowords.

We found robust internal speech decoding from SMG single neuron activity, achieving up to 91% classification accuracy during an online task (chance level 12.5%).

Evidence of shared neural representations between internal speech, word reading, and vocalized speech processes was found. SMG represented words in different languages (English/Spanish) as well as pseudowords, providing evidence for phonetic encoding.

Furthermore, our decoder achieved high classification accuracy with multiple internal speech strategies (auditory imagination/visual imagination). Activity in S1 was modulated by vocalized but not internal speech, suggesting that no articulator movements of the vocal tract occurred during internal speech production.

This work represents the first proof-of-concept for a high-performance internal speech BMI.
