Novels like “Neuromancer” made it seem like we were just years away from creating a functional brain-computer interface (BCI) that would let us enter a shared virtual reality. But the latest phase of a study that translates attempts at conversation from a speech-impaired, paralyzed patient into words on a screen shows just how far we have to go before making a neural connection with computers.

“Trying to get the computer program to decipher the intended movement based solely on signals recorded from the cortex is like you or me trying to piece together the meaning of a sentence that is missing many important words,” Edelle Field-Fote, director of spinal cord injury research at Shepherd Center, told Lifewire in an email interview. “Sometimes we will correctly guess the missing words based on the context, and other times we will not.”
Reading Thoughts
The latest phase of a years-long, Facebook-funded study at the University of California, San Francisco (UCSF), run out of the lab of neurosurgeon Dr. Edward Chang, recently reported progress in reading the thoughts of a paralyzed patient. The study involved a paralyzed man who had suffered a brainstem stroke. With an electrode patch implanted over the area of the brain associated with controlling the vocal tract, the man tried to answer questions displayed on a screen. The study’s machine-learning algorithms were able to recognize 50 words and convert them into sentences in real time.

“To our knowledge, this is the first successful demonstration of direct decoding of full words from the brain activity of someone who is paralyzed and cannot speak,” said Chang in a news release.

Researchers have high hopes that such research eventually could translate into practical benefits for patients. “The ability to capture signals from the brain means that the information can be processed by the computer and used to control devices,” Field-Fote said. “These devices can be used by individuals who, because of injury or health disorder, have lost the linkage between the brain and the muscles, whether it’s muscles that control speech, arms, or legs.”
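The pipeline the study describes, cortical signals classified into a fixed 50-word vocabulary and strung into sentences, can be illustrated with a toy sketch. Everything here is invented for illustration: the five-word stand-in vocabulary, the synthetic “neural signature” vectors, and the nearest-centroid decoding rule. The actual UCSF system uses deep neural networks and language models over real electrode recordings, which this does not attempt to reproduce.

```python
import random

# Toy stand-in for the study's 50-word vocabulary.
VOCAB = ["hello", "thirsty", "family", "good", "water"]

def make_prototype(word, dim=16):
    """Hypothetical per-word 'neural signature': a fixed pseudo-random vector,
    deterministically seeded from the word's characters."""
    rng = random.Random(sum(ord(c) for c in word))
    return [rng.gauss(0.0, 1.0) for _ in range(dim)]

PROTOTYPES = {w: make_prototype(w) for w in VOCAB}

def decode_frame(features):
    """Pick the vocabulary word whose prototype is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(VOCAB, key=lambda w: dist(features, PROTOTYPES[w]))

def decode_sentence(frames):
    """Decode a sequence of feature frames into a word sequence."""
    return " ".join(decode_frame(f) for f in frames)

# Simulate noisy recordings of attempts at "hello" and "family".
rng = random.Random(42)
frames = [[x + rng.gauss(0.0, 0.1) for x in PROTOTYPES["hello"]],
          [x + rng.gauss(0.0, 0.1) for x in PROTOTYPES["family"]]]
print(decode_sentence(frames))  # → "hello family"
```

With low noise the decoder recovers the intended words; with heavier noise it guesses wrong, which is exactly the “sentence missing important words” problem Field-Fote describes.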
A Tesla for Your Brain?
Elon Musk’s Neuralink company has been making advances in BCIs. Researchers there have developed sophisticated automated surgical robots to implant one or more BCIs beneath the skull of, to date, pigs and monkeys, with no apparent adverse medical impact. Matt Lewis, a research director at the security company NCC Group, told Lifewire in an email interview that this includes the successful removal of BCIs, showing the process can be safely reversed. Neuralink’s monkeys have also learned to play the video game Pong through thought alone, with considerable accuracy.

Beyond supporting those with disabilities, there is growing interest in using BCIs to enhance everyday activities, such as composing text by thought, which, under the right conditions, can be much quicker than typing, Lewis said.

“There are also a myriad of other interesting applications such as the use of thought in video games (rather than having to use a controller),” he added. “And where two users have BCI in proximity, the ability to be able to simulate a form of telepathy, whereby users communicate with each other simply through thought and use of BCI encoding and decoding of those thoughts.”

Chang said the trial would be expanded to include more participants affected by severe paralysis and communication deficits. The team is currently working to increase the number of words in the available vocabulary and improve the rate of speech.

But the acceleration of BCIs goes hand in hand with machine learning, Lewis said. “The BCI needs to train and learn brain activity, per user, to understand what parts of the brain and what types of activity correlate with specific thoughts and actions,” he added. “Users will need to train an application before it matches with their expectations.”
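Lewis’s point that a BCI must learn each user’s brain activity can be sketched as a simple calibration loop: the user repeatedly attempts known target actions while the system records feature vectors, then stores that user’s average signal per action. The actions, two-dimensional “signals,” and mean-based model below are all hypothetical simplifications; real systems train far richer models on high-dimensional recordings.

```python
import random

def train_user_model(labeled_samples):
    """labeled_samples: list of (action, feature_vector) pairs gathered during
    calibration. Returns this user's mean feature vector per action."""
    sums, counts = {}, {}
    for action, vec in labeled_samples:
        acc = sums.setdefault(action, [0.0] * len(vec))
        for i, x in enumerate(vec):
            acc[i] += x
        counts[action] = counts.get(action, 0) + 1
    return {a: [x / counts[a] for x in s] for a, s in sums.items()}

def classify(model, features):
    """At runtime, the action whose stored mean is nearest wins."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda a: dist(features, model[a]))

# Simulated calibration session: each intended action produces a distinct
# (noisy) underlying signal for this particular user.
rng = random.Random(1)
true_signal = {"left": [1.0, 0.0], "right": [0.0, 1.0]}
samples = [(a, [x + rng.gauss(0.0, 0.1) for x in true_signal[a]])
           for a in true_signal for _ in range(20)]
model = train_user_model(samples)
print(classify(model, [0.95, 0.05]))  # → "left"
```

Because the stored means are specific to the signals gathered during calibration, a model trained on one user would misread another, which is why, as Lewis notes, each user must train the application before it matches their expectations.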