Scientists used scalp electrodes to monitor brain signals and then computer algorithms to pick out which song was being heard. The study is the latest in a growing number of projects to decode human brain waves using computers. Efforts to interpret brain waves are coming close to fruition, experts say. “Are we capable of decoding neural representations in a way that is of practical value for humans?” Harvard neurology researcher Richard Hakim said in a phone interview. “The answer is we’re kind of there.”
Listening in the Dark
In a recent study, Derek Lomas at Delft University of Technology in the Netherlands and his colleagues asked 20 people to listen to 12 songs through headphones. The room was darkened, and the volunteers were blindfolded. Each participant was monitored with an electroencephalography (EEG) machine, an instrument that noninvasively picks up electrical activity at the scalp while the songs play. “The performance observed gives appropriate implication towards the notion that listening to a song creates specific patterns in the brain, and these patterns vary from person to person,” the paper’s authors wrote.

An artificial neural network was reportedly trained to identify the connections between the brain wave data and the music. The neural network was nearly 85% accurate in predicting which song was being played.

However, Hakim said that the EEG machine used in the study is too blunt an instrument to reveal much about the brain, because it sits outside the head. “The problem is that [it’s] so far away from the brain that there’s a lot of stuff in between, and it’s really fuzzy,” he added. “It’s sort of like going to a soccer arena and listening to what the crowd is yelling. You know very roughly where things are happening, but not what they are talking about.”

A more accurate way to measure brain activity is to insert probes through the skull, Hakim said. However, understandably, not many people sign up for that kind of experiment. “I mostly work on mice,” he added.
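For readers curious what that classification step might look like in code, here is a minimal sketch in Python using scikit-learn: per-trial EEG features feed into a small feed-forward neural network that learns to predict which of the 12 songs was playing. Everything in it, from the synthetic data to the feature count and network size, is an illustrative assumption rather than the study’s actual pipeline.

    # Hypothetical sketch of an EEG song-classification pipeline:
    # per-trial EEG features -> small neural network -> predicted song.
    # The data below is synthetic; the feature choices and network size
    # are illustrative assumptions, not the study's published code.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    N_SONGS = 12        # songs used in the study
    N_TRIALS = 20 * 12  # hypothetical: one listening trial per participant per song
    N_FEATURES = 64     # hypothetical: e.g., band-power values across EEG channels

    # Synthetic stand-in for EEG features: each song gets its own "signature"
    # pattern, and every trial is that signature plus noise, so the classifier
    # has something learnable to work with.
    signatures = rng.normal(size=(N_SONGS, N_FEATURES))
    labels = np.repeat(np.arange(N_SONGS), N_TRIALS // N_SONGS)
    features = signatures[labels] + rng.normal(scale=1.0, size=(N_TRIALS, N_FEATURES))

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.25, stratify=labels, random_state=0
    )

    # A small feed-forward network, analogous in spirit to the artificial
    # neural network mentioned above.
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
    clf.fit(X_train, y_train)

    print(f"song-classification accuracy: {accuracy_score(y_test, clf.predict(X_test)):.1%}")

On real recordings, the features would of course come from the measured EEG rather than random numbers, and the accuracy would depend heavily on preprocessing and on how well the signal survives the trip through the skull that Hakim describes.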
Elon Wants to Neuralink You Up
The music study is only one of many recent efforts to use computers to decode what people are thinking. The research could lead to technology that one day helps people with disabilities manipulate objects using their minds.

For example, Elon Musk’s Neuralink project aims to produce a neural implant that would let users control a computer or mobile device wherever they go. Tiny threads are inserted into areas of the brain that control movement; each thread contains many electrodes and is connected to an implanted computer. “The initial goal of our technology will be to help people with paralysis to regain independence through the control of computers and mobile devices,” according to the project’s website. “Our devices are designed to give people the ability to communicate more easily via text or speech synthesis, to follow their curiosity on the web, or to express their creativity through photography, art, or writing apps.”

Brain-machine interfaces might even one day make video games more realistic. Gabe Newell, the co-founder and president of video game giant Valve, said recently that his company is trying to connect human brains to computers. The company is working to develop open-source brain-computer interface software, he said. One possible use for the technology would be to give players a more direct connection to the games they are playing. Newell also suggested that such interfaces could be used to control bodily functions like sleep.

These are exciting times in the human-machine interface field. I often feel that a computer hooked up to my brain would come in handy. Please make mine noninvasive, though.