Imagine an artificial intelligence program that can deduce what you’re hearing by reading your brain activity. You won’t have to imagine for much longer: Meta is developing a program that it hopes can be used in the near future to assist people who are unable to communicate through speech, gestures, or even typing.
As reported by Jonathan Moens of ScienceNews.org, researchers have so far developed an AI that predicts speech by inferring what it believes people are hearing from their brain activity data. The program works from only a few seconds of activity, yet in the preliminary study it was up to seventy-three percent accurate, according to researchers.
According to that report, the goal of the AI program is ultimately to assist minimally conscious patients or those in vegetative states. Right now, most technology that aims to help patients who cannot communicate normally carries the risks of brain surgery, because electrodes must be implanted. By instead working from non-invasively recorded brain activity, this approach poses minimal risk, since no device has to be implanted in the patient’s brain.
Neuroscientist Jean-Rémi King, a Meta AI researcher currently at the École Normale Supérieure in Paris, told Science News that the program “…could provide a viable path to help patients with communication deficits … without the use of invasive methods.”
The artificial intelligence program was first trained, using a computational tool, to detect words and sentences in over fifty-six thousand hours of speech recordings spanning fifty-three languages. That is a great deal of data for a program to work through, but it needed that much to learn to recognize features of human speech at multiple levels and in multiple contexts, from basic words to entire sentences.
The team then applied the program to a pre-existing database of brain activity from one hundred and sixty-nine volunteers, whose brain waves had been recorded using magnetoencephalography (MEG) or electroencephalography (EEG). Both techniques measure the magnetic or electric components of brain activity non-invasively. Comparing the two, the team found that the AI program performed better on the MEG data than on the EEG readings.
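The report doesn’t include the researchers’ code, but the core idea is a matching problem: embed a snippet of brain activity and pick the candidate speech segment whose embedding is most similar. Purely as an illustration, here is a minimal sketch of that idea using made-up vectors and cosine similarity (the actual model, embeddings, and scoring method are not specified here and would differ):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def decode(brain_embedding, speech_embeddings):
    """Return the index of the candidate speech segment whose
    embedding is most similar to the brain-signal embedding."""
    scores = [cosine_similarity(brain_embedding, s) for s in speech_embeddings]
    return int(np.argmax(scores))

# Toy example: three hypothetical 4-dimensional speech-segment embeddings.
speech = [np.array([1.0, 0.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 0.0, 0.0]),
          np.array([0.0, 0.0, 1.0, 0.0])]

# A noisy hypothetical brain-signal embedding, closest to candidate 1.
brain = np.array([0.1, 0.9, 0.1, 0.0])

print(decode(brain, speech))  # → 1
```

An accuracy figure like the seventy-three percent above would then mean the correct segment was selected this way in that fraction of trials.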
Because MEG machines remain quite expensive, it’s unlikely that this technology will reach the clinic anytime soon. But it is an interesting step forward that could one day give a voice to those who find themselves completely unable to communicate with the outside world due to accident or illness.