Facebook-funded mind-reading tech predicts what you want to say before you say it — a breakthrough that could help people with strokes and motor neurone disease communicate
Experts scanned participants’ brains as they listened to and answered questions
They used this to create a decoder that can interpret speech from brain activity
Both heard and actively produced speech can be decoded from cortical scans
The device could one day help people who cannot speak due to illness or injury
A new mind-reading gadget can interpret brain activity and predict what you want to say before you’ve even uttered a word — and even produce a transcript in real time.
Facebook-funded experts used brain scans of people listening to and answering questions to train a system to decode speech from the corresponding brain activity.
Dubbed the neural decoder, the technology, once refined, could be used by people who cannot speak on their own due to illness or injury.
Neurosurgeon Edward Chang of the University of California San Francisco and colleagues recorded cortical activity in the brains of three patients who had been undergoing treatment for epilepsy.
The subjects each listened to a series of questions and responded verbally, choosing from a set of previously established answers.
In this way, researchers were able to gather data on brain activity corresponding to both perceived and produced speech.
This data was then used to train a system that is capable of detecting and decoding speech from brain scans.
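The paper itself contains no code, but the training-and-decoding step can be illustrated with a toy sketch. Here, a minimal nearest-centroid classifier — an assumption standing in for the study’s far richer high-gamma likelihood models — labels simulated cortical feature vectors as corresponding to heard or spoken speech:

```python
# Toy sketch: classify simulated cortical feature vectors by nearest centroid.
# The feature values, labels, and classifier are illustrative assumptions;
# the actual study used dense multi-electrode recordings and statistical
# speech-detection models, not this simple scheme.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: dict mapping label -> list of feature vectors."""
    return {label: centroid(vecs) for label, vecs in examples.items()}

def classify(model, vector):
    """Return the label whose centroid is closest (Euclidean) to the vector."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(vector, c)) ** 0.5
    return min(model, key=lambda label: dist(model[label]))

# Simulated training data: two 'electrodes', activity during heard vs produced speech
training = {
    "heard":  [[0.9, 0.1], [0.8, 0.2], [1.0, 0.0]],
    "spoken": [[0.1, 0.9], [0.2, 0.8], [0.0, 1.0]],
}
model = train(training)
print(classify(model, [0.85, 0.15]))  # -> heard
```

A new scan is assigned whichever label its activity pattern most resembles — the same detect-then-decode principle, stripped to its simplest possible form.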
In a subsequent test, the participants were asked to listen to a series of questions and respond out loud with an answer of their own choice.
The researchers took further cortical scans during this process.
From these scans alone, the researchers were able to use the brain-decoding model they had previously developed not only to detect when the participants were listening or speaking, but also to predict what was being heard or said.
Ultimately, Dr Chang and his colleagues were able to decode both spoken and heard speech with 61 and 76 per cent accuracy, respectively.
The team found that they could improve the accuracy of their decoded answer based on their translation of scans corresponding to the initial question — because certain answers were only valid in response to specific questions.
Dubbed the neural decoder, the tech, once refined, could be used by people who cannot speak on their own due to illness or injury. Pictured: a Facebook prototype brain-scanning helmet that was not used in the present study
This is not the first study to show that speech-related brain activity in specific areas of the cortex can be decoded — but it is the first to tackle the interpretation of both listening and speaking tasks at the same time.
The researchers hope that, in the future, mind-reading devices based on their tech might be used to aid communication with people who cannot speak on their own due to illness or injury.
To do so, however, it will need to be proven that the same principle can be applied to decode answers from purely imagined — rather than actually produced — speech.
The full findings of the study — which was funded by a research contract under Facebook’s Sponsored Academic Research Agreement — were published in the journal Nature Communications.
HUMAN BRAIN WILL CONNECT TO COMPUTERS ‘WITHIN DECADES’
In a paper published in Frontiers in Neuroscience, an international team of researchers predicts groundbreaking developments in ‘human brain/cloud interfaces’ within the next several decades.
Using a combination of nanotechnology, artificial intelligence, and other more traditional computing, researchers say humans will be able to seamlessly connect their brains to networks of cloud computers to glean information from the internet in real time.
According to Robert Freitas Jr., senior author of the research, a fleet of nanobots embedded in our brains would act as liaisons between human minds and supercomputers, enabling ‘matrix-style’ downloading of information.
‘These devices would navigate the human vasculature, cross the blood-brain barrier, and precisely autoposition themselves among, or even within brain cells,’ explains Freitas.
‘They would then wirelessly transmit encoded information to and from a cloud-based supercomputer network for real-time brain-state monitoring and data extraction.’
The interfaces wouldn’t just stop at linking humans and computers, say researchers. A network of brains could also help form what they call a ‘global superbrain’ that would allow for collective thought.