Mind-reading AI turns thoughts into text
Dec. 12, 2023.
A portable, non-invasive, AI-based mind-reading system translates EEG signals into text by learning from large quantities of EEG data
Researchers at the University of Technology Sydney (UTS) have developed a portable, non-invasive system that can decode silent thoughts and turn them into text.
The technology could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis. It could also enable seamless communication between humans and machines, such as operating a bionic arm or robot.
The research, conducted at the GrapheneX-UTS Human-centric Artificial Intelligence (HAI) Centre, was led by its director, Distinguished Professor CT Lin.
AI learning process
In the study, participants silently read passages of text while wearing a cap that recorded electrical brain activity through the scalp, using an electroencephalogram (EEG). An AI model called DeWave, developed by the researchers, translates these EEG signals into words and sentences.
Source: University of Technology Sydney. Meeting: Neural Information Processing Systems. Funders: Australian Research Council, GrapheneX.
“It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding,” Lin said.
No need for invasive Neuralink-type implants or an MRI machine
He also said that, unlike earlier brain-to-language technologies, the new system does not require surgery to implant electrodes in the brain, as with Elon Musk’s Neuralink, or scanning in an MRI machine, which is large, expensive, and difficult to use in daily life.
The UTS research was carried out with 29 participants, making it likely to be more robust and adaptable than previous decoding technology, which has only been tested on one or two individuals at a time.
Accuracy to be improved
The translation accuracy score is currently around 40% on BLEU-1, a metric ranging from zero to one that measures how closely machine-translated text matches a set of high-quality reference translations. The researchers hope to see this improve to a level comparable to traditional language translation or speech recognition programs, which is closer to 90%.
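To make the metric concrete, here is a minimal sketch of how a BLEU-1 score can be computed: clipped unigram precision multiplied by a brevity penalty. This is an illustrative implementation for a single candidate and reference sentence, not the researchers' evaluation code.

```python
from collections import Counter
import math

def bleu_1(candidate: str, reference: str) -> float:
    """Unigram BLEU: clipped word-level precision times a brevity penalty."""
    cand = candidate.split()
    ref = reference.split()
    if not cand:
        return 0.0
    cand_counts = Counter(cand)
    ref_counts = Counter(ref)
    # Each candidate word counts only up to its frequency in the reference.
    clipped = sum(min(n, ref_counts[w]) for w, n in cand_counts.items())
    precision = clipped / len(cand)
    # Penalize candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision
```

A perfect match scores 1.0, while repeating one correct word scores poorly because the clipping caps its credit, which is why a 40% score still leaves substantial room for improvement.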
The research follows on from previous brain-computer interface technology developed by UTS in association with the Australian Defence Force. It uses brainwaves to command a quadruped robot, demonstrated in this ADF video.
The paper will be presented at the NeurIPS conference in New Orleans on December 12, 2023, and will be linked here.