World-first tech translates thoughts into text

In a world-first, researchers in Australia have developed a portable, non-invasive system that can decode thoughts and turn them into text.

UTS researcher tests DeWave technology - UTS

The technology could aid communication for people who are unable to speak, as well as enable communication between humans and machines, such as in the operation of a bionic arm or robot.

The research was led by Distinguished Professor CT Lin, director of the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS), together with first author Yiqun Duan and fellow PhD candidate Jinzhou Zhou from the UTS Faculty of Engineering and IT.

In the study, participants silently read passages of text while wearing a cap that recorded electrical brain activity through the scalp using an electroencephalogram (EEG).

The EEG wave is segmented into distinct units that capture specific characteristics and patterns of brain activity. This is done by an AI model called DeWave, developed by the researchers, which translates EEG signals into words and sentences by learning from large quantities of EEG data.

“This research represents a pioneering effort in translating raw EEG waves directly into language, marking a significant breakthrough in the field,” Professor Lin said in a statement. “It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding. The integration with large language models is also opening new frontiers in neuroscience and AI.”
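The "discrete encoding" Professor Lin describes is not detailed in the article, but it is commonly implemented as vector quantisation: each EEG segment's feature vector is mapped to the nearest entry in a learned codebook, producing a sequence of discrete tokens that a language model can then decode. The sketch below illustrates that general idea only; the codebook size, feature dimension, and random values are illustrative stand-ins, not DeWave's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned codebook: 512 discrete codes, 64-dim features.
codebook = rng.normal(size=(512, 64))

def encode(segments: np.ndarray) -> np.ndarray:
    """Map each EEG segment (feature vector) to the index of the
    nearest codebook entry, yielding one discrete token per segment."""
    # Pairwise distances between every segment and every code.
    dists = np.linalg.norm(segments[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Ten (synthetic) EEG segments become ten discrete tokens.
tokens = encode(rng.normal(size=(10, 64)))
print(tokens.shape)  # (10,)
```

In a real system those token sequences, rather than raw waveforms, would be fed to a language model for translation into text.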

According to UTS, previous technology to translate brain signals to language required surgery to implant electrodes in the brain or scanning in an MRI machine.

These methods also struggle to transform brain signals into word-level segments without additional aids such as eye-tracking, restricting the practical application of such systems. The new technology can be used either with or without eye-tracking.

UTS added that the research was carried out with 29 participants, making it likely to be more robust and adaptable than previous decoding technology that has been tested on one or two subjects, because EEG waves differ between individuals. 

The use of EEG signals received through a cap rather than implants makes the signal noisier, yet the study still surpassed previous benchmarks for EEG translation.

“The model is more adept at matching verbs than nouns. However, when it comes to nouns, we saw a tendency towards synonymous pairs rather than precise translations, such as ‘the man’ instead of ‘the author’,” said Duan. “We think this is because when the brain processes these words, semantically similar words might produce similar brain wave patterns. Despite the challenges, our model yields meaningful results, aligning keywords and forming similar sentence structures.”

The translation accuracy score is currently around 40 per cent on BLEU-1. The BLEU score is a number between zero and one that measures the similarity of the machine-translated text to a set of high-quality reference translations. The researchers hope to see this improve to a level that is comparable to traditional language translation or speech recognition programs, which is closer to 90 per cent.
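BLEU-1 is essentially unigram precision: the fraction of words in the machine output that also appear in the reference, with each word's count clipped to its count in the reference and a brevity penalty applied to overly short outputs. A minimal sketch (function name and example sentences are illustrative):

```python
import math
from collections import Counter

def bleu_1(candidate: str, reference: str) -> float:
    """BLEU-1: clipped unigram precision times a brevity penalty."""
    cand = candidate.split()
    ref = reference.split()
    if not cand:
        return 0.0
    ref_counts = Counter(ref)
    # Clip each candidate word's count by its count in the reference.
    overlap = sum(min(n, ref_counts[w]) for w, n in Counter(cand).items())
    precision = overlap / len(cand)
    # Penalise candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision

# Echoes the article's synonym example: 'the man' vs 'the author'.
print(bleu_1("the man wrote a book", "the author wrote a book"))  # 0.8
```

A score around 0.4 therefore means roughly 40 per cent of the decoded words match the reference text at the single-word level.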

The research follows on from previous brain-computer interface technology developed by UTS in association with the Australian Defence Force that uses brainwaves to command a quadruped robot.