For people without the use of their limbs, interacting with technology is vital, but difficult. Voice control can solve some problems, but for some people even that is not possible. Researchers at Georgia Institute of Technology now believe they have taken a major step towards the science fiction goal of being able to control machinery and electronics directly just by thinking about it.
Brain-machine interface (BMI) control is already available, to a degree. Neurologists have devised methods for using electroencephalography (EEG) signals to control external devices, but these have required extremely awkward equipment: electrode-studded caps, sheaves of wires, and messy adhesives and liquids to ensure skin-to-electrode contact. The Georgia Tech team, who describe their research in Nature Machine Intelligence, have devised a replacement for all this in the form of a high-resolution EEG monitoring system built into a miniaturised, flexible printed sensor.
Principal researcher Woon-Hong Yeo and his colleagues combined three essential components to form their system: highly flexible, hair-mounted electrodes that make direct contact with the scalp, an ultra-thin nanomembrane electrode, and soft, flexible circuitry with a Bluetooth telemetry unit. The recorded EEG signals are processed within the flexible circuitry, then transmitted via Bluetooth to a tablet computer up to 15 m away.
One set of challenges the team had to deal with was the very low signal amplitude the electrodes can detect, combined with the high degree of variation between individual brains. Detecting the signals and interpreting them correctly is vital to understanding what the user wants the system to do. To do this, the team turned to AI. "Deep learning methods, commonly used to classify pictures of everyday things such as cats and dogs, are used to analyse the EEG signals," explained Chee-Siang Ang, a collaborator on the project from the University of Kent. "Like pictures of a dog which can have a lot of variations, EEG signals have the same challenge of high variability. Deep learning methods have proven to work well with pictures, and we show that they work very well with EEG signals as well."
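The classification idea Ang describes can be sketched in miniature. The toy example below trains a tiny one-hidden-layer neural network, written from scratch in NumPy, to separate two classes of noisy synthetic "EEG" epochs. The frequencies, network size and training settings are all illustrative assumptions; the team's actual models are far larger and operate on real multi-channel recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes of synthetic 1-second, 64-sample "EEG" epochs: class 0
# carries a 10 Hz component, class 1 a 20 Hz component, both buried in
# noise to mimic the low signal-to-noise ratio of scalp recordings.
def make_epochs(n, freq):
    t = np.linspace(0, 1, 64, endpoint=False)
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 1.0, (n, 64))

X = np.vstack([make_epochs(200, 10), make_epochs(200, 20)])
y = np.repeat([0, 1], 200)

# One hidden layer of 16 tanh units, trained with plain gradient
# descent on a binary cross-entropy loss.
W1 = rng.normal(0, 0.1, (64, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1));  b2 = np.zeros(1)

for _ in range(1000):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    z = np.clip(h @ W2 + b2, -30, 30)         # logits, clipped for stability
    p = 1 / (1 + np.exp(-z))                  # class-1 probability
    grad = (p - y[:, None]) / len(X)          # dLoss/dLogit for BCE
    gW2 = h.T @ grad; gb2 = grad.sum(0)
    gh = grad @ W2.T * (1 - h ** 2)           # back-propagate through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    for P, G in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        P -= 0.5 * G                          # gradient-descent step

h = np.tanh(X @ W1 + b1)
p = 1 / (1 + np.exp(-np.clip(h @ W2 + b2, -30, 30)))
acc = float(((p[:, 0] > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```

Even this minimal network separates the two classes easily because the class-specific oscillations, though noisy, are consistent across epochs, which is the same regularity a deep model exploits in genuine EEG.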
The same deep-learning approach was also effective at identifying which electrodes were the most useful for gathering the information needed to classify EEG signals, Ang added. "We found that the model is able to identify the relevant locations in the brain for BMI, which is in agreement with human experts. This reduces the number of sensors we need, cutting cost and improving portability."
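Ang's point about pinpointing the relevant sensor locations can also be illustrated with a toy example. In the sketch below, only two of eight simulated channels carry any class-dependent signal, and a simple between-class separability score, a crude stand-in for the model-based analysis the team used, recovers them. The channel indices and frequencies are invented for the illustration and have nothing to do with the published study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 8-channel recording: only channels 2 and 5 carry a
# class-dependent oscillation; the other six record pure noise.
n, samples = 300, 32
t = np.linspace(0, 1, samples, endpoint=False)
y = np.repeat([0, 1], n // 2)
X = rng.normal(0.0, 1.0, (n, 8, samples))
for ch, f0, f1 in ((2, 10, 15), (5, 20, 25)):
    X[y == 0, ch] += np.sin(2 * np.pi * f0 * t)   # class-0 signature
    X[y == 1, ch] += np.sin(2 * np.pi * f1 * t)   # class-1 signature

# Score each channel by the energy of the between-class difference of
# its average waveform: channels whose mean response differs most
# between classes contribute most to classification.
diff = X[y == 0].mean(axis=0) - X[y == 1].mean(axis=0)  # shape (8, samples)
scores = (diff ** 2).sum(axis=1)
ranking = np.argsort(scores)[::-1]
top_two = sorted(int(c) for c in ranking[:2])
print("most informative channels:", top_two)
# prints: most informative channels: [2, 5]
```

As in the study, ranking channels this way suggests which sensors can be dropped with little loss of information, which is what lets the headgear shrink.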
The system uses three elastomeric scalp electrodes held onto the head with a fabric band, ultra-thin wireless electrodes stuck to the back of the neck, and a skin-like printed electrode stuck below one ear. "Typical EEG systems must cover the majority of the scalp to get signals, but potential users may be sensitive about wearing them," said Yeo. "This miniaturized, wearable soft device is fully integrated and designed to be comfortable for long-term use." Tests with six human subjects (none of whom were disabled) showed the system could control an electric wheelchair, a small robotic vehicle and a display system without the need for a mouse or joystick. "Future study would focus on investigation of fully elastomeric, wireless self-adhesive electrodes that can be mounted on the hairy scalp without any support from headgear, along with further miniaturization of the electronics to incorporate more electrodes for use with other studies," Yeo said.
The system could also be used for other research where EEG is needed but can be awkward to implement, such as sleep studies. Yeo is working with colleagues at Georgia Tech on such studies.