Researchers at Glasgow University have identified how the brain encodes the visual information that enables human beings to recognise faces and scenes.
Brainwaves — the patterns of electrical activity created in the brain when it is engaged in different activities — can easily be measured using electroencephalography (EEG).
But while it was previously possible to detect EEG activity in certain areas of the brain when particular tasks were performed, until now it has been impossible to know what information those brainwaves carried, according to Prof Philippe Schyns, director of Glasgow University’s Institute of Neurosciences and Psychology.
To decode some of these brainwaves, the scientists at Glasgow recruited six volunteers and presented them with images of people’s faces displaying different emotions, such as happiness, fear and surprise.
On different experimental trials, parts of the images were randomly covered so that, for example, only the eyes or mouth were visible. The volunteers were then asked to identify the emotion being displayed.
While the participants performed this task, their brainwaves were measured using EEG, allowing the researchers to identify which parts of the brain were active as the volunteers looked at different parts of the face.
In the trial, the researchers found that ‘beta’ waves, with a frequency of 12Hz, carried information about the eyes, while ‘theta’ waves at 4Hz encoded information about the mouth. They also found that this information was carried chiefly by the phase, or timing, of the brainwave, and less so by its amplitude, or strength.
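The distinction between a brainwave’s phase and its amplitude can be illustrated with a short signal-processing sketch. This is not the study’s analysis pipeline; it is a minimal, hypothetical example using a synthetic signal, an assumed 256Hz sampling rate, and an illustrative 10–14Hz band-pass around the 12Hz ‘beta’ component, showing how the two quantities are separated with a Hilbert transform.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative parameters (assumptions, not the study's settings).
fs = 256                      # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)   # two seconds of signal

# Synthetic EEG-like signal: a 12 Hz ("beta") oscillation, a weaker
# 4 Hz ("theta") oscillation, and a little noise.
rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * 12 * t)
          + 0.5 * np.sin(2 * np.pi * 4 * t)
          + 0.1 * rng.standard_normal(t.size))

# Band-pass around 12 Hz, then take the analytic signal: its magnitude
# is the instantaneous amplitude ("strength") and its angle is the
# instantaneous phase ("timing") of the oscillation.
b, a = butter(4, [10, 14], btype="bandpass", fs=fs)
beta = filtfilt(b, a, signal)
analytic = hilbert(beta)
amplitude = np.abs(analytic)
phase = np.angle(analytic)

print("mean amplitude (mid-signal):", amplitude[fs // 2:3 * fs // 2].mean())
```

Unwrapping `phase` and differentiating it recovers the oscillation’s instantaneous frequency, which is one way to check that the phase estimate tracks the 12Hz component rather than the noise.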
The research ties in with an initiative unique to Glasgow, developed by Prof Philippe Schyns, Prof Joachim Gross and Dr Gregor Thut at the Centre for Cognitive Neuroimaging (CCNi). The initiative combines magnetoencephalography (MEG), transcranial magnetic stimulation (TMS) and statistical information mapping to understand how the brain’s oscillatory networks can be modelled and interacted with to enhance or suppress visual perception.
The research was funded by the Biotechnology and Biological Sciences Research Council (BBSRC), the Economic and Social Research Council (ESRC) and the Medical Research Council (MRC).