Brain-computer interface plays music based on person's mood
Scientists are developing a brain-computer interface (BCI) that recognises a person’s affective state and plays music to them based on their mood.
The duo from the universities of Reading and Plymouth believe the system could be used as a therapeutic aid for people suffering from certain forms of depression.
Dr Slawomir Nasuto, project leader at Reading University, told The Engineer: ‘When we perform some cognitive functions our brain generates lots of electrical activity, which can be recognised as fluctuations of tiny electrical potentials using non-invasive techniques.
‘If you can record these fluctuations and recognise what kind of activity is going on, a control command for a computer… could be provided.’
Traditionally, the user has had complete control over how a BCI system responds.
Nasuto said: ‘In our case, we are not asking the subject to be happy or sad. We want to recognise the subject’s state so we can provide the right stimulus. The subject is not in control and this is a very unique feature.’
He added that the project would use an electroencephalograph (EEG) to transfer the electrical signal from the patient’s scalp via a series of wires to an amplifier box, which, in turn, would be connected to a computer.
The computer would then generate its own synthetic music based on the user’s mental state.
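The article does not detail how the mental state would be inferred from the EEG signal, but one common approach in affective BCI research is to compare alpha-band power between frontal electrodes. The sketch below illustrates that general idea only; the function names, sampling rate and the simple asymmetry index are assumptions for illustration, not the Reading/Plymouth team's actual method.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Total power of a 1-D signal in the [low, high] Hz band via FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum()

def estimate_valence(left_frontal, right_frontal, fs=256):
    """Crude valence estimate from frontal alpha (8-12 Hz) asymmetry.

    Relatively lower left-frontal alpha power is often read in the
    literature as more positive affect. Returns a value in [-1, 1];
    this index is a simplification for illustration.
    """
    left = band_power(left_frontal, fs, 8, 12)
    right = band_power(right_frontal, fs, 8, 12)
    return (right - left) / (right + left)
```

In practice the amplified scalp signals would be heavily filtered and artefact-corrected before any such estimate is made; this sketch skips those steps.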
Plymouth University’s Eduardo Miranda, a professor in computer music, said: ‘We have developed a number of rule-based approaches to generate music with computers.
‘We will use computer software to try to identify rules governing musical patterns that produce certain emotions. Then we would embed these rules into the system to generate the music,’ he explained.
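A rule-based generator of the kind Miranda describes might, at its simplest, map a detected mood to a key, mode and tempo and draw notes from the corresponding scale. The rule table and function below are a hypothetical sketch of that idea, using the familiar major-happy/minor-sad convention; they are not the project's actual rules.

```python
import random

# Hypothetical rule table: coarse mood -> scale degrees (semitones above the
# root) and tempo, echoing the major~happy / minor~sad convention.
MOOD_RULES = {
    "happy": {"scale": [0, 2, 4, 5, 7, 9, 11], "tempo_bpm": 120},  # C major
    "sad":   {"scale": [0, 2, 3, 5, 7, 8, 10], "tempo_bpm": 70},   # C minor
}

def generate_phrase(mood, length=8, root=60, seed=None):
    """Return (MIDI pitches, tempo) for a short piano phrase in the mood's key.

    `root=60` is middle C; pitches are chosen at random from the scale,
    which is the crudest possible stand-in for real compositional rules.
    """
    rule = MOOD_RULES[mood]
    rng = random.Random(seed)
    pitches = [root + rng.choice(rule["scale"]) for _ in range(length)]
    return pitches, rule["tempo_bpm"]
```

A real system would encode far richer rules (melodic contour, harmony, rhythm), but the structure, a lookup from affective state to musical parameters, would be similar.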
The pair said they would attempt to tailor the system to an individual's musical taste, but that the music produced would sound like it comes from a piano.
‘We expect this customisation to be one of the main challenges because people have so many different musical tastes and what makes one person excited might make another very bored,’ said Nasuto.
Steve Levine, a record producer, owner of Magnum Opus and chairman of The Music Producers Guild, believes the idea could be used to treat certain forms of depression.
He said: ‘We know that 120 beats per minute is a particularly important tempo, not least because it relates to the tempo of the heart and can therefore have a profound effect. We also know that the key of “C” has a certain effect on people because it’s even tempered.’
Levine explained that a minor key evokes sadness whereas a major key evokes happiness and that musicians have played on this fact for hundreds of years.
The four-year, EPSRC-funded project is set to start early next year, and Nasuto said that several clinicians have already expressed an interest in testing the system.