Suit allows users to create music through movement

A UK team has developed a musical suit that allows users to create and manipulate sounds through the movement of their bodies.

The design builds on the team’s musical gloves, developed for and first demonstrated by singer Imogen Heap last year, but this version uses purpose-built technology to give her much greater control over the music she creates.

Heap unveiled the suit, which covers the hands, arms and upper torso and includes LED lights and haptic technology to provide feedback, at the TED Global 2012 conference in Edinburgh last weekend.

The suit was developed by Heap and a team of electronic, software and sound engineers, together with a fashion designer and artist, from Bristol University, the University of the West of England (UWE) and Queen Mary, University of London.

Team member and Bristol PhD student Seb Madgwick, who originally developed the suit’s sensors and algorithms for medical research, said the change from gloves to suit was like moving from a few controls to a whole production desk.

‘But I’d add that you can do things you can’t do on a desk, where you’ve only got two hands and can only control two dials at once,’ he told The Engineer.

‘Here with all the mapping and the toolbox you’ve got, you can be playing an instrument at the same time as coupling together many aspects of the production side of things and controlling it in real time.’

The suit uses sensors known as inertial measurement units (IMUs), which combine a gyroscope, an accelerometer and a magnetometer and are conventionally used in aircraft and spacecraft navigation. These map the exact position, orientation, movement and speed of the wearer’s body parts, in a similar way to motion-capture animation technology.
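The idea of fusing these sensors into a stable orientation estimate can be sketched with a simple complementary filter. This is an illustrative simplification, not the team’s actual algorithm: the gyroscope is integrated to track fast motion, while the pitch implied by the accelerometer’s gravity reading corrects the gyroscope’s long-term drift.

```python
import math

def complementary_pitch(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Fuse a gyroscope rate (rad/s) with an accelerometer reading to
    estimate pitch. `accel` is (ax, ay, az) in units of g: the gyro term
    tracks short-term motion, the accelerometer term removes drift."""
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

Run at, say, 100 Hz, a drifted estimate decays back toward the accelerometer’s gravity reference while still responding immediately to rotation.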

Microphones on the wrists capture sounds that can then be manipulated with different movements. Each movement corresponds to a production effect or an additional sound in the software’s toolbox, and movements can also control the volume and stereo position of each sound.
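A mapping like this typically normalises each sensor reading into a control range. The sketch below illustrates the principle with hypothetical ranges (hand height driving volume, wrist roll driving stereo pan); the article does not describe the suit’s actual mappings.

```python
def map_range(x, in_lo, in_hi, out_lo=0.0, out_hi=1.0):
    """Linearly rescale a sensor reading into a control range, clamping
    values that fall outside the expected input span."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mappings: hand height (metres) to volume, wrist roll
# (degrees) to stereo pan position.
volume = map_range(1.25, 0.5, 2.0)              # mid-height -> 0.5
pan = map_range(90.0, -90.0, 90.0, -1.0, 1.0)   # full roll right -> 1.0
```

Clamping matters in performance: a gesture that overshoots its calibrated range should pin the control at its limit rather than produce a wild value.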

Sensor signals run through a central processor on the user’s back, which is connected wirelessly to a nearby computer running several pieces of music production software that convert the movements into sounds in real time.
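The article does not say how these pieces of software talk to each other, but Open Sound Control (OSC) over UDP is a common choice for this kind of real-time link between a sensor processor and music software. A minimal encoder for a single-float OSC message might look like this (the `/filter/cutoff` address is purely illustrative):

```python
import struct

def osc_message(address, value):
    """Encode an OSC message carrying one 32-bit float: an address
    pattern, a ',f' type-tag string and a big-endian float payload,
    with each string null-terminated and padded to a 4-byte boundary."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# e.g. send osc_message("/filter/cutoff", 0.8) as one UDP datagram
```

Because each message is a small, self-describing datagram, dropped or reordered packets degrade gracefully, which suits a live performance better than a stream that must be replayed in order.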

Madgwick said the biggest challenge was integrating all the elements of the suit. ‘Hardware-wise, there was a distributed network of sensors and actuators for haptic feedback, then overall system integration of having so many pieces of software talking to each other.

‘A particular struggle was the wireless communications because we’re dealing with a performance so it needs robust transfer with low latency. You can’t have [Imogen] hitting a drum and then 100 milliseconds later the sound happening.’
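The 100-millisecond figure is a budget spread across the whole chain: the wireless hop, the software processing and the audio buffering each add delay. The audio buffer alone contributes a predictable share, as the rough illustration below shows (the suit’s actual buffer sizes and sample rate are not stated in the article).

```python
def buffer_latency_ms(frames, sample_rate_hz):
    """Delay added by one audio buffer of the given length."""
    return 1000.0 * frames / sample_rate_hz

# A 256-frame buffer at 48 kHz adds about 5.3 ms; a 4096-frame buffer
# adds about 85 ms, consuming most of a perceptible 100 ms budget
# before the wireless link has added anything.
```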

As well as extending the system, the team also replaced the original off-the-shelf gloves with custom-designed components, a more stable and efficient software system and more durable fabric.

‘Imogen had an idea and a concept of what she wanted to see but she’s not an engineer and even we didn’t know what was possible, so it was constantly evolving,’ said Madgwick. ‘A huge challenge was to come up with designs that could be flexible.’