Motion sensors to aid stroke victim treatment

Many stroke survivors who have lost the ability to communicate verbally and physically could re-learn the subtle gestures people use to indicate words, with the help of motion-sensing technologies such as those used in the Wii console.

This is the concept behind a £300,000 EPSRC-backed research programme involving experts in human-computer interaction design (HCID) and language communication at City University London. The ultimate goal is to ease the rehabilitation of people with aphasia, a language impairment, commonly caused by stroke, that affects around 250,000 people in the UK.

The 18-month project, known as Gesture Recognition in Aphasia Therapy (GReAT), is also receiving support from The Stroke Association.

Dr Julia Galliers, a lecturer in HCID at City University London, said patients with aphasia have traditionally had only one option for learning gestures that are readily interpreted by others: costly and time-intensive one-on-one sessions with therapists.

Galliers said an example of such a gesture may be looking at your wrist to indicate time.

‘That kind of link is lost with people who have aphasia,’ she said, adding that re-learning these gestures can be more complicated for patients with stroke-related disabilities such as one-sided paralysis and cognitive impairment.

The project will create a prototype system that enables users to practise gesturing up to 30 words, receive instant visual or audible feedback and master the movements through repetition.

Galliers said the exact interface is not yet decided, but it will probably take a form similar to a laptop computer. Users' movements could be tracked with technology such as the motion-detection controllers used with the popular Wii games console.

These controllers use accelerometers to measure the direction and force of a user's movements along multiple axes, while an infrared link between the console's sensor bar and the remote senses where the controller is pointing.
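To make the idea concrete, the following is a minimal, purely illustrative Python sketch, not drawn from the GReAT project itself, of how a stream of 3-axis accelerometer samples from such a controller might be compared against a stored gesture template to give the kind of instant feedback the prototype aims for. All function names, data values and thresholds here are hypothetical.

```python
import math

# Illustrative sketch only: compares a recorded gesture attempt against a
# stored template for one target word, using 3-axis accelerometer samples
# of the kind a Wii-style controller provides.

def mean_distance(attempt, template):
    """Mean Euclidean distance between two sequences of (x, y, z) samples."""
    pairs = list(zip(attempt, template))
    return sum(math.dist(a, t) for a, t in pairs) / len(pairs)

def feedback(attempt, template, threshold=0.5):
    """Return a simple pass/retry message based on how close the attempt is."""
    score = mean_distance(attempt, template)
    return "Well done!" if score < threshold else "Close - try the movement again."

# Hypothetical template for a 'time' gesture (raising the wrist) and one attempt.
time_template = [(0.0, 0.1, 1.0), (0.2, 0.4, 0.9), (0.5, 0.8, 0.6), (0.7, 1.0, 0.3)]
attempt       = [(0.1, 0.1, 1.0), (0.2, 0.5, 0.8), (0.6, 0.7, 0.6), (0.8, 0.9, 0.4)]

print(feedback(attempt, time_template))
```

A real system would likely use more robust sequence matching and visual or audible cues rather than printed text, but the loop of capture, compare and feed back is the same idea described for the prototype.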

The group will also assess technologies such as Microsoft’s forthcoming Kinect for the Xbox 360, which will track users’ movements without the need for a hand-held controller.

Galliers said the entire system will be designed in consultation with five patients with aphasia to ensure its suitability. The GReAT programme will begin a pilot study with 10 aphasia patients in May next year, and a prototype will be ready for demonstration at the end of January 2012.