Joy without joysticks

University of Washington researchers have developed software that allows users to control a computer using their voice.

The so-called "Vocal Joystick" software detects sounds 100 times a second and instantaneously turns them into movement on the screen. Different vowel sounds dictate the direction: "ah," "ee," "aw," "oo" and other vowel sounds move the cursor in one of eight directions. Users can transition smoothly from one vowel to another, and louder sounds make the cursor move faster. The sounds "k" and "ch" simulate clicking and releasing the mouse buttons.
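The control scheme described above can be sketched in a few lines of code. This is an illustrative sketch, not the UW implementation: each audio frame (sampled roughly 100 times a second) carries a recognised sound and a loudness value, the vowel picks one of eight compass directions, loudness scales the step size, and "k"/"ch" press and release the mouse button. The article names only "ah," "ee," "aw" and "oo," so the specific vowel-to-direction assignments below are assumptions.

```python
import math

# Eight compass directions as unit vectors. Only "ee", "ah", "aw" and
# "oo" are named in the article; which vowel maps to which direction
# (and the four diagonal vowels) are illustrative assumptions.
D = 1.0 / math.sqrt(2.0)
DIRECTIONS = {
    "ee": (1.0, 0.0),   # east (assumed)
    "ah": (-1.0, 0.0),  # west (assumed)
    "aw": (0.0, 1.0),   # north (assumed)
    "oo": (0.0, -1.0),  # south (assumed)
    "ay": (D, D),       # north-east (hypothetical vowel label)
    "eh": (-D, D),      # north-west (hypothetical vowel label)
    "oh": (D, -D),      # south-east (hypothetical vowel label)
    "uh": (-D, -D),     # south-west (hypothetical vowel label)
}

class VocalJoystick:
    """Toy cursor controller driven by per-frame sound classifications."""

    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.button_down = False

    def process_frame(self, sound, loudness):
        """Handle one audio frame: a recognised sound plus its loudness."""
        if sound == "k":          # "k" simulates pressing the button
            self.button_down = True
        elif sound == "ch":       # "ch" simulates releasing it
            self.button_down = False
        elif sound in DIRECTIONS:
            dx, dy = DIRECTIONS[sound]
            # Louder sounds move the cursor faster.
            self.x += dx * loudness
            self.y += dy * loudness
```

Because vowels can glide smoothly into one another, consecutive frames with different vowels naturally produce curved cursor paths rather than only axis-aligned moves.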

Versions of Vocal Joystick exist for browsing the Web, drawing on a screen, controlling a cursor and playing a video game. A version also exists for operating a robotic arm. Jeffrey Bilmes, a UW associate professor of electrical engineering, also believes the technology could be used to control an electronic wheelchair.

Existing substitutes for the handheld mouse include eye trackers, sip-and-puff devices and head-tracking systems. Each technology has drawbacks. Eye-tracking devices are expensive and require that the eye simultaneously take in information and control the cursor, which can cause confusion. Sip-and-puff joysticks held in the mouth must be spit out if the user wants to speak, and can be tiring. Head-tracking devices require neck movement and expensive hardware.
