Speech prosthetic turns thoughts into words

A speech prosthetic developed by engineers, neuroscientists, and neurosurgeons translates a person’s brain signals into what they are trying to say.

Compared with current speech prosthetics carrying 128 electrodes (left), the new Duke device accommodates twice as many sensors in a significantly smaller footprint - Dan Vahaba/Duke University

This is the claim of a team at Duke University in the US who believe the device could eventually help people unable to talk due to neurological disorders regain the ability to communicate through a brain-computer interface. The work is detailed in Nature Communications.

“There are many patients who suffer from debilitating motor disorders, like ALS [amyotrophic lateral sclerosis] or locked-in syndrome, that can impair their ability to speak,” said Gregory Cogan, PhD, a professor of neurology at Duke University’s School of Medicine and one of the lead researchers involved in the project. “But the current tools available to allow them to communicate are generally very slow and cumbersome.”

Current neuroprosthetics decode speech at about 78 words per minute, but people tend to speak at around 150 words per minute.

The lag between spoken and decoded speech rates is partly due to the relatively few brain activity sensors that can be fused onto a paper-thin piece of material that lies on the surface of the brain. Fewer sensors provide less decipherable information to decode.
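To illustrate that intuition only, and not the Duke team's actual decoding pipeline, the toy Python sketch below simulates noisy multi-channel recordings and shows classification accuracy rising as more channels contribute weak, independent signal. All names, numbers, and the choice of a simple logistic-regression classifier are assumptions made for illustration.

# Toy illustration (not the Duke team's method): decoding accuracy vs. channel count.
# Assumes numpy and scikit-learn are installed; all quantities are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_classes = 400, 4                       # hypothetical trials and speech-sound classes
labels = rng.integers(0, n_classes, n_trials)      # true class of each trial

def simulate_channels(n_channels, signal_strength=0.3):
    """Generate noisy features where each channel carries a weak class-dependent signal."""
    class_means = rng.normal(0, signal_strength, (n_classes, n_channels))
    noise = rng.normal(0, 1.0, (n_trials, n_channels))
    return class_means[labels] + noise

for n_channels in (16, 64, 128, 256):
    X = simulate_channels(n_channels)
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5).mean()
    print(f"{n_channels:4d} channels -> mean decoding accuracy ~ {acc:.2f}")

Running the sketch shows accuracy climbing as the channel count grows, which is the general point the researchers make: packing more sensors into the same cortical footprint gives the decoder more information to work with.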
