University of Washington (UW) engineers are developing what is claimed to be the first device to transmit American Sign Language (ASL) over mobile phone networks in the US.
The tool is currently undergoing its initial field test with participants in a UW summer programme for deaf and hard-of-hearing students.
‘This is the first study of how deaf people in the United States use mobile video phones,’ said project leader Eve Riskin, a UW professor of electrical engineering.
The MobileASL team has been working to optimise compressed video signals for sign language.
By increasing image quality around the face and hands, researchers have brought the data rate down to 30 kilobytes per second while still delivering intelligible sign language.
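The region-of-interest idea described above can be sketched in code. In the hypothetical snippet below, macroblocks covering the face and hands receive a lower quantisation parameter (finer detail, more bits) than the background, so most of the bit budget is spent where signing happens. The grid size, QP values and region coordinates are illustrative assumptions, not figures from the MobileASL project.

```python
def allocate_qp(blocks, roi, qp_roi=24, qp_bg=40):
    """Assign a quantisation parameter to each 16x16 macroblock.

    blocks -- list of (x, y) macroblock coordinates
    roi    -- set of (x, y) coordinates flagged as face/hand regions
    Lower QP = finer quantisation = more bits = higher quality.
    """
    return {b: (qp_roi if b in roi else qp_bg) for b in blocks}

# A QVGA frame (320x240) is a 20x15 grid of 16x16 macroblocks.
blocks = [(x, y) for y in range(15) for x in range(20)]

# Pretend a skin-detection pass has flagged a central region as face/hands.
roi = {(x, y) for x in range(7, 13) for y in range(3, 12)}

qp_map = allocate_qp(blocks, roi)
fine = sum(1 for qp in qp_map.values() if qp == 24)
print(f"{fine} of {len(blocks)} macroblocks encoded at high quality")
```

In a real encoder the ROI would be re-detected every frame and the QP map handed to the codec's rate controller; the point of the sketch is only the uneven bit allocation.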
MobileASL also uses motion detection to identify whether a person is signing or not, in order to extend the phone’s battery life during video use.
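A minimal sketch of that battery-saving scheme: compare successive frames and, when little changes (the user is not signing), drop to a low frame rate. The thresholds, frame rates and toy frame data below are invented for illustration and are not the project's own parameters.

```python
def is_signing(prev_frame, frame, threshold=0.05):
    """Return True if enough pixels changed to suggest active signing.

    Frames are flat sequences of 8-bit luma values; a pixel counts as
    "changed" if it differs from the previous frame by more than 15 levels.
    """
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > 15)
    return changed / len(frame) > threshold

def choose_frame_rate(prev_frame, frame, active_fps=10, idle_fps=1):
    """Encode at full rate only while the user appears to be signing."""
    return active_fps if is_signing(prev_frame, frame) else idle_fps

# Toy 100-pixel "frames": a static scene versus one with a moving hand.
still = [128] * 100
moving = [128] * 80 + [200] * 20   # 20% of pixels changed
print(choose_frame_rate(still, still))    # idle: low frame rate
print(choose_frame_rate(still, moving))   # active: full frame rate
```

Encoding and transmitting fewer frames while the user merely watches is where the power saving comes from, since the camera, codec and radio all do less work.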
Phones including the iPhone 4 and the HTC Evo offer video conferencing, but users are said to have encountered problems, with network operators blocking high-bandwidth video conferencing on their networks. Operators are also rolling out tiered pricing plans that would charge heavy data users more.
The UW team estimates that the iPhone’s FaceTime video conferencing service uses nearly 10 times the bandwidth of MobileASL. Even after the release of an iPhone app to transmit sign language, users would still need to own an iPhone 4 and be in an area with very fast network speeds to use the service. The MobileASL system, by contrast, could be integrated with the iPhone 4, the HTC Evo or any device that has a video camera on the same side as the screen.
Field testing began on 28 July and concludes later this week. In the first two and a half weeks of the study, some 200 calls were made with an average call duration of a minute and a half, researchers said. A larger field study will begin this winter.
Most study participants say texting or email is currently their preferred method for distance communication. However, text-based communication can lead to mix-ups or misinterpretation.
‘Sometimes with texting people will be confused about what it really means,’ said Tong Song, a Chinese student at Gallaudet University in Washington, DC. ‘With the MobileASL phone people can see each other eye to eye, face to face, and really have a better understanding.’
Some students already use video chat on a laptop, home computer or video phone terminal, but none of these existing technologies for transmitting sign language fits in a pocket.
‘We want to deliver affordable, reliable ASL on as many devices as possible,’ Riskin said. ‘It’s a question of equal access to mobile communication technology.’