A team of Massachusetts researchers has developed an application that allows users to navigate smartphone screens using Google Glass and head movements.
The app, developed primarily for visually impaired users, projects a magnified image of the smartphone screen onto the Google Glass display. Images are sent over Bluetooth, and the user interacts with them by tapping on the stem of the Glass device.
Rather than continually zooming in and out by pinching and swiping on a smartphone, users move their heads to examine different areas of the screen. According to the researchers, many people with low vision find traditional tactile zoom functions on smartphones difficult to use because they result in a loss of context.
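In principle, this kind of head-controlled panning maps head rotation to the position of a magnified viewport over the phone screen. The sketch below illustrates one plausible version of that mapping; the function name, the `gain` parameter, and all values are hypothetical, and the paper's actual algorithm is not reproduced here:

```python
def head_to_viewport(yaw_deg, pitch_deg, screen_w, screen_h, zoom, gain=40.0):
    """Map head rotation (degrees from a neutral pose) to a magnified
    viewport over the phone screen.

    Hypothetical sketch: `gain` converts degrees of head rotation into
    pixels of pan, and the viewport is clamped to the screen bounds so
    the user never pans off the edge of the content.
    """
    view_w = screen_w / zoom  # visible width at this zoom level
    view_h = screen_h / zoom  # visible height at this zoom level
    # The viewport centre starts at the screen centre, then pans with the head.
    cx = screen_w / 2 + yaw_deg * gain
    cy = screen_h / 2 + pitch_deg * gain
    # Clamp the top-left corner so the viewport stays fully on screen.
    x = min(max(cx - view_w / 2, 0), screen_w - view_w)
    y = min(max(cy - view_h / 2, 0), screen_h - view_h)
    return x, y, view_w, view_h
```

With a neutral head pose the viewport sits at the screen centre, and turning the head pans it smoothly, which is what preserves the sense of context that pinch-zooming loses.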
“When people with low visual acuity zoom in on their smartphones, they see only a small portion of the screen, and it’s difficult for them to navigate around – they don’t know whether the current position is in the centre of the screen or in the corner of the screen,” said senior author of the study Gang Luo, associate scientist at the Schepens Eye Research Institute and an associate professor of ophthalmology at Harvard Medical School.
To demonstrate the system’s effectiveness, the researchers conducted a controlled study in which one group used the new application and another used the smartphone’s built-in touchscreen zoom. Across a series of tasks, head-based navigation reduced the average completion time by about 28 per cent. The results of the study are published in IEEE Transactions on Neural Systems and Rehabilitation Engineering.
For the next phase of the project, the team wants to incorporate additional gestures to provide more advanced functionality. There are also plans to compare the system’s performance against other smartphone accessibility features such as voice-based navigation.
“Given the current heightened interest in smart glasses, such as Microsoft’s Hololens and Epson’s Moverio, it is conceivable to think of a smart glass working independently without requiring a paired mobile device in near future,” said lead author Shrinivas Pundlik. “The concept of head-controlled screen navigation can be useful in such glasses even for people who are not visually impaired.”