A wearable projection system that enables users to turn pads of paper, walls or even their own hands into graphical, interactive surfaces has been developed.
The technology, dubbed ‘OmniTouch’, was developed by researchers at Microsoft Research and Carnegie Mellon University. It relies on a depth-sensing camera, similar to Microsoft’s Kinect, to track the user’s fingers on everyday surfaces.
The researchers say this allows users to control interactive applications by tapping or dragging their fingers, much as they would on the touch screens of smartphones or tablet computers.
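The touch detection described above can be pictured as a simple depth comparison. The following is an illustrative sketch only, not the researchers’ actual code: it assumes hypothetical depth readings in millimetres and a made-up tolerance, and treats a fingertip as ‘touching’ when its depth nearly matches that of the surface behind it.

```python
# Illustrative sketch (hypothetical values, not OmniTouch's real pipeline):
# a depth camera reports how far the fingertip and the surface behind it are.
# When the gap between them falls below a small tolerance, the finger is
# treated as touching the surface -- the equivalent of a tap on a touch screen.

TOUCH_THRESHOLD_MM = 10  # hypothetical tolerance; a real system would tune this


def is_touch(fingertip_depth_mm: float, surface_depth_mm: float,
             threshold_mm: float = TOUCH_THRESHOLD_MM) -> bool:
    """Return True when the fingertip is close enough to the surface to count as a tap."""
    gap = surface_depth_mm - fingertip_depth_mm  # finger sits in front of the surface
    return 0 <= gap <= threshold_mm


# A finger hovering 40 mm above the surface is not a touch...
print(is_touch(fingertip_depth_mm=560, surface_depth_mm=600))  # False
# ...but one within the tolerance registers as a tap.
print(is_touch(fingertip_depth_mm=595, surface_depth_mm=600))  # True
```

Dragging would then follow from tracking the fingertip’s position across frames while this touch condition holds.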
The projector is reported to be able to superimpose keyboards, keypads and other controls onto any surface, automatically adjusting for the surface’s shape and orientation to minimise distortion of the projected images.
‘It’s conceivable that anything you can do on today’s mobile devices, you will be able to do on your hand using OmniTouch,’ said Chris Harrison, a PhD student in Carnegie Mellon’s Human-Computer Interaction Institute.
The developers claim the palm of the hand could be used as a phone keypad, or as a tablet for jotting down brief notes. In addition, maps projected onto a wall could be panned and zoomed with the same finger motions that work with a conventional multi-touch screen.
The OmniTouch device includes a short-range depth camera and a laser pico-projector, and is mounted on the user’s shoulder. However, Harrison said the device could ultimately be the size of a deck of cards, or even a matchbox, so that it could fit in a pocket, be easily wearable, or be integrated into future handheld devices.
‘With OmniTouch, we wanted to capitalise on the tremendous surface area the real world provides,’ said Hrvoje Benko, a researcher in Microsoft Research’s Adaptive Systems and Interaction group.