Hand in glove

Researchers from the Virtual Reality Lab at the University at Buffalo (UB) have developed a new tool for transmitting physical touch to the virtual world.

Their ‘virtual clay sculpting system’ enables users to replicate on a personal computer, in real time, the physical act of sculpting a block of clay or other malleable material. The resulting 3D electronic shape shown on screen can then be fine-tuned using standard CAD software.

“This technology will give product designers, or even artists, a tool that will allow them to touch, shape and manipulate virtual objects just as they would with actual clay models or sculptures,” says Thenkurussi Kesavadas, director of the UB Virtual Reality Lab and associate professor of mechanical and aerospace engineering in the UB School of Engineering and Applied Sciences.

Kesavadas developed the tool with Ameya Kamerkar, a graduate student in the UB Department of Mechanical and Aerospace Engineering. The UB Office of Science Technology Transfer and Economic Outreach has applied for a provisional patent on the technology.

The technology uses a ‘ModelGlove’, developed by the researchers, to record the force exerted by the hand as it depresses and shapes a block of clay. This force information, along with data on hand position and the speed of fingertip motion, is communicated instantaneously to a PC, where a virtual block of clay – possessing characteristics that mimic the physical properties of the real material – is shaped precisely to match the contouring of the actual block.
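To make that data flow concrete, the sketch below is an illustrative mock-up rather than the UB implementation: a single glove sample carrying fingertip force, position and speed is applied as a local depression to a voxel grid standing in for the virtual clay. The class names, grid resolution and deformation law are assumptions made only for this example.

```python
# Minimal sketch (not the UB system) of a glove sample deforming virtual clay.
from dataclasses import dataclass
import numpy as np

@dataclass
class GloveSample:
    position: np.ndarray   # fingertip position in clay coordinates (x, y, z)
    force: float           # normal force from the fingertip sensor
    speed: float           # fingertip speed; could modulate smoothing

class VirtualClay:
    """Voxel-occupancy stand-in for the block of virtual clay."""

    def __init__(self, size=64, stiffness=0.5):
        self.voxels = np.ones((size, size, size))   # 1.0 = solid clay
        self.stiffness = stiffness                  # mimics material properties

    def apply(self, sample: GloveSample, tool_radius=3.0):
        """Depress the clay around the fingertip, deeper for larger forces."""
        idx = np.indices(self.voxels.shape).transpose(1, 2, 3, 0)
        dist = np.linalg.norm(idx - sample.position, axis=-1)
        # Assumed deformation law: removal falls off with distance from the
        # tool tip and scales with force divided by material stiffness.
        depth = (sample.force / self.stiffness) * np.exp(-(dist / tool_radius) ** 2)
        self.voxels = np.clip(self.voxels - 0.01 * depth, 0.0, 1.0)

# Example: one sample streamed from the glove dents the top of the block.
clay = VirtualClay()
clay.apply(GloveSample(position=np.array([32.0, 32.0, 63.0]), force=4.0, speed=12.0))
```

In a real system such samples would stream continuously from the glove, each one nudging the model so that the on-screen clay tracks the physical block.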

In tests conducted in the UB Virtual Reality Lab, the researchers have used the technology to sculpt and then design a prototype car hood.

The technology improves upon existing freeform NURBS (non-uniform rational B-spline) surface-modeling techniques, the researchers say, because it is the only technology capable of transferring touch directly from the user’s hand to the virtual object. Other technologies on the market require users to shape a virtual object with a mouse and keyboard, clicking on selected points of the object and then inputting data to change its shape.
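For contrast, the sketch below shows that point-and-edit workflow using a plain B-spline curve (the non-rational special case of a NURBS surface) with a hand-rolled Cox-de Boor evaluator; the control points and knot vector are illustrative assumptions. The shape changes only when the user selects a control point and types in new coordinates.

```python
# Minimal sketch of control-point editing, the workflow the glove replaces.
def basis(i, k, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree k."""
    if k == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k] != knots[i]:
        left = (u - knots[i]) / (knots[i + k] - knots[i]) * basis(i, k - 1, u, knots)
    if knots[i + k + 1] != knots[i + 1]:
        right = (knots[i + k + 1] - u) / (knots[i + k + 1] - knots[i + 1]) * basis(i + 1, k - 1, u, knots)
    return left + right

def curve_point(u, ctrl, degree, knots):
    """Evaluate the curve at parameter u as a weighted sum of control points."""
    return tuple(
        sum(basis(i, degree, u, knots) * p[d] for i, p in enumerate(ctrl))
        for d in range(2)
    )

degree = 2
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 2.0), (3.0, 0.0)]   # control polygon
knots = [0, 0, 0, 0.5, 1, 1, 1]                           # clamped knot vector

before = curve_point(0.5, ctrl, degree, knots)
ctrl[1] = (1.0, 3.0)          # the "click on a point and type in new data" step
after = curve_point(0.5, ctrl, degree, knots)
print(before, after)          # the shape changes only through numeric edits
```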

One such haptic (touch-based) modeling system, FreeForm, which uses the Phantom haptic device, is expensive and limited in its ability to provide multiple points of contact on its sensing tool, making it tedious and non-intuitive to use, Kesavadas says.

“Our technology is far more intuitive than click-and-drag virtual prototyping tools currently in use,” he explains. “The most natural tool for a designer is his or her hand.”

Currently the ModelGlove is equipped with a single touch sensor on the tip of the index finger. On the computer display, the user’s finger is represented as one of three virtual tools: a sharp tool for making small, deep holes; a medium-sized tool for gouging or moulding the clay; and a large-diameter tool for rough shaping of surfaces.
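As an illustration of how one sensor can drive several tools, the following sketch (again an assumption-laden mock-up, not the UB code) represents each virtual tool purely by its footprint and applies the same fingertip force through whichever tool is selected; the tool names and radii are invented for the example.

```python
# Minimal sketch: one fingertip force, three interchangeable tool footprints.
import numpy as np

TOOLS = {
    "sharp":  {"radius": 1.0},   # small, deep holes
    "medium": {"radius": 4.0},   # gouging / moulding
    "large":  {"radius": 10.0},  # rough shaping of surfaces
}

def carve(height_map, x, y, force, tool="medium"):
    """Depress a 2.5D clay surface under the selected tool's footprint."""
    radius = TOOLS[tool]["radius"]
    ys, xs = np.indices(height_map.shape)
    dist2 = (xs - x) ** 2 + (ys - y) ** 2
    # Narrow tools concentrate the same fingertip force into a deeper dent.
    dent = force * np.exp(-dist2 / (2.0 * radius ** 2)) / radius
    return np.maximum(height_map - dent, 0.0)

clay = np.full((128, 128), 20.0)        # flat slab, 20 units thick
clay = carve(clay, x=64, y=64, force=5.0, tool="sharp")
clay = carve(clay, x=30, y=90, force=5.0, tool="large")
```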

The next generation of the ModelGlove will have sensors on all fingers and on the palm of the hand to give users full finger control of virtual clay. This will enable users to perform complex touch actions – such as kneading the ball of clay – in the virtual environment, according to Kesavadas.

Eventually, the UB researchers hope to develop an array of sculpting tools using the technology.