Dubbed Shape-It-Up, the tool interprets hand gestures, enabling designers to create and modify three-dimensional shapes using their hands as a ‘natural user interface’ instead of a keyboard and mouse.
The tool’s algorithms recognise a hand gesture, understand that the hand is interacting with a shape, and then modify that shape in response to the interaction. A Microsoft Kinect depth-sensing camera observes and interprets the hand movements and gestures.
The user can then create shapes on a computer by interacting with a virtual workspace while the shape is displayed on a large-screen monitor.
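The recognise–interact–modify loop described above can be sketched in code. This is a minimal illustration, not the Shape-It-Up implementation: the gesture (a pinch), the shape (a sphere), the thresholds, and all names are hypothetical, and real depth-camera input is replaced by hard-coded fingertip positions.

```python
import math
from dataclasses import dataclass

@dataclass
class Hand:
    thumb: tuple   # (x, y, z) fingertip position, as a depth camera might report
    index: tuple

@dataclass
class Sphere:
    center: tuple
    radius: float

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def classify_gesture(hand, pinch_threshold=0.03):
    # Step 1: recognise the gesture (here, just pinch vs. open hand).
    return 'pinch' if dist(hand.thumb, hand.index) < pinch_threshold else 'open'

def is_touching(hand, shape, margin=0.05):
    # Step 2: decide whether the hand is interacting with the shape.
    return dist(hand.index, shape.center) < shape.radius + margin

def apply_gesture(hand, shape):
    # Step 3: modify the shape in response to the interaction.
    if classify_gesture(hand) == 'pinch' and is_touching(hand, shape):
        shape.radius *= 0.9  # in this toy example, pinching shrinks the sphere
    return shape
```

In use, a hand pinching inside the sphere’s surface would shrink it on each frame; an open hand, or a pinch far from the shape, would leave it unchanged. A real system would run this loop per camera frame over full hand skeletons and meshes rather than single points.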
‘You create and modify shapes using hand gestures alone, no mouse or keyboard,’ said Karthik Ramani, Purdue University’s Donald W. Feddersen Professor of Mechanical Engineering. ‘By bringing hands into the virtual space with a single depth camera we are able to manipulate the 3D artefacts as if they actually exist.’
Researchers call the underlying technique shape–gesture–context interplay.
The tool is claimed to be an advance over a previous version that was limited to creating ‘rotationally symmetric’ objects: those that look the same from every angle around a central axis, such as a vase.
Ramani said the tool could have applications in areas including engineering design, games, architecture, and art, and could also serve the emerging ‘creative maker’ community.
In a statement Ramani said, ‘Our goal is to make the designer an integral part of the shape-modelling process during early design, which isn’t possible using current CAD tools.
‘The conventional tools have non-intuitive and cognitively onerous processes requiring extensive training. We conclusively demonstrate the modelling of a wide variety of asymmetric 3D shapes within a few seconds.
‘One can bend and deform them in various ways to explore new shapes by natural interactions. The effect is immediate.’
The creations can then be produced using a 3D printer.
The research findings have been published in a paper co-authored by Ramani, graduate students Vinayak and Sundar Murugappan and postdoctoral researcher HaiRong Liu.
The team will demonstrate the technology at the Maker Faire on May 18 & 19 at the San Mateo County Event Center, California.