Camera ensures control of robotic nurse is all in hand


Purdue University is pioneering the development of a surgical system that responds to a surgeon’s hand gestures to control a robotic scrub nurse or command the display of images during an operation. 

According to Juan Pablo Wachs, assistant professor of industrial engineering at the university, the hand-gesture-recognition and robotic-nurse innovations might help to reduce the length of surgeries and the potential for infection.

Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table to touch a keyboard and mouse can delay the procedure and increase the risk of spreading infection-causing bacteria.

The new system uses Microsoft's Kinect camera and specialised algorithms to recognise hand gestures as commands for a computer or robot.
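At its simplest, such a system maps each recognised gesture to an operating-room command. The sketch below illustrates the idea only; the gesture labels and command names are hypothetical, not details of the Purdue system.

```python
# Hypothetical mapping from recognised gesture labels to display commands.
# Labels and commands are illustrative; the actual vocabulary would be
# designed with surgeons, as the article describes.
GESTURE_COMMANDS = {
    "swipe_left": "previous_image",
    "swipe_right": "next_image",
    "open_palm": "pause_display",
    "pinch": "zoom_in",
}

def interpret(gesture: str) -> str:
    """Translate a recognised gesture label into a display command.

    Unrecognised gestures are deliberately ignored rather than guessed at,
    since a wrong command mid-surgery is worse than no command.
    """
    return GESTURE_COMMANDS.get(gesture, "ignore")
```

A recogniser running on the Kinect's depth stream would emit these labels frame by frame; the mapping layer keeps the gesture vocabulary easy to adapt to individual surgeons' preferences.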

At the same time, a robotic scrub nurse represents a potential tool that might improve operating-room efficiency, Wachs said.

The research into hand-gesture recognition began several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where Wachs was a research fellow and doctoral student.

He is now working to extend the system’s capabilities in research with Purdue’s School of Veterinary Medicine and the Department of Speech, Language and Hearing Sciences.

‘One challenge will be to develop the proper shapes of hand poses and the proper hand-trajectory movements to reflect and express certain medical functions,’ said Wachs. ‘You want to use intuitive and natural gestures for the surgeon, to express medical image-navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use.’

Other challenges include giving computers the ability to understand the context in which gestures are made and to discriminate between intended and unintended gestures.
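One common way to filter out unintended motion (an assumption on my part, not a description of Wachs's method) is to require the recogniser to be confident about a gesture for a sustained run of frames before acting on it:

```python
def is_intended(confidences, threshold=0.8, min_frames=10):
    """Treat a gesture as intended only if recognition confidence stays
    above `threshold` for `min_frames` consecutive frames.

    Brief, incidental hand movements produce short or low-confidence
    runs and are rejected; a deliberately held gesture passes.
    Parameters are illustrative defaults, not values from the article.
    """
    run = 0
    for c in confidences:
        run = run + 1 if c >= threshold else 0
        if run >= min_frames:
            return True
    return False
```

A dwell requirement like this trades a small amount of latency for far fewer accidental commands, which matters when a stray movement could change the image a surgeon is working from.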

Wachs is developing advanced algorithms that isolate the hands using anthropometry: predicting where the hands are likely to be from knowledge of where the surgeon's head is. The tracking is done with a camera mounted above the screen used to display the images.
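The anthropometric idea can be sketched as constraining the hand search to a region the arms can plausibly reach, given the detected head position. This is a minimal illustration of the principle; the coordinates, reach value, and rectangular region are assumptions, not the Purdue algorithm.

```python
def hand_search_region(head_xy, arm_reach_px=220):
    """Given the head position in image coordinates (origin top-left),
    return a bounding box (left, top, right, bottom) below and around
    the head where the hands are plausibly located.

    `arm_reach_px` is a hypothetical reach in pixels; a real system
    would derive it from depth data and body proportions.
    """
    hx, hy = head_xy
    return (hx - arm_reach_px, hy,
            hx + arm_reach_px, hy + 2 * arm_reach_px)

def in_region(point, box):
    """Check whether a candidate hand detection falls inside the box."""
    x, y = point
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom
```

Detections outside the region (for example, another person's hands at the edge of the frame) can then be discarded before gesture classification, which both speeds up processing and reduces false commands.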

Advances in robotics are helping surgeons perform a variety of increasingly complex procedures.