The Duke team combined a version of an existing robot arm with an ultrasound system of their own design. The ultrasound serves as the robot’s eyes, collecting scan data and locating the target.
The robot is controlled not by a physician but by an artificial-intelligence program that takes in real-time 3D ultrasound information, processes it and then issues specific commands for the robot to perform. The robot arm has a mechanical hand that can manipulate the same biopsy plunger device that physicians use to reach a lesion and take samples.
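The scan–process–command loop described above can be sketched as a simple closed control loop. The code below is a hypothetical illustration, not the Duke team's software: lesion detection is reduced to finding the brightest voxel in a 3D volume, and motion planning to one bounded grid step per cycle.

```python
def locate_target(volume):
    # Stand-in for lesion detection: coordinates of the brightest voxel
    # in a 3D scan volume (nested lists indexed [x][y][z]).
    best, best_idx = float("-inf"), (0, 0, 0)
    for x, plane in enumerate(volume):
        for y, row in enumerate(plane):
            for z, v in enumerate(row):
                if v > best:
                    best, best_idx = v, (x, y, z)
    return best_idx

def plan_move(tip, target, step=1):
    # One bounded grid step along each axis toward the target.
    return tuple(t + max(-step, min(step, g - t)) for t, g in zip(tip, target))

def run_biopsy_loop(volume, start, max_steps=100):
    # Sense-process-act loop: rescan, locate the lesion, command the arm,
    # and repeat until the plunger tip reaches the target.
    tip = start
    for _ in range(max_steps):
        target = locate_target(volume)   # the "eyes": ultrasound scan
        if tip == target:
            break                        # plunger positioned at lesion
        tip = plan_move(tip, target)     # command sent to the robot arm
    return tip
```

In a real system the loop rate would be limited by how fast new 3D volumes can be acquired and processed, which is exactly the bottleneck the researchers identify.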
The robot successfully guided the plunger to eight different locations in simulated prostate tissue in 93 per cent of its attempts. This is important because multiple samples can also determine the extent of any lesion, said Smith.
He believes that routine medical procedures, such as biopsies in other tissues in the body, will be performed in the future with minimal human guidance, and at greater convenience and less cost to patients.
An important remaining challenge is the speed of data acquisition and processing: to be clinically useful, all of the robot’s actions would need to happen in real time, according to the researchers. They are confident, however, that faster processors and better algorithms will address this issue.
‘One of the beauties of this system is that all of the hardware components are already on the market,’ said Smith. ‘We believe that this is the first step in showing that, with some modifications, systems such as this can be built without having to develop a new technology from scratch.’
‘We’re now testing the robot on a human mannequin seated at the examining table whose breast is constrained in a stiff bra cup,’ he added. ‘The breast is composed of turkey breast tissue with an embedded grape to simulate a lesion. Our next step is to move to an excised human breast.’