RoboPatient combines AR and robotics to train medics

A new research project at Imperial College London will use robotics and augmented reality to improve the training of medical students in using physical examinations to assess the condition of organs in the abdomen.

Doctors use a range of techniques to assess the condition of internal organs. Image: pressmaster via stock.adobe.com

The project, RoboPatient, will also build an archive of examination data suggesting which techniques are more likely to succeed in a given case.

GPs often use physical examinations, touching and probing the abdomen to detect conditions such as an enlarged liver, swollen intestines, and at a more advanced level, to detect possible tumours.

“Doctors use a range of methods, changing the amount of pressure they apply, the shape and configuration of their fingers, and using various tapping techniques, to arrive at a diagnosis,” said Dr Thrishantha Nanayakkara, reader in design engineering and robotics at the Dyson School of Design Engineering, Imperial College London.

These techniques are difficult to demonstrate to students, and opportunities to practise on real patients are limited.

For the RoboPatient project, which began this month (1 September), Dr Nanayakkara’s team have developed a prototype robotic patient in which silicone rubber, laid down in layers, realistically simulates soft tissue with various conditions such as swollen organs or hard nodules. Sensors embedded in the “organs” record the pressure applied during an examination and the time taken to reach a diagnosis. A finite element tissue model allows the sensor data to be displayed visually.

During the project, experts will initially examine the robot and diagnose the conditions, to provide a baseline of experience. The robot will then be presented to students to allow them to practise diagnosis.

An additional feature being developed will use augmented reality to simulate facial expressions to indicate when the patient feels pain, to encourage trainees to find techniques that minimise the patient’s discomfort.

The robot will record details of each examination, quantifying how quickly students arrive at a diagnosis and allowing comparison with their peers and between techniques.
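The kind of per-examination record and peer comparison described above could be sketched along these lines. This is purely illustrative: the field names, units, and structure are assumptions, not the project's actual data schema.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Examination:
    """One recorded examination session (hypothetical fields)."""
    student: str
    technique: str            # e.g. "percussion", "deep palpation"
    peak_pressure_kpa: float  # peak pressure from the embedded sensors
    time_to_diagnosis_s: float
    correct: bool

@dataclass
class ExamLog:
    """Accumulates examinations and supports peer comparison."""
    records: list = field(default_factory=list)

    def add(self, exam: Examination) -> None:
        self.records.append(exam)

    def mean_time(self, student: str) -> float:
        # Average time to diagnosis for one student
        return mean(r.time_to_diagnosis_s for r in self.records
                    if r.student == student)

    def cohort_mean_time(self) -> float:
        # Average across all recorded examinations, for peer comparison
        return mean(r.time_to_diagnosis_s for r in self.records)
```

A student's `mean_time` can then be compared against `cohort_mean_time`, or the log filtered by `technique` to compare methods.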

“We think this will make training more robust and facilitate students to quickly adapt to real patients,” said Dr Nanayakkara.

Over time, the accumulated data will allow the creation of a probabilistic model indicating which techniques are most likely to succeed in a given situation.
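One simple form such a probabilistic model could take is a Beta-Bernoulli estimate of each technique's success rate per condition. This is a sketch under that assumption; the project has not published the model it will actually use.

```python
from collections import defaultdict

class TechniqueModel:
    """Beta-Bernoulli success-probability estimate per (condition, technique)."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        # Uniform Beta(1, 1) prior over success probability
        self.alpha, self.beta = alpha, beta
        # (condition, technique) -> [successes, failures]
        self.counts = defaultdict(lambda: [0, 0])

    def update(self, condition: str, technique: str, success: bool) -> None:
        self.counts[(condition, technique)][0 if success else 1] += 1

    def p_success(self, condition: str, technique: str) -> float:
        # Posterior mean of the Beta distribution
        s, f = self.counts[(condition, technique)]
        return (s + self.alpha) / (s + f + self.alpha + self.beta)

    def best_technique(self, condition: str, techniques: list) -> str:
        return max(techniques, key=lambda t: self.p_success(condition, t))
```

As examination outcomes accumulate via `update`, `best_technique` would increasingly favour whichever method has the best track record for a given condition, while the prior keeps estimates sensible when data are sparse.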

An initial version of the training module is expected to be in use in a clinical environment by the end of next year, allowing five or six iterations to refine the module by the end of the project in 2020.