RoCycle robot gets a feel for paper, metal and plastic

Researchers in the US have developed RoCycle, a robotic recycling machine that can differentiate between paper, metal and plastic.

RoCycle robotic recycling machine (Pic: MIT CSAIL)

The advance from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) includes a soft Teflon hand that uses tactile sensors on its fingertips to detect an object’s size and stiffness.

Said to be compatible with any robotic arm, RoCycle was 85 per cent accurate at detecting materials when stationary, and 63 per cent accurate on a simulated conveyor belt. Problems arose when the system misidentified paper-covered metal tins as paper, an issue the team said could be addressed by adding more sensors along the contact surface.

“Our robot’s sensorised skin provides haptic feedback that allows it to differentiate between a wide range of objects, from the rigid to the squishy,” said MIT professor Daniela Rus, senior author on a new paper about RoCycle that will be presented in April at the IEEE International Conference on Soft Robotics in Seoul, South Korea. “Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance.”

Developed in collaboration with Yale University, RoCycle is claimed to differentiate between identical-looking Starbucks cups made of paper and plastic that would otherwise be problematic for machine vision systems.

RoCycle’s motor-driven hand is made of an auxetic material that gets wider when stretched. The MIT team created auxetics that, when cut, twist to the left or right. Combining a ‘left-handed’ and a ‘right-handed’ auxetic for each of the hand’s two large fingers makes them interlock and oppose each other’s rotation, a property dubbed handed-shearing auxetics (HSA).

“In contrast to soft robots, whose fluid-driven approach requires air pumps and compressors, HSA combines twisting with extension, meaning that you’re able to use regular motors,” said Lillian Chin, a PhD student and lead author on the new paper.

According to CSAIL, the team’s gripper first uses its strain sensor to estimate an object’s size, and then uses its two pressure sensors to measure the force needed to grasp it. These metrics – along with calibration data on the sizes and stiffnesses of objects of different material types – are what give the gripper a sense of what material an object is made of.

“In other words, we estimate the size and measure the pressure difference between the current closed hand and what a normal open hand should look like,” said Chin. “We use this pressure difference and size to classify the specific object based on information about different objects that we’ve already measured.”
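The classification step Chin describes can be sketched as a simple nearest-neighbour comparison against calibrated reference measurements. This is only an illustrative approximation of the approach described in the article: the reference values, units and the `classify` function below are invented, as the team's actual calibration data and classifier are not published here.

```python
# Illustrative sketch: classify a grasped object by comparing its estimated
# size and grasp-pressure difference against calibrated reference points.
# All numbers are hypothetical placeholders, not the team's real data.
from math import hypot

# Hypothetical calibration data: (size_mm, pressure_diff_kPa) per material.
CALIBRATION = {
    "paper":   (70.0, 2.0),   # compliant object -> small pressure difference
    "plastic": (68.0, 8.0),
    "metal":   (66.0, 15.0),  # rigid object -> large pressure difference
}

def classify(size_mm: float, pressure_diff_kpa: float) -> str:
    """Return the material whose calibrated (size, pressure) point is nearest
    to the measured values, in plain Euclidean distance."""
    return min(
        CALIBRATION,
        key=lambda m: hypot(size_mm - CALIBRATION[m][0],
                            pressure_diff_kpa - CALIBRATION[m][1]),
    )

print(classify(69.0, 3.0))  # lands nearest the 'paper' reference point
```

In practice the measured pressure difference between the closed hand and the expected open-hand profile would come from the gripper's sensors; here it is simply passed in as a number.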

The tactile sensors are also conductive, so they can detect metal by how much it changes the electrical signal.
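That conductivity check can be folded into the decision as a simple gate ahead of the size-and-stiffness classification. The threshold, signal units and function below are assumptions for illustration only.

```python
# Illustrative sketch: use the change in the electrical signal from the
# conductive skin as a metal check, falling back to the size/stiffness-based
# class otherwise. Threshold and units are invented placeholders.
def detect_material(conductivity_change: float, stiffness_class: str,
                    threshold: float = 0.5) -> str:
    """Flag metal when the grasped object noticeably shifts the electrical
    signal; otherwise keep the class from the size/stiffness comparison."""
    if conductivity_change > threshold:
        return "metal"
    return stiffness_class

print(detect_material(0.8, "plastic"))  # signal shift exceeds threshold
print(detect_material(0.1, "paper"))    # no significant shift
```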

The researchers next plan to build out the system so that it can combine tactile data with video data from a robot’s cameras. This would allow them to further improve its accuracy and potentially allow for even more nuanced differentiation between different kinds of materials.

The project was supported in part by Amazon, JD, the Toyota Research Institute and the US National Science Foundation.
