Cornell University and the University of Pennsylvania develop a modular robot system that can perceive its surroundings and autonomously assume different shapes suited to the task it has been given
However useful a robot may be, it has inherent limitations: it can be cumbersome, and is often able to accomplish only a single type of task. A team at Cornell University in New York state has led a project to develop a modular robot – composed of several interchangeable parts – that can perceive its surroundings and reconfigure itself into the form best suited to its task.
The team, from the Sibley School of Mechanical and Aerospace Engineering and led by principal investigator Hadas Kress-Gazit, describes its work in the current edition of the journal Science Robotics. “This is the first time modular robots have been demonstrated with autonomous reconfiguration and behavior that is perception-driven,” commented Kress-Gazit.
The system, called SMORES-EP (Self-assembling MOdular Robot for Extreme Shapeshifting; the EP refers to the electro-permanent magnets the modules use to connect), is composed of wheeled, cube-shaped modules equipped with Wi-Fi to communicate with each other. The modules were developed by researchers at the University of Pennsylvania. Crucial to the system is a sensor module equipped with multiple cameras and a small computer for collecting and processing data about its surroundings. This module runs high-level planning software and perception algorithms that can map, navigate and classify the environment.
The system is organised around a library of different configurations, which was populated at an early stage in the research using design competitions among Cornell students to invent and test different shapes. This library now consists of 57 possible robot combinations, such as Proboscis (with a long arm in front), Scorpion (modules arranged in perpendicular lines, with a horizontal row in front) and Snake (modules in a single line), and 97 behaviours, such as pickup, high-reach, drive or drop. Once the robot is given a task, its high-level planner searches the library for shapes and behaviours that meet the current needs.
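The idea of a planner querying a library of shapes and behaviours can be sketched in a few lines of Python. This is a hypothetical illustration, not the SMORES-EP codebase: the `Behaviour` class, the capability sets and the `plan` function are all invented for the example, and the real system matches behaviours against a perceived environment model rather than simple tags.

```python
# Hypothetical sketch of library-based planning for a modular robot.
# All names and capability tags are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Behaviour:
    name: str                 # e.g. "pickup", "high-reach", "drive", "drop"
    configuration: str        # robot shape required to run this behaviour
    capabilities: frozenset   # task/environment properties it can handle

# A tiny stand-in for the 57-configuration, 97-behaviour library.
LIBRARY = [
    Behaviour("pickup", "Proboscis", frozenset({"narrow_gap", "grasp"})),
    Behaviour("high-reach", "Snake", frozenset({"tall_obstacle", "place"})),
    Behaviour("drive", "Car", frozenset({"flat_terrain", "move"})),
]

def plan(required: set) -> Optional[Behaviour]:
    """Return the first library entry whose capabilities cover the task."""
    for behaviour in LIBRARY:
        if required <= behaviour.capabilities:  # subset test
            return behaviour
    return None

# A pink object down a narrow pathway requires grasping in a narrow gap,
# so the planner would select the Proboscis configuration.
chosen = plan({"narrow_gap", "grasp"})
```

In this toy version the planner simply returns the first match; the appeal of the library approach is that the task specifies *what* is needed, and the shape that satisfies it is looked up rather than prescribed.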
“I want to tell the robot what it should be doing, what its goals are, but not how it should be doing it,” said Kress-Gazit. “I don’t actually prescribe, ‘Move to the left, change your shape.’ All these decisions are made autonomously by the robot.”
In the Science Robotics paper, the team describes three experiments to test the system. In the first, the robot was instructed to find, retrieve and deliver all pink and green objects to a designated zone marked by a blue square on the wall. It used the ‘Car’ configuration to explore its surroundings, reshaped itself into ‘Proboscis’ to retrieve a pink object from a narrow pathway, then changed back to ‘Car’ to deliver it. In the second task, the robot had to place a circuit board in a post box marked with pink tape at the top of a flight of stairs; and in the third, it had to place a postage stamp on top of a box.
Researchers found the hardware and low-level software were most prone to error. The second experiment, for instance, took 24 attempts before succeeding, with the stairs posing a particular challenge.
Kress-Gazit believes that modular robots could be particularly useful for jobs on challenging terrain, such as cleaning up after an earthquake or other natural disaster, where a robot might need to enter narrow cracks and crevices.
“Modular robots in general are just fascinating systems, because you’re not restricted by one shape, so there’s a lot of flexibility,” she said. “The hardware is still in research stages, but if we had commercial modular robots they would be very useful for anything where the environment changes significantly and the robot should adapt to its environment as well.”