Engineers at MIT have demonstrated an updated version of their robotic cheetah that can climb stairs and gallop across rough terrain without cameras or any other visual sensors.
Instead, the Labrador-sized Cheetah 3 robot “feels” its way through its surroundings, opening up potential applications for exploring disaster zones and other dangerous or inaccessible environments.
“Vision can be noisy, slightly inaccurate, and sometimes not available, and if you rely too much on vision, your robot has to be very accurate in position and eventually will be slow,” said the system’s designer, Prof Sangbae Kim. “We want the robot to rely more on tactile information. That way, it can handle unexpected obstacles while moving fast.”
Commenting on potential applications for the robot, Kim said: “Cheetah 3 is designed to do versatile tasks such as power plant inspection, which involves various terrain conditions including stairs, curbs, and obstacles on the ground.”
The robot owes its abilities to two new algorithms developed by Kim’s team: a contact detection algorithm and a model-predictive control algorithm.
The contact detection algorithm uses data from gyroscopes, accelerometers and the joint positions of the legs to help the robot determine the best time for a given leg to switch from swinging in the air to stepping on the ground. For example, if it steps on a light twig versus a hard, heavy rock, how it reacts – and whether it continues to carry through with a step, or pulls back and swings its leg instead – can make or break its balance.
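The decision described above can be sketched as a simple evidence-fusion rule. This is a hypothetical illustration, not MIT's implementation: the cue names, weights, and threshold are all invented for the example, standing in for the gyroscope, accelerometer, and joint-position data the article mentions.

```python
# Hypothetical sketch of a contact-detection decision. The robot fuses
# several noisy cues into one contact probability, then decides whether
# to commit to the step or pull the leg back. All names and weights here
# are invented for illustration.

def contact_probability(gait_phase_cue, accel_spike_cue, joint_force_cue,
                        w_gait=0.4, w_accel=0.3, w_force=0.3):
    """Fuse three normalized (0..1) cues into a contact probability.

    gait_phase_cue:  how far the leg is into its scheduled stance phase
    accel_spike_cue: strength of the impact signature in the accelerometer
    joint_force_cue: ground force inferred from joint positions/torques
    """
    return (w_gait * gait_phase_cue
            + w_accel * accel_spike_cue
            + w_force * joint_force_cue)

def should_commit_to_step(p_contact, threshold=0.6):
    # Commit to stance (start applying ground forces) only when the
    # evidence is strong; otherwise keep the leg swinging.
    return p_contact >= threshold

# A hard, heavy rock underfoot: strong impact and force cues -> commit.
print(should_commit_to_step(contact_probability(0.7, 0.9, 0.8)))  # True
# A light twig brushed mid-swing: weak cues -> pull back and swing on.
print(should_commit_to_step(contact_probability(0.2, 0.3, 0.1)))  # False
```

The threshold is what separates the twig from the rock: a weak, transient force reading never accumulates enough evidence to trigger stance, so the leg carries through its swing instead of destabilizing the gait.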
The researchers tested the algorithm in experiments with the Cheetah 3 trotting on a laboratory treadmill and climbing on a staircase. Both surfaces were littered with random objects such as wooden blocks and rolls of tape. “It doesn’t know the height of each step, and doesn’t know there are obstacles on the stairs, but it just plows through without losing its balance,” Kim said. “Without that algorithm, the robot was very unstable and fell easily.”
The robot also uses a model-predictive control algorithm, which predicts how much force a given leg should apply once it has committed to a step. “The contact detection algorithm will tell you, ‘this is the time to apply forces on the ground,’” Kim said. “But once you’re on the ground, now you need to calculate what kind of forces to apply so you can move the body in the right way.”
This algorithm – which makes calculations for each leg 20 times a second – also helps the robot deal with unexpected collisions. “Say someone kicks the robot sideways,” explained Kim. “When the foot is already on the ground, the algorithm decides, ‘How should I specify the forces on the foot? Because I have an undesirable velocity on the left, so I want to apply a force in the opposite direction to kill that velocity. If I apply 100 newtons in this opposite direction, what will happen a half second later?’”
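The kick-recovery reasoning Kim describes can be reduced to a toy point-mass calculation: pick the foot force that cancels the unwanted velocity over a short horizon, then predict the result. This is a simplified sketch, not the actual controller; the mass, velocity, and half-second horizon are assumed numbers chosen to echo the figures in the quote.

```python
# Hypothetical point-mass sketch of the "kick recovery" prediction:
# given an undesired sideways velocity, choose an opposing force and
# predict the body velocity half a second later. Numbers are invented.

def predict_velocity(v_now, force, mass, dt):
    # Simple impulse on a point mass: v' = v + (F / m) * dt
    return v_now + (force / mass) * dt

def counter_force(v_undesired, mass, horizon=0.5):
    # The force that cancels the unwanted velocity over the horizon.
    return -mass * v_undesired / horizon

mass = 45.0        # kg, rough guess for a Labrador-sized robot
v_sideways = 1.1   # m/s of unwanted leftward velocity after a kick

f = counter_force(v_sideways, mass)  # close to the ~100 N in the quote
v_later = predict_velocity(v_sideways, f, mass, 0.5)
print(round(f, 1), round(v_later, 3))  # -99.0 0.0
```

The real controller repeats this kind of prediction for every leg 20 times a second and over a full rigid-body model rather than a point mass, but the core loop is the same: propose forces, simulate a short way ahead, keep the forces whose predicted outcome restores balance.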
In experiments, researchers introduced unexpected forces by kicking and shoving the robot as it trotted on a treadmill, and yanking it by the leash as it climbed up an obstacle-laden staircase. They found that the model-predictive algorithm enabled the robot to quickly produce counter-forces to regain its balance and keep moving forward, without tipping too far in the opposite direction.