Smart tablet gives fingertip control of swarm robots

Researchers have used a smart tablet and a red beam of light to create a system that allows people to control a fleet of robots with the swipe of a finger.

The system, developed at the Georgia Institute of Technology in Atlanta, US, lets the user tap the tablet to control where the beam of light appears on the floor.

Swarm robots then roll toward the illumination, constantly communicating with each other and deciding how to evenly cover the lit area.

When the person swipes the tablet to drag the light across the floor, the robots follow. If the operator puts two fingers in different locations on the tablet, the machines split into teams and repeat the process.
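
One way to picture the interface is that each projected touch point becomes a bright spot of light on the floor, so two fingers produce two spots for the swarm to divide itself between. The short Python sketch below illustrates that idea as a sum of Gaussian bumps; the function name, grid and spot width are illustrative assumptions, not code from the Georgia Tech system.

```python
import numpy as np

def touches_to_density(touch_points, grid_x, grid_y, sigma=0.3):
    """Illustrative mapping from projected touch points to a 'light' field.

    Each touch becomes a Gaussian bright spot on the floor; two fingers
    produce two spots, which is what lets the swarm split into two teams.
    (Hypothetical helper, not the Georgia Tech implementation.)
    """
    density = np.zeros_like(grid_x)
    for px, py in touch_points:
        density += np.exp(-((grid_x - px) ** 2 + (grid_y - py) ** 2)
                          / (2 * sigma ** 2))
    return density

# A 2 m x 2 m floor sampled on a grid, with two fingers on the tablet.
xs, ys = np.meshgrid(np.linspace(0.0, 2.0, 100), np.linspace(0.0, 2.0, 100))
phi = touches_to_density([(0.5, 0.5), (1.5, 1.4)], xs, ys)
```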

The new Georgia Tech algorithm behind this system demonstrates the potential of easily controlling large teams of robots, a capability relevant to manufacturing, agriculture and disaster areas.

“It’s not possible for a person to control a thousand or a million robots by individually programming each one where to go,” Magnus Egerstedt, Schlumberger Professor in Georgia Tech’s School of Electrical and Computer Engineering, said in a statement. “Instead, the operator controls an area that needs to be explored. Then the robots work together to determine the best ways to accomplish the job.”

Egerstedt envisions a scenario in which an operator sends a large fleet of machines into a specific area of a disaster-ravaged region. The robots could search for survivors, dividing the area into equal sections among themselves. If some machines were suddenly needed in a new area, a single person could quickly redeploy them.

The Georgia Tech model differs from many other robotic coverage algorithms because it is not static. It is flexible enough to effectively let the robots ‘change their minds’, rather than just perform the single job they are programmed to do.
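
Coverage controllers of this kind typically have each robot repeatedly step toward the density-weighted centroid of the patch of floor closest to it; because the density (here, the projected light) is re-evaluated at every step, the robots redistribute themselves whenever the operator drags the light. The Python sketch below is a minimal, discretised version of that idea, assuming a simple Lloyd-style update; the function, gain and grid resolution are illustrative assumptions rather than the paper’s exact controller.

```python
import numpy as np

def coverage_step(robots, grid_pts, density, gain=0.5):
    """One Lloyd-style update: each robot steps toward the density-weighted
    centroid of the grid points closest to it. A sketch of coverage control
    under a (possibly time-varying) density, not the published controller."""
    # Assign each grid point to its nearest robot (a discrete Voronoi partition).
    dists = np.linalg.norm(grid_pts[:, None, :] - robots[None, :, :], axis=2)
    owner = np.argmin(dists, axis=1)

    new_robots = robots.copy()
    for i in range(len(robots)):
        mask = owner == i
        w = density[mask]
        if w.sum() > 1e-9:
            centroid = (w[:, None] * grid_pts[mask]).sum(axis=0) / w.sum()
            new_robots[i] += gain * (centroid - robots[i])  # move toward centroid
    return new_robots

rng = np.random.default_rng(0)
robots = rng.uniform(0.0, 2.0, size=(6, 2))          # six robots on a 2 m floor
pts = np.stack(np.meshgrid(np.linspace(0, 2, 50),
                           np.linspace(0, 2, 50)), axis=-1).reshape(-1, 2)
light = np.exp(-np.sum((pts - np.array([1.0, 1.0])) ** 2, axis=1) / 0.1)

# Re-running the step with a new `light` array each frame (as the operator
# drags the spot) is what makes the coverage time-varying.
for _ in range(50):
    robots = coverage_step(robots, pts, light)
```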

“The field of swarm robotics gets difficult when you expect teams of robots to be as dynamic and adaptive as humans,” Egerstedt said. “People can quickly adapt to changing circumstances, make new decisions and act. Robots typically can’t. It’s hard for them to talk and form plans when everything is changing around them.”

In the Georgia Tech demonstration, each robot constantly measures how much light is in its local ‘neighbourhood’ and communicates with its neighbours. When there is too much light in its area, the robot moves away so that another can acquire some of its light.

“The robots are working together to make sure that each one has the same amount of light in its own area,” said Egerstedt.
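
That behaviour amounts to a local balancing rule: each robot measures its own share of light, compares it with its immediate neighbours, and gives ground when it has more than they do. The one-dimensional toy below illustrates the idea, assuming a simple step rule and hand-picked parameters; it is a sketch of the behaviour described above, not the actual Georgia Tech controller.

```python
import numpy as np

def balance_step(positions, grid, light, gain=0.05):
    """One round of the 'equal share of light' idea, in one dimension.

    Each robot owns the grid points nearest to it, measures the light mass
    in that patch, compares it with its immediate neighbours, and steps
    away from a neighbour that has less light (ceding territory to it).
    A toy illustration, with an assumed step rule and gain."""
    positions = np.sort(positions)
    owner = np.argmin(np.abs(grid[:, None] - positions[None, :]), axis=1)
    mass = np.array([light[owner == i].sum() for i in range(len(positions))])

    step = np.zeros_like(positions)
    for i in range(len(positions)):
        for j in (i - 1, i + 1):                 # left and right neighbours
            if 0 <= j < len(positions):
                step[i] += (mass[i] - mass[j]) * np.sign(positions[i] - positions[j])
    return np.clip(positions + gain * step, grid[0], grid[-1])

grid = np.linspace(0.0, 1.0, 500)
light = np.exp(-((grid - 0.7) ** 2) / 0.01)      # a bright spot near x = 0.7
light /= light.sum()                             # total 'light mass' of 1
robots = np.array([0.1, 0.3, 0.5, 0.9])
for _ in range(200):
    robots = balance_step(robots, grid, light)
print(np.round(robots, 2))                       # robots crowd around the spot
```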

The tablet-based control system was designed with everyone in mind. Anyone can control the robots, even if they do not have a background in robotics.

“In the future, farmers could send machines into their fields to inspect the crops,” said Georgia Tech PhD candidate Yancy Diaz-Mercado. “Workers on manufacturing floors could direct robots to one side of the warehouse to collect items, then quickly direct them to another area if the need changes.”

A paper about the control system, ‘Multi-Robot Control Using Time-Varying Density Functions’, has been published in IEEE Transactions on Robotics (T-RO).