According to researchers, a drone trained with their algorithm flew through a simple obstacle course up to 20 per cent faster than a drone trained on conventional planning algorithms. The research has been published in the International Journal of Robotics Research.
The algorithm was tested in the context of autonomous drone racing, where crashes are common due to unpredictable aerodynamics at high speeds. The team believes the algorithm could also improve drones’ performance in time-critical operations and complex environments beyond the race course, such as searching for survivors after a natural disaster.
Researchers said the new algorithm did not always keep a drone ahead of its competitor throughout a race; in some cases it chose to slow the drone down to handle tight curves, or to conserve energy that it later used to speed up and overtake its rival.
“When you’re flying fast, it’s hard to estimate where you are,” said MIT graduate student Gilhyun Ryou. “There could be delays in sending a signal to a motor, or a sudden voltage drop which could cause other dynamics problems. These effects can’t be modelled with traditional planning approaches.”
Training a drone to account for how high-speed aerodynamics affect it in flight typically involves many lab experiments, which often end in crashes and quickly become expensive. For this reason, the MIT team’s algorithm aims to minimise the number of experiments needed to identify safe and fast flight paths.
Researchers reportedly started with a physics-based flight planning model, developed to first simulate how a drone is likely to behave while flying through a virtual obstacle course. They then simulated thousands of racing scenarios with different flight paths and speed patterns, charted whether each scenario was feasible or infeasible, and could then focus on a handful of the most promising scenarios to try out in the lab.
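The simulate-and-filter process described above can be sketched in a few lines: sample many candidate flight plans in a physics-based model, mark each as feasible or infeasible, and keep only the most promising handful for real-world trials. The cost model, feasibility rule, and parameter ranges below are invented for illustration and are not the team’s actual dynamics model.

```python
import random

def simulate(speed, aggressiveness):
    """Toy stand-in for a physics-based flight simulator.

    Returns (lap_time, feasible): faster, more aggressive plans finish
    sooner but risk infeasibility (a crash) past a made-up stability limit.
    """
    lap_time = 100.0 / speed + 5.0 * (1.0 - aggressiveness)
    feasible = speed * aggressiveness < 12.0  # hypothetical stability limit
    return lap_time, feasible

def pick_candidates(n_scenarios=10_000, top_k=5, seed=0):
    """Simulate many racing scenarios; return the fastest feasible few."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_scenarios):
        speed = rng.uniform(1.0, 20.0)   # m/s, hypothetical range
        aggr = rng.uniform(0.0, 1.0)     # how tightly corners are cut
        lap_time, feasible = simulate(speed, aggr)
        if feasible:                     # chart and discard infeasible plans
            results.append((lap_time, speed, aggr))
    results.sort()                       # fastest feasible plans first
    return results[:top_k]               # a handful to try out in the lab

for lap_time, speed, aggr in pick_candidates():
    print(f"lap={lap_time:5.2f}s speed={speed:5.2f} aggr={aggr:.2f}")
```

The point of the shortlist is that only these few candidates ever reach the physical training space, keeping the number of crash-prone real experiments small.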
To demonstrate the approach, the team said they simulated a drone flying through a simple course with five large, staggered square obstacles. They set up the same configuration in a physical training space and programmed a drone to fly through the course at speeds and trajectories picked out from their simulations. They also ran the same course with a drone trained on a more conventional algorithm that does not incorporate experiments into its planning.
The drone trained on the new algorithm ‘won’ every race, completing the course faster than the conventionally trained drone — in some cases, finishing the course 20 per cent faster than its competitor despite a slower start.
Researchers plan to run more experiments, at faster speeds and through more complex environments, to further improve the algorithm. They may also incorporate flight data from human pilots who race drones remotely, whose decisions and manoeuvres could help the algorithm focus on faster yet still feasible flight plans.