MIT researchers have developed a team of autonomous forest search-and-rescue drones that operate under dense forest canopies using onboard computation and wireless communication.

In a paper presented at the International Symposium on Experimental Robotics (Nov 5-8), MIT researchers describe their autonomous system, which overcomes the limitations of GPS in forest environments.
Each autonomous quadrotor is equipped with laser range-finders for localisation and path planning. As it flies, each drone builds an individual 3D map of the terrain. Algorithms help it recognise unexplored and already-searched areas, and an off-board ground station fuses the individual maps from multiple drones into a global 3D map that human rescuers can monitor.
In a real-world implementation, though not in the current system, the drones would come equipped with object detection to identify a missing hiker. When located, the drone would tag the hiker’s location on the global map. Humans could then use this information to plan a rescue mission.
“Essentially, we’re replacing humans with a fleet of drones to make the search part of the search-and-rescue process more efficient,” said first author Yulun Tian, a graduate student in the Department of Aeronautics and Astronautics.
The researchers tested multiple drones in simulations of randomly generated forests, and tested two drones in a real forested area. In both experiments, each drone mapped an area of roughly 20 square metres in about two to five minutes, and the system fused the individual maps in real time. The drones are also said to have performed well across several metrics, including overall speed and time to complete the mission, detection of forest features, and accurate merging of maps.
A LIDAR system is mounted on each drone to create a 2D scan of the surroundings. To distinguish between individual trees – and increase exploratory efficiency – the researchers programmed their drones to identify multiple trees’ orientations. With this method, when the LIDAR signal returns a cluster of trees, an algorithm calculates the angles and distances between trees to identify that cluster.
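The article doesn't publish the algorithm itself, but the angle-and-distance idea can be sketched as a rotation-invariant signature built from pairwise distances between detected trunks. A minimal sketch — the function name, rounding tolerance, and coordinates below are our assumptions, not the researchers':

```python
import itertools
import math

def cluster_signature(trees, tol=2):
    """Build a rotation-invariant signature for a cluster of tree
    positions (x, y) detected in a 2D LIDAR scan.

    Pairwise distances don't change with the drone's heading, so two
    scans of the same trees yield the same signature regardless of
    approach direction. Distances are rounded to `tol` decimal places
    to absorb a little sensor noise.
    """
    dists = sorted(
        round(math.dist(a, b), tol)
        for a, b in itertools.combinations(trees, 2)
    )
    return tuple(dists)

# The same three trees seen from two different headings...
scan_a = [(0.0, 0.0), (3.0, 0.0), (0.0, 4.0)]
# ...the cluster rotated 90 degrees, as seen on a later visit.
scan_b = [(0.0, 0.0), (0.0, 3.0), (-4.0, 0.0)]

print(cluster_signature(scan_a) == cluster_signature(scan_b))  # True
```

Matching a signature against previously stored ones is what would let a drone decide "I've seen this cluster before."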
“Drones can use that as a unique signature to tell if they’ve visited this area before or if it’s a new area,” Tian said.
This feature-detection technique helps the ground station accurately merge maps. The drones generally explore an area in loops, producing scans as they go that are continuously monitored by the ground station. When two drones loop around to the same cluster of trees, the ground station merges the maps by calculating the relative transformation between the drones, and then fusing the individual maps to maintain consistent orientations.
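In 2D, the relative transformation the ground station computes is a rotation plus a translation. A minimal least-squares sketch of estimating it from matched tree positions — this is a standard rigid-alignment fit, the names are ours, and the actual system works on full 3D maps:

```python
import math

def relative_transform(points_a, points_b):
    """Estimate the rigid transform (rotation theta, translation tx, ty)
    mapping landmark coordinates in drone A's map onto the same
    landmarks in drone B's map, by least squares over matched pairs."""
    n = len(points_a)
    cax = sum(x for x, _ in points_a) / n
    cay = sum(y for _, y in points_a) / n
    cbx = sum(x for x, _ in points_b) / n
    cby = sum(y for _, y in points_b) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(points_a, points_b):
        ax, ay = ax - cax, ay - cay
        bx, by = bx - cbx, by - cby
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)
    # Translation moves A's rotated centroid onto B's centroid.
    tx = cbx - (cax * math.cos(theta) - cay * math.sin(theta))
    ty = cby - (cax * math.sin(theta) + cay * math.cos(theta))
    return theta, tx, ty

# Drone B's landmarks are drone A's, rotated 90° and shifted by (1, 1).
a = [(0.0, 0.0), (2.0, 0.0), (0.0, 1.0)]
b = [(1.0, 1.0), (1.0, 3.0), (0.0, 1.0)]
theta, tx, ty = relative_transform(a, b)
print(round(math.degrees(theta)), round(tx, 3), round(ty, 3))  # 90 1.0 1.0
```

Applying the recovered transform to every point of one map expresses both maps in a single frame, which is the "consistent orientations" step the article describes.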
“Calculating that relative transformation tells you how you should align the two maps so it corresponds to exactly how the forest looks,” Tian said.
In the ground station, robotic navigation software called “simultaneous localisation and mapping” (SLAM) – which maps an unknown area and keeps track of an agent inside it – uses the LIDAR data to localise and capture the position of the drones, helping it to accurately fuse the maps.
The result is a map with 3D terrain features. Trees appear as blocks coloured in shades of blue to green, depending on height. Unexplored areas are dark but turn grey as a drone maps them. On-board path-planning software directs each drone to keep exploring these dark, unexplored areas as it flies.
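That "always head for the dark areas" behaviour is essentially frontier-based exploration. A minimal sketch on an occupancy grid — the cell convention and names below are our assumptions, not the paper's:

```python
from collections import deque

# UNKNOWN cells are the "dark" areas; FREE cells are mapped ("grey").
UNKNOWN, FREE, OCCUPIED = 0, 1, 2

def next_frontier(grid, start):
    """Breadth-first search from the drone's current cell through free
    space, returning the nearest free cell adjacent to unknown space -
    i.e. the closest "dark" area still to be explored."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            if grid[nr][nc] == UNKNOWN:
                return (r, c)  # this free cell borders unexplored space
            if grid[nr][nc] == FREE and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return None  # map fully explored

grid = [
    [FREE, FREE, UNKNOWN],
    [FREE, OCCUPIED, UNKNOWN],
    [FREE, FREE, FREE],
]
print(next_frontier(grid, (0, 0)))  # (0, 1): nearest cell bordering dark space
```

Repeatedly flying to the returned cell and rescanning shrinks the dark regions until `next_frontier` returns `None`.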
According to MIT, a current limitation for practical forest search and rescue use is that the drones still must communicate with an off-board ground station for map merging. In their outdoor experiment, the researchers set up a wireless router that connected each drone and the ground station. In the future, they hope to design the drones to communicate wirelessly when approaching one another, fuse their maps, and then cut communication when they separate. The ground station would then be used to monitor the updated global map.
I volunteer in SAR work, so I've an interest in this beyond the usual engineering side of things.
This looks very interesting, but there are a lot of unanswered questions here.
The ‘object detection’ is rather skimmed over here, and this mostly appears to be a mapping tool. We’ve already got maps. How does the object detection work, and how does it filter out wildlife? If it’s just for mapping, how does it distinguish between clearings and lakes, for example? How does the ‘tree angle measuring’ work when everything is blowing about?
SAR rarely seems to happen on a nice sunny day – it's usually 2am, cold, snowing and/or hailing with a 50mph wind. What are the limits to the drones' flight and detection capability? It'll be fun to launch one of these and immediately watch it get blown 30 miles away. The cold is a killer to battery life too.
Talking of which – how heavy are these, and what is the range? I've got enough kit to carry as it is, and I'm not going to be popular if I'm asking my team to hump an extra 10kg of drone in and out.
And the important one – comms. I’m 30km out in the hills, it’s howling a gale, dark, under trees, and the VHF and GPS reception is poor, and don’t even think about using mobile phones. How do I get the details the drones are collecting?