3D mapping backpack assists with military planning

A portable laser backpack for 3D mapping has been developed at the University of California, Berkeley, where it is being described as a breakthrough technology capable of producing fast, realistic 3D maps of interior environments.

Research leading to the development of the reconnaissance backpack was funded by the US Air Force Office of Scientific Research and the Army Research Office under the guidance of programme managers Dr Jon Sjogren (AFOSR) and Dr John Lavery (ARO).

The backpack is claimed to be the first such system that works without being strapped to a robot or mounted on a cart. Its data acquisition is also fast, because the data is collected while the human operator walks; existing systems, by contrast, can take days or weeks to acquire the same data.

The portable laser backpack for 3D mapping, developed at the University of California, Berkeley, could help with reconnaissance missions

Using this technology, air force personnel will be able to collectively view the interior of modelled buildings and interact over a network in order to achieve military goals such as mission planning.

Under the direction of Dr Avideh Zakhor, lead researcher and UC Berkeley professor of electrical engineering, the scientists achieved this portability by using lightweight laser scanners that weigh less than 8oz.

‘We have also developed novel sensor fusion algorithms that use cameras, lasers, range finders and inertial measurement units to generate a textured, photo-realistic 3D model that can operate without GPS input − and that is a big challenge,’ said Zakhor.
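The article does not detail the team's fusion algorithms, but the general principle of combining a drift-prone inertial sensor with a noisy drift-free one (one way to estimate orientation without GPS) can be sketched with a simple complementary filter. The function name and parameters here are illustrative assumptions, not the researchers' actual code:

```python
def complementary_filter(gyro_rates, accel_pitches, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    gyro_rates: angular rates (rad/s); accel_pitches: pitch angles (rad)
    derived from the gravity vector. The gyro is accurate over short
    intervals but drifts; the accelerometer is noisy but drift-free.
    alpha weights the integrated gyro against the accelerometer.
    """
    pitch = accel_pitches[0]
    estimates = [pitch]
    for rate, accel_pitch in zip(gyro_rates[1:], accel_pitches[1:]):
        # Integrate the gyro rate, then pull the estimate gently
        # toward the accelerometer's absolute reading.
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates
```

A constant gyro bias, for instance, produces only a small bounded offset rather than unbounded drift, which is why this kind of blending is a common baseline for GPS-free attitude estimation.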

Several basic research problems are said to stand between the current system and a fully working one, including sensor calibration, registration and localisation.

Using multiple sensors facilitates the modelling process, but the data from the various sensors must be registered and precisely fused with one another to produce coherent, aligned and textured 3D models.
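The registration step described above amounts to finding the rigid transform that maps one sensor's measurements into another's frame. A minimal, standard way to do this for corresponding 3D points is the Kabsch (orthogonal Procrustes) solution; this is a generic sketch, not the Berkeley team's algorithm:

```python
import numpy as np

def rigid_register(source, target):
    """Estimate the rotation R and translation t that best align
    `source` points onto `target` in the least-squares sense.
    Both inputs are (N, 3) arrays of corresponding points."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    # Correct the sign so R is a proper rotation, not a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Given point correspondences from two sensors, `R` and `t` bring their data into a single frame so the fused model stays coherent and aligned.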

Localisation is another technical challenge: without it, the scans from the laser scanners cannot be lined up to build the 3D point cloud, which is the first step in the modelling process.
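Once a localisation estimate exists for each scan, lining the scans up is a matter of transforming every scan's points into a common world frame and merging them. A minimal sketch of that assembly step, with illustrative names and assuming each pose is a rotation-translation pair:

```python
import numpy as np

def assemble_point_cloud(scans, poses):
    """Merge per-scan points into one global point cloud.

    scans: list of (N_i, 3) arrays of points in the sensor frame.
    poses: list of (R, t) localisation estimates, one per scan,
           mapping the sensor frame into the world frame.
    """
    world = [points @ R.T + t for points, (R, t) in zip(scans, poses)]
    return np.vstack(world)
```

Errors in the poses show up directly as misaligned walls and doubled surfaces in the merged cloud, which is why localisation is singled out as a prerequisite for the rest of the modelling pipeline.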

Looking ahead, the scientists plan to model entire buildings and to develop interactive viewers that allow users to walk through a building virtually before visiting it in person.