Developing autonomous fighting machines

War machines: Accelerating the development of autonomous defence systems raises both ethical and technological questions

If you’re the sort of person who takes Hollywood blockbusters seriously, then defence research is an inherently risky business. Advanced weaponry rarely works and it’s left to plucky mavericks to save the day. Riskiest of all are autonomous robotic defence systems; according to the Terminator films, they’re bound to become self-aware, decide that humanity is a threat and unleash a nuclear Armageddon to wipe pesky Homo sapiens off the face of the planet.

In reality, autonomous systems are firmly established as an integral part of the armed forces’ arsenal. For instance, unmanned aerial vehicles (UAVs), often referred to as drones, are regularly used for reconnaissance and attack missions: BAE Systems’ Mantis is among the systems carrying out observation, while the General Atomics Predator can deliver missiles and other weaponry.

What’s more, the use of autonomous systems in defence looks set to increase in the coming years. The UK’s armed forces have asked for studies of numerous scenarios involving unmanned vehicles both on the ground and in the air, with the aim of removing humans, wherever possible, from hazardous situations.

Sharkey thinks that, in a practical sense, control will be via ’a kind of autonomy’. He said: ’It’ll move at very high speed and the operator will get snapshots with the occasional chance of veto.’

Within the defence industry, attention is turning to control systems for unmanned vehicles, but the focus is currently more on the operability of the system than matters such as target identification. ’The control systems for the Future Protected Vehicle concepts are the most important systems on board,’ said Awad. ’We’re concentrating heavily on non-line-of-sight communication and, even more critically, the navigational algorithms. What the vehicle looks like is immaterial to this; we’re working on vehicle-agnostic systems.’

Awad’s approach is a multi-level control system with differing levels of intelligence. ’All of the vehicles would have the basic instruction set and we’d add to that depending on the complexity of the task. But as soon as you have machines working alongside people, that task becomes very different and very, very difficult.’
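To make the idea concrete, here is a minimal sketch, in Python, of how such a multi-level stack might be composed: a basic instruction set that every vehicle carries, with optional layers wrapped around it as the task demands. The class names, numbers and veto behaviour are illustrative assumptions for the sketch, not details of Awad’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Command:
    steering: float = 0.0   # radians; positive steers left
    throttle: float = 0.0   # 0..1

class BaseInstructionSet:
    """Lowest level: simple waypoint following that every vehicle would carry."""
    def decide(self, state: dict) -> Command:
        heading_error = state["waypoint_bearing"] - state["heading"]
        return Command(steering=0.5 * heading_error, throttle=0.6)

class CollisionAvoidanceLayer:
    """Optional layer: overrides the base command when an obstacle gets close."""
    def __init__(self, inner):
        self.inner = inner
    def decide(self, state: dict) -> Command:
        cmd = self.inner.decide(state)
        if state.get("nearest_obstacle_m", float("inf")) < 10.0:
            cmd = Command(steering=cmd.steering + 0.8, throttle=0.2)  # swerve and slow down
        return cmd

class HumanVetoLayer:
    """Top layer: the operator's occasional chance of veto always wins."""
    def __init__(self, inner):
        self.inner = inner
    def decide(self, state: dict) -> Command:
        if state.get("operator_veto", False):
            return Command(steering=0.0, throttle=0.0)  # stop and wait for the operator
        return self.inner.decide(state)

# The stack, not the platform, defines the behaviour, which is one reading of
# "vehicle-agnostic": compose more layers only when the task demands them.
controller = HumanVetoLayer(CollisionAvoidanceLayer(BaseInstructionSet()))
print(controller.decide({"waypoint_bearing": 0.3, "heading": 0.1,
                         "nearest_obstacle_m": 25.0, "operator_veto": False}))
```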

Unmanned vehicles tend to operate in a very controlled environment, Awad explained; for example, UAVs only fly in regions where they will not encounter any enemy aircraft. ’With ground systems, if there are any humans walking about, you need to know where they are. The big area for improvement is to make sure the vehicle navigation algorithms are very well tested and robust. At the moment, we’re working on collision avoidance and we’re only doing that with unarmed vehicles.’
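As a rough illustration of the kind of calculation a collision-avoidance routine performs when people are walking near a ground vehicle, here is a minimal potential-field sketch: the vehicle is pulled towards its goal and pushed away from every tracked person nearby. The method and every number in it are assumptions made for the example, not details of the programme Awad describes.

```python
import math

def steering_vector(vehicle, goal, people, influence_radius=8.0):
    """Sum a unit attraction towards the goal with repulsions away from every
    tracked person inside the influence radius (all positions in metres)."""
    ax, ay = goal[0] - vehicle[0], goal[1] - vehicle[1]
    norm = math.hypot(ax, ay) or 1.0
    fx, fy = ax / norm, ay / norm              # pull towards the goal
    for px, py in people:
        dx, dy = vehicle[0] - px, vehicle[1] - py
        d = math.hypot(dx, dy)
        if 0.0 < d < influence_radius:
            push = (1.0 / d - 1.0 / influence_radius) / d**2
            fx += push * dx / d                # push away from the person,
            fy += push * dy / d                # growing sharply as d shrinks
    return fx, fy

# Vehicle at the origin, goal 20m ahead, one pedestrian just off the planned path.
print(steering_vector((0.0, 0.0), (20.0, 0.0), [(5.0, 1.0)]))
```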

When it comes to more complex missions, however, the concept of a robot running an artificial-intelligence system is, Sharkey thinks, unlikely. ’You need human reasoning behind it. You can’t put the Geneva Convention into a computer; it isn’t specified for a computer and it contains ambiguity and apparent contradictions that need human reasoning to make sense of. We have no idea how reasoning works or even where it comes from.’

But one of the biggest ethical questions is what the use of unmanned systems does to the concept of war. The DCDC report goes into detail about this: using unmanned systems on the battlefield removes the element of risk to personnel. ’This raises a number of areas for debate, not least being the school of thought that suggests that for war to be moral (as opposed to just legal) it must link the killing of enemies with an element of self-sacrifice, or at least risk to oneself,’ the report says.

This poses the risk of war becoming too easy. ’Here’s the big danger,’ said Sharkey. ’About 46 countries are developing robotic defence systems and there’s been no international discussion whatsoever. I can easily picture aerial robots dropping ground robots and using a few special forces to guide them, but that lowers the barriers for going to war. Nobody wants to see body bags, but if we don’t have the potential for body bags, we can go to war whenever and wherever we like.’

flying solo - birds of prey

Vulture II and SolarEagle are just two of the UAVs being developed for autonomous defence

Among the applications being developed for autonomous defence is DARPA’s Vulture II programme, for which the agency has awarded an $89m (£54m) contract to Boeing. This involves developing a UAV that can stay on station for a long period, observing a location and reporting back to base. Boeing is developing a solar-electric craft called SolarEagle, which will be able to stay in the stratosphere for five years.

SolarEagle is a craft with a 400ft (122m) wingspan, with solar cells in its wings that generate electricity during the day and fuel cells that store the power for use at night. ’It’s a daunting task, but Boeing has a highly reliable solar-electric design that will meet the challenge in order to perform persistent communications, intelligence, surveillance and reconnaissance missions from altitudes of more than 60,000ft [18.3km],’ said programme manager Pat O’Neil. First flights are planned for 2014, with a one-month flight as the first endurance target.
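A back-of-envelope energy balance shows why the day/night cycle is the crux of such a design: the wing must harvest enough energy in daylight to fly all day and recharge the fuel cells for the night. Every figure in the sketch below apart from the 122m wingspan is an illustrative assumption, not a Boeing or DARPA specification.

```python
# All figures below except the 122m wingspan are assumptions for the sketch.
wingspan_m       = 122.0    # from the article
chord_m          = 3.0      # assumed mean wing chord
cell_efficiency  = 0.25     # assumed solar-cell efficiency
solar_flux_w_m2  = 800.0    # assumed average daylight flux on the wing
daylight_hours   = 12.0
night_hours      = 12.0
cruise_power_kw  = 15.0     # assumed power to hold station at ~60,000ft
storage_rt_eff   = 0.55     # assumed round-trip efficiency of the fuel cells

wing_area_m2   = wingspan_m * chord_m
harvest_kwh    = wing_area_m2 * solar_flux_w_m2 * cell_efficiency * daylight_hours / 1000.0
day_use_kwh    = cruise_power_kw * daylight_hours
night_need_kwh = cruise_power_kw * night_hours / storage_rt_eff   # drawn from storage

margin_kwh = harvest_kwh - day_use_kwh - night_need_kwh
print(f"harvested {harvest_kwh:.0f} kWh, needed {day_use_kwh + night_need_kwh:.0f} kWh, "
      f"margin {margin_kwh:.0f} kWh per 24h cycle")
# A positive margin on every 24-hour cycle is what lets endurance stretch to years.
```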

SolarEagle would carry a variant of a video-capture system called Gorgon Stare. This is a spherical array of nine cameras, providing an extremely wide-angle view of an area as large as a city. SolarEagle would act as a long-term ’spy in the sky’, monitoring people and vehicles, with unusual patterns triggering more detailed reconnaissance or offensive action.

brain wave - unmanned territory

Systems inspired by the human mind could help machines to make battlefield decisions

An intriguing DARPA programme aims to create electronic systems inspired by the human brain to help unmanned systems make decisions on the battlefield. Dubbed SYNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics), the project is developing systems with integrated communication networks linking dense islands of processing devices, similar to the way that neurons and synapses are organised in the brain.

The project has already completed a first phase, in which researchers built nanometre-scale components that could adapt the connection strength between two ’electronic neurons’. The next phase is more challenging: to move this organisation into a larger-scale system and to develop organisation between groups of synapses and neurons.
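The behaviour demonstrated in that first phase can be pictured with a simple software toy: a single connection whose strength grows when the two neurons it links are active together and slowly decays otherwise. The update rule and numbers below are illustrative assumptions; the programme itself implements this kind of adaptation in nanometre-scale hardware rather than code.

```python
import random

weight = 0.5            # synaptic strength, kept within [0, 1]
LEARN_RATE = 0.05
DECAY = 0.01

def step(pre_fired: bool, post_fired: bool, w: float) -> float:
    """Strengthen the connection when both neurons fire together; let it decay otherwise."""
    if pre_fired and post_fired:
        return w + LEARN_RATE * (1.0 - w)   # correlated activity strengthens the synapse
    return w - DECAY * w                    # unused connections slowly weaken

random.seed(0)
for _ in range(200):
    pre = random.random() < 0.3
    post = pre and random.random() < 0.8    # the post-neuron usually follows the pre-neuron
    weight = step(pre, post, weight)
print(f"learned connection strength after 200 steps: {weight:.2f}")
```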

’So far, SYNAPSE has successfully demonstrated all the core hardware, architecture, simulation and evaluation capabilities needed for a new generation of intelligent electronic machines,’ said programme manager Todd Hylton. ’Now that all the building blocks are available, our next task is to start building functioning systems out of them.’