Editor
The military robot is on the rise. Indeed, according to the latest figures from the International Federation of Robotics (IFR), defence applications accounted for 40% of the total number of service robots sold last year.
Many of the robots used by the defence sector are relatively benign. A large number of devices are, for instance, used for IED detection or bomb disposal. But, in a technological trend that’s the cause of much ethical hand-wringing, an increasing number of them are being used for offensive purposes.
For military strategists, robot warriors represent an opportunity to put troops out of harm’s way and potentially to attack targets with far greater precision, thereby reducing civilian casualties. Meanwhile, for the engineers engaged in their development, the defence sector represents a tantalising proving ground for the technology and one of the few routes to market.
But many are deeply concerned about the implications of the rise of the robot. And it’s not just because their fears have been stoked by Hollywood. The increasingly popular US military strategy of using “drones” to carry out targeted killings is, for many, a striking example of why we should be worried about relying too heavily on robots to do our dirty work. Despite denials from Washington, accusations are growing that US drones are responsible for large numbers of civilian casualties, a concern that is backed up by a detailed report published this summer by Stanford and New York Universities.

We’re still a long way from the so-called Terminator scenario, although the fact that top experts, such as those gathered at this week’s Military Robotics conference in London, now regularly talk in po-faced terms about fleets of robot soldiers is a sign that the gap between science fiction and real life is getting narrower.
And while UAVs represent the most significant deployment of armed robots, the pace of technological development in the research and development community is astonishing.
Leading the charge is DARPA, the US defence department’s advanced research agency, which is driving the development of a range of advanced military robotic systems. Earlier this autumn, DARPA launched its robotic challenge, a competition set up to fast-track and stimulate the development of defence robots able to perform human tasks. At the heart of the competition is a company we’ve featured before in The Engineer, Boston Dynamics, whose Big Dog robot can be seen in action here.
As part of the current competition, the firm is now working on the development of ATLAS, an autonomous humanoid robot that will be used as a platform for other competitors to test software and artificial intelligence systems. You can view a video of the latest iteration of ATLAS here. Other robots being developed through the competition include Raytheon’s Guardian system – a lightweight humanoid robot – and Virginia Tech’s THOR (Tactical Hazardous Operations Robot).

It’s fair to say the systems being developed through the competition will do little to calm the nerves of those alarmed by the prospect of science fiction becoming fact.
But it’s also worth remembering that many of our greatest technical leaps – from the development of nuclear energy to the invention of the jet engine – have sprung from the incubator of military necessity. And while few of us would regard the rise of robot soldiers without a shiver of distaste, the script – unlike that of a Hollywood movie – is not yet written. In conquering the technical challenges of operating robots in a warzone, engineers are making fundamental breakthroughs that could one day benefit mankind in a host of more benign applications.
Are you worried about military robots? Vote in our online poll
In the 80s I assisted in a project with a major US contractor for the military. It was for the development of a 72-hour ‘suit’ to be worn by special forces, which would offer ballistic, NBC, heat, cold and trauma protection. It contained food and water supplies and waste collection! It had sat-nav and other transmission and receiving systems. We opined that the only offensive way for those inside to bother each other – if both sides had it – was to call each other rude names on the radio! Perhaps robo-squaddie will do the same.
Best
Mike B
I think that a halfway house will come first – powered exoskeletons. There have been plenty of visions in sci-fi literature and film: The Matrix, Joe Haldeman’s Forever War/Forever Peace, Starship Troopers…
Very disappointed the author didn’t take the opportunity to slip in a few gratuitous Terminator quotes, particularly the movie script line at the end. I definitely would have gone with “The future has not been written. There is no fate but what we make for ourselves”.
Readers may be interested in the conclusions of a retreat organised by the Engineering and Physical Sciences Research Council (EPSRC) in 2010 to consider the ethical implications of robotics.
http://www.epsrc.ac.uk/ourportfolio/themes/engineering/activities/Pages/roboticsretreat.aspx
This involved a number of leading scientists in the field, legal experts and social scientists.
I’d be more interested in innerspace nanobots killing cancer cells precisely.
With killings by drones, where does the judicial system of a democracy stand?
Will those drones be used to kill suspected terrorists, too? Or to listen to conversations of suspects, when the encrypted phone lines cannot be cracked?
Will people need a license and full CRB check for buying remote controlled toys?
I joke to my friends that robots will kill us all but in a sense it’s not really a joke.
Asimov’s three laws aren’t going to happen, I believe, because there is no way that all humans all over the world would stick to making robots that fitted that code. Even if most did, a significant few wouldn’t.
Robots will have to be able to make up their own minds and have initiative, creative thinking and so on. Whatever safeguards are created will have bugs, and there will also be one or two people actively trying to produce unguarded machines, because there is always an idiot somewhere.
When these smarter and smarter machines decide they don’t need us, we’ll be in trouble.
The most distasteful responsibilities in war are the largest ones, so it would be logical to see their delegation to machines as the desired endpoint of the process in train here, which appears to be a disengagement by humans from war.
If we delegate our unpalatable responsibilities, we have to live with the decisions of those we delegate to.
Humans generally end up doing the right thing, having tried all the alternatives.
It seems to me that in the last seventy years we’re starting to have to choose between alternatives which are not just unpalatable, but in some cases unsurvivable.
Andrew Troup is to be congratulated for enunciating a possible ‘code’ for engineers. Let us be clear: if we, who have the privilege of manipulating Nature’s laws to benefit all mankind (not, as too many other so-called professions do, man’s laws to the benefit of the highest bidder), were via our professional bodies simply to state that we will no longer research into and/or create the tools for conflict, that would be the end of such. I take the 70-year point well: up until August 1945 those in power, both political and military, could (and did) happily send millions of primarily young men in the ‘services’ to their deaths, and hardly concern themselves with the effect on civilian populations, knowing full well that ‘they’ personally and their families were very unlikely to suffer or be in receipt of death and destruction. I believe the prospect of their own lives ending involuntarily has concentrated their minds wonderfully in the interim.
We shall overcome?
Best
Mike B
I challenge the contention that this is military necessity, but I would be wrong if the opposition is developing the same technology. If they are (whoever “they” turn out to be), then we would have a new spin on the old concept of kings sending forth their champions to settle the dispute in single combat, i.e. by the hand of someone expendable.
But modern conflict doesn’t seem to be anywhere near the still-quite-recent concept of mass battles between sovereign states, even though the Iraqi debacle was a notable exception to that trend – at least briefly. Iraq and Afghanistan soon fell back into the more typical modern picture, where hard military resources on one side oppose a shadow “army” on the other, and so have, indisputably, become unwinnable. In short, the so-called war on terror is actually an attempt at wrestling smoke. Robots are no better wrestlers than men – maybe even worse!
The problem with using automatons driven by blokes in braces sitting at desks in Langley, Virginia, is that the chosen target in this “war” cannot fight back. The thuggish one-sidedness of stand-off cyber conflict cannot be countered by the weaker party, unless he uses stealth to get at the guy in Langley. His only way to exploit that weapon is actions of the 9-11 type, which the perpetrators of that particular outrage would describe as retaliation anyway, just as the west regarded the subsequent invasions of Afghanistan and Iraq. Look where retaliation got us.
I believe that the use of UAVs is already building this mood in the minds of the communities that are able to recruit suicide bombers, and it can only get worse if the concept is ever extended to ground forces of occupation. The answer is not to build substitutes for human forces of occupation, but to build the consensus that puts an end to the desire to occupy.
The greatest danger embodied in robot soldiers is that governments will ultimately use them to turn on their own citizens when those citizens don’t “cooperate” with government policies. Today, the soul of the human soldier that causes him NOT to act on an order to fire upon his fellow citizens at the order of his government is the last line of defense that we civilians have. No matter what the governing documents of a nation might say about use of its military for domestic purposes, when its government knows that all it has to do is push a button to use its national military to robotically suppress its citizenry, whatever personal freedoms exist will soon fade away.
We already have an army of robot controllers out there training daily on their Xboxes etc. Who will be brave (or stupid) enough to give these “virtual soldiers” the kit they need to do it for real?
I think it valuable to cross-link this article with the following, which shows what is currently possible.
This provides valuable depth and a ‘where we are now’ snapshot of this area.
Unfortunately, history has shown that it is only through military need that we have developed suitable technology to progress into our everyday lives, and that need inevitably brings a level of death and destruction before such technologies progress to a more useful arena.
https://www.theengineer.co.uk/video/us-engineers-show-off-humanoid-military-robot/1014600.article