The morality of the uses to which technology can be put is something that should concern every engineer. Engineering is all about consequences.
Architecture and morality
Our poll this week asks whether a ban or moratorium should be put in place on the development of armed autonomous systems, or ‘killer robots’ to use a less jargony term. It has sparked an interesting debate, taken up in a related viewpoint piece by Sheffield-based roboticist Noel Sharkey, about whether it’s appropriate for engineers to make ethical judgements about their work and its applications. Some respondents have said that this publication is not a suitable arena for such discussions.
Of course, this is a debate which I can’t resist sticking my oar into.
My immediate response is ‘yes, of course it’s appropriate.’ The opposite point of view is very much a politician’s opinion. During the Second World War, Winston Churchill opined that scientists (and we can assume, by extension, engineers) should be ‘on tap, not on top’. Such a point of view might have been fine during wartime, when every available mind was turned towards defeating a common enemy. But the attitude of ‘now you boffins go off and play with your toys and turn out what we tell you to, and don’t worry yourselves about what we do with them, there’s good chaps’ is not only patronising in the extreme; it also underestimates the ability of engineering and engineers to take into account factors other than the application of physical laws to real-world problems.
Not everyone agrees with this opinion. Recently, prompted by the BBC’s excellent docudrama about Richard Feynman’s part in the inquiry following the Challenger Space Shuttle disaster, I’ve been reading the great physicist’s books, and it’s interesting to see his take on the Manhattan Project. Feynman was a fairly junior player in the great drama at Los Alamos, but nonetheless spent a lot of time with the senior scientists — all of whom, he points out, were essentially working as engineers. Physics stopped for the duration of the war. Feynman’s view was that the physicists should have stopped working on the bomb when it became clear that Germany had been defeated, but they carried on because of their scientific enthusiasm for solving the problem. They became carried away, he said. In the same period, there is speculation that Werner Heisenberg, the physicist in charge of Germany’s atomic bomb project, deliberately led his team down blind alleys to keep the weapon out of the Nazi regime’s hands.
Of course, many would say Feynman was wrong: that the continued development of the bomb, and its subsequent use on Hiroshima and Nagasaki, shortened the war considerably and saved millions of lives, justifying the horrific effects of the bombs themselves and the decades of tension and uncertainty that followed. Does that imply that scientists and engineers shouldn’t involve themselves in the ethics of technology?
Again, I’d say no. The world of the defence industry and weapons manufacture is a particularly complex one, ethically speaking, and many would rather it wasn’t part of the UK economy. This is a false hope, I think: it’s never been a long way from tool to weapon, and even in the distant past the people who made ploughshares also made swords. But in the final analysis, who understands better what a weapon — or any other piece of technology — is capable of than the engineers who designed and built it? And why shouldn’t they express their views?
We’re constantly told that engineers have to be more entrepreneurial; to understand the drivers of economics and how they affect their work. Why should they not also understand the underlying drivers of sociology and ethics? Why should they be expected simply to divorce themselves from whatever applications their work might be put to?
Every engineer who takes a role in the defence sector has to make the decision whether that’s what he or she wants to do, and many are comfortable with it: visit Barrow-in-Furness, Samlesbury or Rosyth, to name but three sites, and you’ll find dedicated, skilled engineers who have no problem at all with building nuclear submarines, fighter aircraft and warships, and are proud of their work.
But to say that it isn’t an engineer’s place to think about the consequences of their work is nonsense. Engineering is all about the consequences of work. It is concerned with the fusion of ideas from a variety of disciplines and sources to reach a solution. To say that the morality of the uses a technology might be put to should not be part of that process — and, by extension, that an engineering journal shouldn’t discuss that aspect of the discipline — is ridiculous.