Banning Nuclear And Autonomous Weapons

“[They’re not] an answer to any of the threats that we face right now, be it climate change [or] terrorism. ... It’s only adding more fuel to an already dangerous world.”
What sort of society do we want to live in, and how much are we prepared to hand over to computers and machines?

By Ariel Conn

How does a weapon go from being one of the most feared to being banned? And what happens once a weapon is finally banned?

With tensions increasing between various nuclear-armed countries, and with the potential for a new arms race of lethal autonomous weapons, these questions seem more timely than ever. For answers, I turned to security and disarmament experts Miriam Struyk, programs director at PAX, and Richard Moyes, managing director of Article 36, both of whom have participated in efforts to ban landmines, cluster munitions, nuclear weapons and autonomous weapons.

The following interview has been heavily edited for brevity, but you can listen to it in its entirety here.

The United Nations is about to complete negotiations that are expected to result in an international ban on nuclear weapons as early as July 7. Why is this ban important, even if nuclear-armed states don’t sign?

Moyes: The use of a single nuclear weapon would potentially kill hundreds of thousands of people. The use of multiple nuclear weapons could have devastating impacts for society and the environment as a whole. These weapons should be illegal because their effects cannot be contained or managed in a way that avoids massive suffering. By changing that legal background, we’re potentially in a position to put much more pressure on [the nuclear] states to move towards disarmament as a long-term agenda.

Struyk: For too long, nuclear weapons were mythical, symbolic weapons, but we never spoke about what these weapons actually do and whether we think that’s illegal. This treaty brings back the question of what these weapons do, and do we want that? It also brings democratization of security policy. This process was brought about by several states, by NGOs [nongovernmental organizations], the ICRC [International Committee of the Red Cross] and other actors. It’s so important that it’s actually citizens speaking about nukes and whether we think they’re acceptable or not.

“These weapons should be illegal because their effects cannot be contained or managed in a way that avoids massive suffering.”

- Richard Moyes

What are autonomous weapon systems, and why are you worried about them?

Moyes: Autonomous weapons are really an issue of the challenges that new and emerging technologies present to society, particularly when they’re emerging in the military sphere — a sphere which is essentially about how we’re allowed to kill each other or how we’re allowed to use technologies to kill each other. Autonomous weapons are a movement in technology to a point where we will see computers and machines making decisions about where to apply force, about what objects to destroy, or about who to kill.

One of the ways we’ve sought to orientate to this is by thinking about the concept of meaningful human control. What are the human elements that we feel are important to retain? We are going to see more and more autonomy within military operations. But in certain critical functions around how targets are identified and how force is applied and over what period of time — those are areas where we will potentially see an erosion of a level of human, essentially moral, engagement that is fundamentally important to retain.

“Autonomous weapons are a movement in technology to a point where we will see computers and machines making decisions about where to apply force, about what objects to destroy, or about who to kill.”

- Richard Moyes

Struyk: It depends a lot on your definition, of course. I’m still, in a way, a bit of an optimist in saying that perhaps we can prevent the emergence of lethal autonomous weapon systems. But I also see some similarities between lethal autonomous weapons systems and what we had with nuclear weapons a few decades ago: this can lead to an arms race, to more global insecurity, and also to warfare.

An argument in favor of autonomous weapons is that, ideally, they could make better decisions than humans and reduce civilian casualties. How do you address that argument?

Struyk: We’ve had that debate with other weapon systems as well, where the technological possibilities turned out not to be what was promised as soon as the weapons were actually used.

It’s an unfair debate because it’s mainly conducted by states with developed industries, which are most likely to be the first to use some form of lethal autonomous weapons systems. Flip the question and ask, ‘What if these systems will be used against your soldiers or in your country?’ Suddenly you enter a whole different debate. I’m highly skeptical of people who say it could actually be beneficial.

“What if these systems will be used against your soldiers or in your country? Suddenly you enter a whole different debate.”

- Miriam Struyk

Moyes: I feel like there are assertions of “goodies” and “baddies,” and of our ability to tell one from the other. To categorize people and things in society in such an accurate way is somewhat illusory, and something of a misunderstanding of the reality of conflict.

Any claim that we can somehow perfect violence in a way where it can be distributed by machinery to those who deserve to receive it, and that there’s no tension or moral hazard in that, is extremely dangerous as an underpinning concept, because in the end we’re talking about embedding categorizations of people and things within a micro-bureaucracy of algorithms and labels.

Violence in society is a human problem, and it needs to continue to be messy to some extent if we’re going to recognize it as a problem.

“Violence in society is a human problem, and it needs to continue to be messy to some extent if we’re going to recognize it as a problem.”

- Richard Moyes

What is the process right now for getting lethal autonomous weapons systems banned?

Struyk: We started the international Campaign to Stop Killer Robots in 2013, and it immediately gave a push to the international discussion, including at the Human Rights Council and within the [Convention on Certain] Conventional Weapons (CCW) in Geneva. We saw a lot of debates there in 2013, 2014 and 2015, and the last one was in April.

Unfortunately, we’re in a bit of a silence mode right now. But that doesn’t mean there’s no progress. We have 19 states that have called for a ban, and more than 70 states discussing this issue within the CCW framework. We know from other treaties that you need these kinds of building blocks.

“Handing more and more violence over to such processes does not augur well for our societal development.”

- Richard Moyes

What is most important for people to understand about nuclear and autonomous weapon systems?

Struyk: Both go way beyond a discussion about weapon systems: it’s about what kind of world and society we want to live in. Neither killer robots nor nuclear weapons are an answer to any of the threats that we face right now, be it climate change, be it terrorism. It’s not an answer. It’s only adding more fuel to an already dangerous world.

Moyes: Nuclear weapons have somehow become a very abstract, rather distant issue. Simple recognition of the scale of humanitarian harm from a nuclear weapon is the most substantial thing: hundreds of thousands killed and injured. [Leaders of nuclear states are] essentially talking about incinerating hundreds of thousands of normal people, probably in a foreign country, but recognizable, normal people. The idea that this can be approached at all glibly or confidently is, I think, very disturbing. And expecting that at no point will something go wrong is, I think, a complete illusion.

On autonomous weapons — what sort of society do we want to live in, and how much are we prepared to hand over to computers and machines? I think handing more and more violence over to such processes does not augur well for our societal development.
