
Tuesday, September 4, 2018

Killer Robots

Glancing at the headlines at the end of a long weekend, one caught my attention about the US and Russia blocking a ban on killer robots.

The rest of the story is that the US and Russia were also joined by Israel, South Korea, and Australia in opposing a treaty toward this end at the UN.

The outrage machine is kicking into high gear, implying that anyone not in favor of more entangling alliances, even without a clear definition of a killer robot, must insanely be in favor of killer robots.

This is one of a couple of common leftist arguments. Usually it goes something like this: If you're not in favor of a government doing something, then you must inhumanly be opposed to the thing itself. I've seen people spin themselves up so far as to claim they don't even want to know someone who is “not in favor of education.” As if.

In this case the argument is also run in reverse, a known technique when a government body bans something: If you're not in favor of the ban, then you must support the thing being banned. No nuance whatsoever. It can't be a matter of methodology. It can't be a matter of authority. No, you're just a bad person or a bad country because you're now on record supporting killer robots. Not really.

The only supposed consensus on this issue is among small countries that either don't have or can't match larger countries' development of autonomous weapons. The UN was tailor-made for this kind of “consensus” propaganda.

The question of defining a killer robot or autonomous weapon is important. First of all, what is a weapon? How about this for a definition: A weapon is anything that by use of force diminishes the capacity of another. (Siri is a bit narrower and claims a weapon is “used for inflicting bodily harm or physical damage.”)

An autonomous weapon, then, is any weapon that exerts its force autonomously—independent of human interaction or input at the moment of decision or execution. (A working definition for a blog post may vary from a legal definition for an international treaty.)

By that definition, even a land mine is an autonomous weapon, or killer robot, if you will, albeit a primitive one. I would argue that land mines are worse than killer robots. At least with killer robots you can easily put them away when hostilities have ceased. After a conventional war, land mines are still active, and they are still a problem in parts of the world today. To that point, 133 countries have adopted the Ottawa Treaty, which bans the production, use, transfer, and stockpiling of land mines.

Addressing the question of killer robots won't be as straightforward. A land mine is a device with a single purpose. Anything software-based by definition has multiple potential uses. What about the people who modify a commercially available consumer drone into a weapon? Whom would an international treaty implicate then?

In theory, the same robot that could aggressively kill could also be used for defensive purposes against a variety of threats. Further, some of the same functions that could be used to kill could also be used to mitigate potential disasters, natural and unnatural.

What of the Isaac Asimov-style dilemmas where robots are faced with two bad choices? For instance, what if a robot were designed to defend low-flying aircraft from the threat of cables strung across a valley, but cutting the cable would also harm those riding in a cable car? It's a rough and unlikely example, but it makes a valid point. Even if not designed as a killer robot, any robot assigned to that task could take action that could be deemed the action of a killer robot. Robots don't have motives, just opportunities, right?

Not only do some robots have software, but these days some robots are all software. Robots don't have to be physical. When algorithms run on their own, they're often called bots, and they could be set to do damage, too. (Google's search algorithm depends on data that its bot goes out and finds autonomously, and for a long time Google joked that killer robots should have exceptions.) What if someone wrote software to hack a nuclear power plant or the electric grid?

This only scratches the surface of a host of thorny and complicated issues that arise when it comes to weaponized robots. I am totally fine with the US, Israel, South Korea, and others taking their time to be fully clear about the nature of any commitments made on this matter.

Before getting roped into the urgency manufactured around the issue, take the time to think it and its implications through, and realize there's more to national security and international negotiations than emotional headline writing.
