Business Unusual

Should a machine be allowed to use lethal force?

Micah Johnson is considered to be the first person killed by US police using a remote-controlled device. The robot was a bomb disposal unit that police loaded with explosives and detonated near him.

It may be the first time it was done; it will not be the last. The question is under what circumstances this could be allowed and who should make the rules.

The much-anticipated 4th Industrial Revolution is often said to threaten the loss of jobs; it is fair to say it will also lead to the loss of life.

Bizarrely, the morality of using robots appears as much a part of science fiction as anything else, most famously Isaac Asimov's Three Laws from his short story Runaround:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

That did not preclude robots from using lethal force, though. Early versions have existed for decades to protect ships from missile attacks. Lethal autonomous weapons, ironically known as LAWS, are far more accurate and dependable than humans at shooting down incoming missiles. Israel uses the Iron Dome system to stop rockets fired at populated areas, and South Korea has remotely operated sentry guns along its border with North Korea. The sentry guns first broadcast a warning and require a human operator to actually fire, although they can operate on their own.

This illustrates the three Ds of robotics, the work robots are ideally suited for:

  • Dangerous situations
  • Dirty operations
  • Dull repetitive work

The word itself comes from the Slavic languages, in which robota was the unpaid labour serfs were required to perform for their landlords for part of the year. In many ways it fits the work humanity expects from its mechanical descendants.

There are two considerations for when robots would be allowed to use lethal force against humans.

Would the technology be good enough to consider all the options a trained and responsible human would?

Could humanity ethically allow a machine to take a life?

The technology will improve enough to satisfy the first, although it will be some time before it can be done with confidence. The Tesla crash in May appears to have been the result of the car's cameras not being able to see the truck it crashed into.

The second is something even humans struggle with, let alone knowing how to get a machine to determine it.

To illustrate the basic issue, consider the trolley problem: four scenarios that ask what you would do to avert someone being killed, including one in which actively killing one person would spare you from killing others.

Driverless car makers are wrestling with what a car should do if placed in a situation in which killing you would save others.
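To make concrete how uncomfortable this is, here is a minimal, entirely hypothetical sketch in Python of a "minimise casualties" rule a car could apply. No manufacturer has published logic like this; the names and numbers are invented for illustration. Written as code, the ethical question collapses into a single line that decides whether the occupant lives.

    # Hypothetical sketch only: a naive utilitarian crash rule.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Outcome:
        description: str
        expected_deaths: int   # estimated fatalities if this option is taken
        occupant_dies: bool    # would the car's own occupant die?

    def choose_action(options: List[Outcome]) -> Outcome:
        # Pick whichever option is expected to kill the fewest people,
        # even if that means sacrificing the car's occupant.
        return min(options, key=lambda o: o.expected_deaths)

    if __name__ == "__main__":
        options = [
            Outcome("stay on course into the pedestrians", expected_deaths=3, occupant_dies=False),
            Outcome("swerve into the barrier", expected_deaths=1, occupant_dies=True),
        ]
        print(choose_action(options).description)  # prints: swerve into the barrier

The sketch works, and that is the problem: the hard part is not writing the rule but agreeing that anyone should be allowed to.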

Then there is the argument that, with robots already being used for safety work and surveillance, they would be best placed to react to a situation, especially if others are in harm's way. The evolution of military surveillance drones into attack drones followed this path. Some argue that if poor weather or signal jamming might affect the remote control of a drone, provision should exist to allow the mission to continue autonomously.

With each next step seemingly so close to the last, we are likely to have the ability to let machines kill before we have resolved whether we should be willing to let them.

The Campaign to Stop Killer Robots hopes to prevent it ever getting to that point. At a meeting last year, over 1,000 experts - including Stephen Hawking and Elon Musk - signed a letter warning that machines should not be allowed to use lethal force.

But perhaps there are scenarios in which robots should be allowed to overrule an operator. Consider an aircraft whose pilot commands a manoeuvre that would cause it to crash - Germanwings flight 9525 was intentionally flown into a mountain in a suicide/murder that killed all 144 passengers and six crew. Should aircraft allow operators to do this? In an attempt to make flying safer, pilots are required to do so little that some argue they would struggle to deal with an emergency when one occurs.

These are serious considerations, and they are not decades away. Citizens could defer to experts, although it would be better if more of us were informed and offered a view of the future we would like for ourselves and our children.

Read More

The world in 10 years' time
The six megatrends the World Economic Forum believes will happen in the next decade.

A code of ethics for code
Almost everything runs code now; we assume it is good code, but how would you know?

MailChimp - odds are you received an email from them in the last 7 days
Email is 45 years old. MailChimp is 15 and has made this "old" tech big business.

When you find out this is true, it will make you angry
There is no shortage of news stories, but determining which are true is no simple matter.

Whisky - from nasty medicine to the drop of choice for movers and shakers
What began as a search for immortality has turned into an icon of success.

Even death is being disrupted
Death is final, but changes in cultural norms, technology and the desire to make a quick buck are changing that.