Sunday, February 7, 2010

Can battlefield robots take the place of soldiers?

By Chris Bowlby
Robo Wars, BBC Radio 4

[Image: a small land robot at a German army base exhibition. Can battlefield land robots be made to obey the rules of war?]

Can war be fought by lots of well-behaved machines, making it "safer for humans"? That is the seductive vision, and hope, of those manufacturing and researching the future of military robotics.

With 8,000 robots already in use, they believe they can bring about a military revolution.

Most of the robots currently deployed on land deal with non-combat tasks such as bomb disposal - unlike lethal aerial drones.

But Bob Quinn, who works for the US subsidiary of the British robot manufacturer QinetiQ, says the future promises more armed robots on the battlefield, including driverless vehicles.

"The closer you are to being shot, the more you understand the value of having a remote weapons capability," he says.

LISTEN TO THE PROGRAMME
Robo Wars is on Radio 4 on Monday 8 February at 2000 GMT
Or listen via the BBC iPlayer

Anyone who has seen the Terminator films may find this vision scary. Quinn admits that, even among senior military figures, "science fiction movies caused a great deal of angst".

He stresses the need to make sure "that the weaponised robots only operate under the control of the soldier and never independently".

But the speed of modern warfare can make direct human control difficult, says Peter Singer, author of Wired for War.

Take the automated counter-artillery system deployed in Afghanistan.

"The human reaction time when there's an incoming canon shell is basically we can get to mid-curse word… [This] system reacts and shoots it down in mid-air. We are in the loop. We can turn the system off, we can turn it on, but our power really isn't true decision-making power. It's veto power now," Singer says.

Vegetarian vehicles

But if automated systems are taking decisions, how can we be sure they are hitting the right targets and obeying the laws of war?

US academic Patrick Lin was recently commissioned by the US military to study robot ethics.

[Image: a QinetiQ Talon robot. QinetiQ's Talon robots are used to counter improvised explosive devices.]

"When you talk about autonomous robots," he argues, "a natural response might be to programme them to be ethical. Isn't that what we do with our computers?"

A striking example of a robot in need of careful programming is a driverless vehicle developed by the Pentagon, called the EATR.

It can refuel itself on long journeys by scavenging for organic material - which raises the haunting spectre of a machine consuming corpses on the battlefield.

Its inventor, Dr Robert Finkelstein of Robotic Technology Inc, insists it will consume "organic material but mostly vegetarian."

"The robot can only do what it's programmed to do, it has a menu," he adds.

All this worries sceptics like Professor Noel Sharkey, co-founder of the International Committee for Robot Arms Control.

"You could train it all you want, give it all the ethical rules in the world. If the input to it isn't correct, it's no good whatsoever," he says. "Humans can be held accountable, machines can't."

If you cannot rely on a robot knowing what to target or distinguishing between enemy forces and innocent non-combatants, Patrick Lin suggests another solution.

"If there's an area of fighting that's so intense that you can assume that anyone there is a combatant," he argues, "then unleash the robots in that kind of scenario. Some people call that a kill box. Any target [in a kill box] is assumed to be a legitimate target."

No emotions

Other researchers suggest robots may avoid the faults of human soldiers.

"Robots that are programmed properly are less likely to make errors and kill non-combatants, innocent people, because they're not emotional, they won't be afraid, act irresponsibly in some situations," says Robert Finkelstein.

But Christopher Coker of the London School of Economics, an observer of wars past and present, disagrees.

"We should put our trust in the human factor," he says.

"Unfortunately the military in their reports often see the human factor as what they call the weakest link. I don't think it's the weakest link. I think it's the strongest link."

Computers will never be able to simulate the "warrior ethos", the mindset and ethical outlook of the professional soldier, he says.

The military revolution in robotics has already advanced rapidly in the air, where remotely piloted drone aircraft are now central to conflicts such as Afghanistan.

On the ground, use of robots has so far been more limited.

Yet given the political and popular concern about casualties among Nato forces, robot manufacturer Bob Quinn's sales pitch is likely to be persuasive.

"Let's keep our guys safe, and kill the enemy. Unfortunately, in warfare that's the situation you're in."
