
Tuesday, December 02, 2008

Robot soldiers to get a kinder software side

WASHINGTON: The US military is planning to build robot soldiers that would not be able to commit war crimes.

The army and the navy have hired experts in the ethics of building machines to prevent the creation of an amoral Terminator-style killing machine. By 2010 the US will have spent $US4 billion ($6.1 billion) on research into "autonomous systems", the military jargon for robots, on the basis that they would not succumb to the fear or desire for vengeance that afflicts front-line soldiers.

A British robotics expert has been recruited by the navy to advise on building robots that would not violate the Geneva Conventions.

Colin Allen, a philosopher of science at Indiana University, has just published a book summarising his views, Moral Machines: Teaching Robots Right from Wrong.

He said: "The question they want answered is whether we can build automated weapons that would conform to the laws of war. Can we use ethical theory to help design these machines?"

Pentagon chiefs are concerned by studies of combat stress in Iraq showing that a high proportion of front-line troops supported torture and retribution against enemy combatants. Ronald Arkin, a computer scientist at Georgia Tech who is working on software for the army, has written a report concluding that robots, while not "perfectly ethical in the battlefield", can "perform more ethically than human soldiers".

He said that robots "do not need to protect themselves" and "they can be designed without emotions that cloud their judgment or result in anger and frustration with ongoing battlefield events".

Airborne drones are already used in Iraq and Afghanistan to launch air strikes, and robotic vehicles are used to disable roadside bombs and other improvised explosive devices.

But this generation of robots is entirely remotely operated by humans. Researchers are now working on "soldier bots" that could identify targets and distinguish between enemy forces and soft targets such as ambulances or civilians. Their software would be embedded with rules of engagement.
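The article gives no detail of what "embedded rules of engagement" would actually look like in code. Purely as an illustration, here is a minimal Python sketch of the idea; every name in it (Contact, TargetClass, engagement_permitted, the confidence threshold) is a hypothetical assumption, not anything drawn from the research described.

```python
from dataclasses import dataclass
from enum import Enum, auto


class TargetClass(Enum):
    ENEMY_COMBATANT = auto()
    CIVILIAN = auto()
    MEDICAL = auto()   # e.g. ambulances, protected under the Geneva Conventions
    UNKNOWN = auto()


@dataclass
class Contact:
    classification: TargetClass
    confidence: float        # classifier confidence in [0.0, 1.0]
    is_surrendering: bool


def engagement_permitted(contact: Contact, min_confidence: float = 0.95) -> bool:
    """Return True only if every rule-of-engagement check passes.

    Illustrative only: real rules of engagement are far more complex
    and would never reduce to a handful of boolean checks.
    """
    # Protected categories (civilians, medical units) are never valid targets.
    if contact.classification in (TargetClass.CIVILIAN, TargetClass.MEDICAL):
        return False
    # Unknown or low-confidence identifications default to "do not engage".
    if contact.classification is TargetClass.UNKNOWN:
        return False
    if contact.confidence < min_confidence:
        return False
    # A combatant who has surrendered is hors de combat and protected.
    if contact.is_surrendering:
        return False
    return True


if __name__ == "__main__":
    ambulance = Contact(TargetClass.MEDICAL, confidence=0.99, is_surrendering=False)
    print(engagement_permitted(ambulance))  # False: medical units are protected
```

The one design choice worth noting is the default: every ambiguous case (unknown classification, low confidence, possible surrender) resolves to non-engagement, which is the kind of conservative behaviour Arkin argues a machine can apply consistently where a stressed human soldier might not.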

Dr Allen applauded the decision. "It's time we started thinking about the issues of how to take ethical theory and build it into the software that will ensure robots act correctly rather than wait until it's too late."

.. from SMH
