Tuesday, June 4, 2013

Remember...robots feel no pain and are stupid


io9...

Dubbed “doodlebugs” by the Allies, the Goliath was a remote-controlled demolition carrier.

It was introduced by the Germans in 1942, who used the device to carry a 165-pound bomb to targets that typically included tanks, dense infantry formations, bridges, and buildings. The vehicles were wire-controlled and detonated on reaching their targets. Over 4,600 of these devices were produced, including a larger version that could carry a 220-pound bomb. Unfortunately for the Germans, they were slow, hard to control, and their payloads were far too small. The idea was clearly ahead of its time — a kind of precursor to modern robots — but the technology was simply not advanced enough.


Now there is this...




"UN official calls for ban on ‘mechanical slaughter’"

by Nick Cumming-Bruce

May 30, 2013

THE GLOBE AND MAIL

A UN expert called for a global moratorium on the development and use of armed robots that can select and kill targets without human command.

“War without reflection is mechanical slaughter,” said Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions.

“A decision to allow machines to be deployed to kill human beings worldwide – whatever weapons they use – deserves a collective pause,” he told the Human Rights Council in Geneva on Thursday.

No countries use such weapons, but the technology is available or soon will be, Mr. Heyns told the Council.

The United States, Britain, Israel, and South Korea have already deployed robot sentries, which are among technologies seen as precursors to fully autonomous systems.

Mr. Heyns urged the Council to set up a high-level panel to report within a year on advances in the development of “lethal autonomous robotics,” to assess whether existing international laws are adequate for controlling their use.

Preparations to introduce armed robots raise “far-reaching concerns about the protection of life during war and peace. This includes questions of whether robots will make it easier for states to go to war,” Mr. Heyns said.

Some states active in developing such weapons have committed to not deploy them for the foreseeable future, Mr. Heyns acknowledged, but “it is clear that very strong forces – including technology and budgets – are pushing in the opposite direction,” he said.

His initiative comes as non-governmental organizations and human-rights groups are campaigning to ban fully autonomous weapons before they are deployed, in the same way that blinding laser weapons were pre-emptively banned. Discussions are under way with a number of governments that may be willing to take the lead in drafting a treaty to outlaw the weapons, Steve Goose, arms division director of Human Rights Watch, told journalists in Geneva this week.

Supporters of the robots say they offer a number of advantages: they think faster than humans and are not subject to fear, panic, a desire for revenge, or other emotions that can cloud human judgment. A report by Human Rights Watch and the Harvard Law School cites a U.S. Air Force assessment that “by 2030 machine capabilities will have increased to the point that humans have become the weakest component in a wide array of systems and processes.”

Human-rights groups dispute the ability of robots to meet requirements of international law, including the ability to distinguish between civilians and combatants or to assess proportionality – whether the likely harm to civilians during a military action exceeds the military advantage gained by it. Moreover, in the event a killer robot breaches international laws, it is unclear who could be held responsible or punished.

“It is possible to halt the slide toward full autonomy in weaponry before moral and legal boundaries are crossed but only if we start to draw the line now,” Mr. Goose of Human Rights Watch said in a statement this week.


Thanks to Lance.
