"Ethics Dilemma in Killer Bots"
The Australian (01/16/07) P. 29; Argy, Philip
Guard robots being deployed on the northern border of South Korea can
fire on human targets without any direct human command, raising
important ethical questions. Each
Intelligent Surveillance and Security Guard Robot will be equipped with
a daylight camera capable of identifying targets within a 4-kilometer
radius, and an infra-red night vision camera that has a range of 2
kilometers.
While humans can use a joystick and touchscreen to control the robots,
the robots are programmed to respond autonomously when an intruder does
not provide a correct password. The robot's responses include sounding an
alarm, using non-lethal force, or firing a machine gun or rifle; these
would be the world's first robots with such capabilities.
While the manufacturer says the robots are superior to human guards
because they are immune to weather conditions and fatigue, many point
out that a human soldier could utilize discretion and understand the
consequences of his actions. Australian Computer Society's Mike Bowern
expresses concerns over the potential for "software and hardware
defects" to "influence the robot's conduct." He also points out that
little is known of the ethical considerations taken by the robots'
designers, or any code they must follow, since Korea doesn't have an
independent professional association such as the ACM or the ACS, and the
Korean Ministry of Information and Communication seems to place greater
importance on technical aspects than it does on professional or ethical
concerns.
Many worry that these robots could eventually be sold to private
customers. Computer ethicist James Moor points out the robots could not
be held legally or morally responsible for their actions, leaving such
responsibility up to technology professionals.
<http://australianit.news.com.au/articles/0,7204,21064361%5E15309%5E%5Enbv%5E,00.html>