When I was a kid I played a game called “Robots” with my little sister. In the game I chased my sister while pretending to be a robot, repeating the words “you cannot destroy me” in a robotic voice. Yes, it was a rather lame game, but it now appears that the South Korean Government is taking steps to stop this scenario from playing out in real life.
A 12-member task force, consisting of top lawyers, doctors and scientists, has been set up by the South Korean Commerce Ministry to develop a code of ethics for robots by the end of the year.
“We expect the day will soon come when intellectual robots can act upon their own decisions. So we are presenting this as an ethical guideline on the role and ability of robots,” the South Korean Commerce Ministry said.
Commerce Ministry officials also admitted that the task force may borrow some ideas from Isaac Asimov’s Three Laws of Robotics, which are:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The South Korean Government predicts that there will be a robot in every South Korean household within ten years. Like many developed countries, South Korea is facing an aging population, and the idea is that these robots will help with household chores and generally make life more comfortable for the elderly. With fewer young people around, robots may also be used to patrol the border with troublesome North Korea, and that’s where things start to get scary.
Hasn’t the South Korean Government seen The Matrix or The Terminator? Doesn’t it know that it’s not a good idea to give a robot a brain and a gun?
If South Korea insists on creating an army of robot warriors, I just hope that the section in the code of ethics about “not exterminating humans” is well and truly stamped on every robot soldier’s brain.