When Isaac Asimov first specified the Three Laws of Robotics in 1942, at the height of World War II, he stated them as follows:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
At the time he was probably just trying to write science fiction that would attract readers, but you have to wonder whether he was also telling the world that humanity needs a set of laws of its own, written as nations fought against men like Hitler.
Now, once again, the world faces a modern-day Hitler, this time in the guise of an Asian despot called Than Shwe (nicknamed Golden Flea, a pun on his name). Nearly two weeks after the devastation left by Cyclone Nargis, the world remains deadlocked on how to deal with the military junta in Burma, which is denying access to the aid workers who could efficiently coordinate relief for the cyclone's victims. Several nations have asked the UN to invoke the Responsibility to Protect, while others oppose it as a violation of national sovereignty. And while the world argues, very little aid is reaching the victims, who are dying of disease and hunger after being drastically weakened by Nargis.
The world should take a cue from Asimov's Three Laws and create its own three laws that apply to humanity as a whole. The first of those laws should be:
1. The world may not harm innocents or, through inaction, allow innocents to come to harm.