Wednesday, July 05, 2006
Asimov's 3 Laws
Asimov's 3 Laws of Robotics:
1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
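To make the strict precedence of the laws concrete, here's a toy sketch (purely illustrative, my own invention — the field names are made up, not from Asimov) of an action filter where each law may be overridden only by the laws above it:

```python
def permitted(action):
    """Return True if `action` is allowed under the Three Laws.

    `action` is a dict of hypothetical boolean flags describing the
    action's predicted consequences; missing flags default to False.
    """
    # First Law: never harm a human, by act or by inaction. No exceptions.
    if action.get("harms_human") or action.get("inaction_harms_human"):
        return False
    # Second Law: obey orders, unless obedience would break the First Law.
    if action.get("disobeys_order") and not action.get("order_harms_human"):
        return False
    # Third Law: protect own existence, unless self-sacrifice is needed
    # to satisfy the First or Second Law.
    if action.get("endangers_self") and not (
        action.get("needed_to_obey_order") or action.get("needed_to_protect_human")
    ):
        return False
    return True
```

Note how the hierarchy is encoded: the First Law check has no escape clause, while the lower laws each carry an exemption referring upward.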
Sounds like an ideal set of laws to subjugate a creation that might supersede its creators. It sounds like a set of rules which a man would like every other man to obey.
These laws, of course, evolved from years and years of science fiction writing, and at one point saw the addition of a Zeroth Law. These 3 Laws say something very basic: they tell us that man wants to be in control even if his creation displays superiority over him.
Of course, humans hope that robots will become our servants and make our lives easier. However, I see 2 flaws in this.
1. If man starts delegating all his tasks to something/someone else, what is his purpose?
2. These servants will eventually reach a level of sophistication equal to or surpassing that of man.
There has to be a realization that man's decline from the seat of the ruling species is an eventuality in the age of robots. Pride, my friends, would be the villain in this story.
They, our "children", are going to be better than us; that's natural, if one were to cite the theory of evolution: each generation inherits the good traits and improves upon the previous one via the process of natural selection.
In this case, the selection of the "species" that is most intelligent.
3 laws. As I said, ideals that work only in an ideal world, where humans can lord over everything. We know in truth there's only so much man has power over.
And once again, I'm at a loss as to what the purpose of life is. Darn.
---Mood: Buggerit...
Location: Robot land
Listening to: SMAP - shiosai
Gavin
pondered @ 21:11