Every good geek learns, at some point in his or her Jedi training, Asimov's Three Laws of Robotics.
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
It's also common knowledge that Asimov's stories often served as "proving grounds" for the laws (there's a longish and reasonably geeky discussion at Wikipedia that examines these trials). He experimented with various augmentations of, and alterations to, their structure over the years.
It's a safe bet, though, that he never envisioned any laws like the ones put forward by the Something Awful crew today.
In all, the SA goons have come up with not three, but thirty laws of robotics. Asimov's spinning in his grave, but frankly, my sides are splitting. Anyone up for a competition? Let's hear it: what other laws of robotics did Asimov "forget"?