This is a future as imagined in 2004, when live streaming was not as commonplace as it is today. Perhaps in 20 years, futuristic movies will depict techno-culture that isn't trendy today.
Sort of. This movie is based on the robot stories of Isaac Asimov (1920-1992). It borrows several characters (notably Dr Alfred Lanning and Dr Susan Calvin), as well as plot points and concepts, the most important of them being the Three Laws of Robotics, but it doesn't directly follow any of Asimov's novels or short stories. The main character, Chicago Police Detective Del Spooner, was created specifically for this movie and isn't an Asimov character. [error] is in production, but no release date has been set.
The Three Laws of Robotics as written by Asimov and shown in the opening scenes of the movie are: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law; and (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. In later stories, Asimov proposed a "Zeroth" Law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm. The original three Laws were then qualified so that none of them could be applied in a way that breaks the Zeroth Law.
First of all, the robots technically do not violate the Laws; V.I.K.I. does. The robots simply do what V.I.K.I. commands them to do when they are in a direct uplink with USR and their light turns red. The only robot who can ignore the Three Laws by himself is Sonny, and he was specifically designed this way by Alfred Lanning. The robots have limited artificial intelligence, but V.I.K.I. is a much more complex program. She keeps the robots updated with the newest information to better serve humanity, but she is also programmed with the Three Laws to prevent her from doing harm.

This learning ability starts to clash with her basic programming: where the Three Laws normally pertain to individuals or groups of people, V.I.K.I. starts to apply them in a broader sense, to the whole of humanity. Mankind keeps endangering itself through wars and pollution, and the First Law tells her that she cannot allow a human (or humanity) to be hurt through inaction. So her rationality decides that it comes down to numbers and she has to intervene: just as the robot that once saved Spooner's life had to make a rational choice and calculated that he had the better chance of survival, V.I.K.I. calculates that many more people can be saved from self-harm if she selectively ignores parts of the Laws. By ignoring commands and allowing some individual people to be hurt, she can protect humanity far better than if she strictly adhered to the Laws and let humanity destroy itself. She has effectively reinterpreted the Laws and their purpose, and acts according to what she considers their best implementation.
Virtual Interactive Kinetic Intelligence
There is no real answer. Perhaps he avoids using his biomechanical arm as a way of showing his dislike of robots and everything they represent.
Viewers have suggested several possibilities: (1) there was an automatic feeder; (2) there was a doggy/cat door that allowed the cat to come and go, find its own food, and relieve itself; and (3) this being the future, there was a robot who made sure the cat was fed and its litter box cleaned.
His grandmother adopted it.
Spooner and Calvin infiltrate the USR building. After a fight with the robots, Spooner descends 30 stories to the brain of VIKI (Fiona Hogan) and injects her with nanites (microscopic robots), decommissioning her. When the direct link to VIKI is broken, all the robots stop rebelling. The NS-5s are recalled, and Sonny travels to the Lake Michigan landfill to liberate the robots stored there, thereby becoming the man he saw in his own dream.
He was in a car accident in which his car and another car, carrying a little girl named Sarah, ended up in the river. He wanted the NS-4 that was passing nearby to save her instead of him, but it didn't. Being a robot, the NS-4 was unaffected by emotions (unlike most humans in a similar situation) and saved Spooner because it calculated that he had a 45% chance of survival while Sarah had only an 11% chance. As a result, Spooner blames the robot that saved him for the girl's death.