I, Robot - Teacher's Guide
I, Robot
by Isaac Asimov
plot summary
“Introduction”
2057: Earth. An unnamed reporter for “Interplanetary Press” prepares to interview Susan Calvin, a seventy-five-year-old
“Robopsychologist” who works for U.S. Robots and Mechanical Men, Inc. (U.S. Robots). Accused of being emotionless like
a robot, Calvin argues that robots are more than mechanical parts: “They’re a cleaner better breed than we [humans] are.”
Calvin reminisces about early opposition to robots from labor unions (that were worried about competition) and religious
groups (that were worried about sacrilege). Against these anti-robot arguments she holds the memory of an early robot
model named Robbie, which was sold in 1996 as a nursemaid for a little girl. Calvin begins to tell the story.
“Robbie”
1998: Earth. Robbie the Robot plays hide-and-go-seek outdoors with his charge, nine-year-old Gloria Weston. Robbie
lets her win. Gloria is a demanding but charming girl who loves Robbie. At Robbie’s gestured urging (he cannot speak),
she begins to tell him his favorite story, Cinderella, but her mother, Grace, calls them to come inside. Grace does not trust
Robbie with her daughter and badgers her husband, George, to get rid of him until George finally
gives in. Gloria is heartbroken when her parents take away Robbie. In an attempt to distract her,
her parents decide to take Gloria on a trip to New York, hoping the excitement of the city will take
her mind off Robbie. While in New York, they tour the U.S. Robots factory. Gloria spies Robbie, one
of “the robots creating more robots.” She runs toward him, right into the path of a moving tractor.
Before anyone else has time to react, Robbie snatches Gloria out of harm’s way. Because Robbie
has saved Gloria’s life, her mother grudgingly allows the robot to return to the family. At this point,
the frame story (Susan Calvin talking to the reporter) resumes. Calvin tells us that robots were
banned from Earth between 2003 and 2007. To ensure the company’s survival, U.S. Robots started
developing mining models for other planets. Calvin recalls two troubleshooters, Mike Donovan and
Gregory Powell, who worked with the experimental designs in “the teens.”
“Runaround”
2015: Mercury. Gregory Powell and Mike Donovan have sent robot SPD-13, “Speedy,” on a quest
for selenium, a necessary ingredient for their life support machinery. Selenium is somewhat
dangerous to Speedy, and when the robot doesn’t return, Powell and Donovan decide they must go
retrieve him from the surface. They find Speedy circling the selenium pool and gibbering. Speedy
has gone crazy because two of the fundamental laws of robotics have come into conflict. Powell
ordered Speedy to get the selenium (Second Law: always obey human orders). But since Powell
wasn’t very insistent, the Second Law didn’t quite overwhelm the Third Law (self-protection).
Caught between conflicting directives, Speedy hovers around the selenium pool, not quite able to
get close enough to harm himself, but not able to leave the site because he has been ordered to go
to the pool. To remove Speedy’s conflict, Powell walks towards Speedy, purposely going too far from
safety for him to be able to return without Speedy’s help. Speedy sees him, causing the First Law
to kick in (do not harm or allow a human to come to harm through inaction). Speedy saves Powell
and they send the robot back for the selenium, this time phrasing the Second Law order firmly
enough to override any Third Law impulse toward self-preservation. Speedy returns with the
selenium and the pair anticipate their next work assignment at the space stations.
“Reason”
2015: Space Station. QT-1 (Cutie), a new model, refuses to believe that inferior humans created
superior robots. Cutie decides that an Energy Converter has created robots, and that he is its
Prophet. Cutie’s religion spreads to the other robots: they obey Cutie, but not Powell or Donovan.
Meanwhile, a potentially dangerous electron storm is approaching. If the storm throws the energy
beam sent from the station to Earth out of focus, it could cause widespread destruction on Earth.
Cutie will not let Powell and Donovan make adjustments. To Cutie, humans are obsolete.
Donovan and Powell try to argue with Cutie, but all their attempts fail. They then try to prove to
Cutie that humans build robots by building a robot themselves. Cutie argues that the pair only
assembled the robot; they did not create it. The electron storm comes and, luckily, Cutie keeps
the beam focused because he believes he serves the Converter by keeping its instrumentation in
balance (i.e., in focus). Powell points out that the Second Law (always obey human orders) requires
Cutie to obey. No matter what robots believe to be the ultimate source of command, they will still
do their duties.
“Liar!”
2021: Earth. U.S. Robots accidentally creates RB-34 (Herbie), a robot that can read minds. Susan
Calvin, Alfred Lanning, Milton Ashe, and Peter Bogert are assigned to find out how the mind
reading has changed the robot: “what harm it has done to his ordinary RB [robot] properties.”
Herbie has figured out that Calvin loves Ashe but does not feel worthy of being loved in return.
Herbie assures her that Ashe loves her and that a woman Ashe brought to visit was only a cousin.
When Bogert consults Herbie, the robot tells him that the director, Lanning, has retired and has
put Bogert in charge. When Lanning questions some of Bogert’s calculations, Bogert informs an
incredulous Lanning that he is no longer the boss. Later, during a conversation with Ashe, Calvin
finds out that the girl isn’t Ashe’s cousin but his fiancée. Calvin realizes that Herbie has been lying
to them because it was following the First Law of Robotics (do not harm a human). By telling each
person what s/he wanted to hear, the robot was trying not to hurt the humans. Calvin asks the
robot what went wrong in its assembly that made it able to read minds. This throws the robot into
an impossible conflict. The robot can’t answer because it thinks it will make Lanning and Ashe feel
bad to know that a robot figured out something that the scientists couldn’t. On the other hand,
it must answer because Lanning and Ashe want to know the answer (Second Law: obey human
commands). Since either action will cause harm to humans, the robot collapses. The frame story
resumes again with Calvin sitting behind her desk with her face “white and cold.”
“Escape!”
2030: Earth. A competing robot company, Consolidated Robots, asks U.S. Robots to solve a
problem that fried their own “Super-Thinker.” The problem is how to build a hyperspace drive for
humans. Susan Calvin thinks that the reason Consolidated is having problems is because building
the hyperspace drive involves harm to humans, it brings the First Law (do not harm humans)
and Second Law (obey human orders) into conflict. U.S. Robot’s own super-thinker, “The Brain,”
however, is equipped with a personality. Calvin thinks that The Brain will be able to handle the
dilemma because having a personality–emotional circuitry–makes it more “resilient.” But when the
scientists feed the problem to The Brain, it doesn’t even acknowledge the existence of a problem,
and promises to build the ship. The story jumps ahead to Powell and Donovan inspecting the ship
two months later. While they are aboard, the ship takes off, and as it makes an interstellar jump,
each man has a near-death experience. The men return from beyond the galaxy, and Calvin learns that,
during their time in hyperspace, the two were technically dead (matter turns to energy at light
speed). Why was The Brain able to build the ship if it caused human death? It turns out that Calvin
had adjusted The Brain’s controls to play down the significance of death for the robot. Since death
on the ship was temporary, The Brain, unlike Consolidated’s “Super Thinker,” was able to ignore the
harm aspect (First Law) of the order and build the ship (Second Law).
“Evidence”
2032: Earth. In the frame story, Calvin discusses how Earth’s political structure changed from
individual nations to large “Regions.” She recalls a man, Stephen Byerley, who ran for mayor.
The story begins with Francis Quinn, a politician, trying to convince Lanning, Director of
U.S. Robots, to keep Byerley from political office because Byerley is a robot. Byerley denies this
but lets Quinn base his campaign on testing whether or not he is a robot. Byerley returns home and
tells John, an old, crippled man who lives with him and whom he calls “teacher,” about Quinn’s
strategy. Once informed that Byerley might be a robot, Fundamentalists begin huge protests outside
Byerley’s home. He goes outside to talk to them, and a man challenges Byerley to hit him. Byerley
obliges, and Calvin pronounces him a human, because the First Law (do not harm a human) would have
stopped him if he were a robot. Later, Calvin reveals to Byerley that she suspects he really is
a robot. She recalls that a biophysicist named Byerley was horribly crippled in an accident. Calvin
theorizes that the real Byerley is actually the old cripple, “John,” and that he built a new body
around a positronic brain he’d acquired. Byerley doesn’t confess but does admit that he spread the
rumor that if he were really a robot, he couldn’t hit a human being. Calvin suggests to him that
the human Byerley hit wasn’t really a human but another robot, which let him avoid any conflict
with the First Law. She admits later that she doesn’t know whether or not he really was human.
The “science” part of Asimov’s science fiction shows in his early commitment to making the science
in his stories realistic (or at least plausible). Throughout the 1960s and 1970s he primarily
wrote non-fiction science works that covered a dazzling number of subjects, including astronomy,
earth sciences, physics, and biology, among others.
Calling Asimov a prolific writer would be an understatement. Through the years he tried his hand
at literary criticism (from Shakespeare to Gilbert and Sullivan), humor (mainly limericks), children’s
literature, autobiography, and editing. In addition, he managed to find time to write histories of
Europe, North America, Greece, Egypt, England, and Earth. He wrote more than 1,600 essays and
published at least 450 books. Famously, Asimov had at least one book published in each of the ten
major Dewey Decimal library classifications. As he said in an interview, “I wrote everything I could
think of.”
Asimov won every major science fiction award during his life. He won seven Hugo awards, the first
in 1963 and the last in 1995. He was also honored with two Nebula awards. In 1986, the Science
Fiction Writers of America named him a Grand Master, and eleven years later he was inducted into
the Science Fiction and Fantasy Hall of Fame.
Asimov died in 1992 but his work lives on through new generations of readers, writers, and
scientists. Rather than being outdated, his writing has proved prophetic. Reading his stories about
robots in 1950, we would have thought that his reach exceeded his grasp. As advances in robotics
and brain imaging have brought the idea of a human-like robot closer, we recognize that the day
may come when we just might see Robbie tending to our own children.
encourage your students to find out how many of the three Laws are represented in the
systems they have identified, and in what order. Does the U.S. legal system, for example,
require obedience over self-preservation? If so, why? If not, why not?
11. Through the 1940s, Asimov published each story in I, Robot as an individual story. In 1950, he
collected the stories and published them together as a book. What clues can your students
find in the book that show that the stories have been joined together? Where are the seams?
What techniques did Asimov use to make the stories seem like one whole book? Where does
this reweaving work well? Poorly? Why does it work in some places but not others?
12. This book begins with a story about a robot that is dominated by a little child (“Robbie”) and
ends with a story in which robots control every facet of human life (“The Evitable Conflict”).
How likely do your students find a situation where humans would give up control of their
worlds to machines? Would we give up the ability to own things? To determine our own
movements? To what degree do they think we already have? What signs are there that our
lives already have become controlled by machines? That we control our machines?
13. Asimov admits in his Memoir that, in his early writing, he was most comfortable with
European-American characters. What signs of discomfort can your students detect when
he writes non-European characters like Ching Hso-lin or Lincoln Ngoma (“The Evitable
Conflict”)? Put another way, would Asimov have written any differently if Hso-lin (or others)
had been Powell or Donovan? For example, would he have noted that Powell spoke in “precise
English” as with Hso-lin, or that Donovan’s English was “colloquial and mouth-filling,” as
with Ngoma?
14. Although most readers focus on the Three Laws of Robotics as the animating principle for
the robot stories, there is another factor at work: emotional attachment. Asimov said, “Back
in 1939, I realized robots were lovable.” What is lovable about the various robots in the stories?
Which one was the most lovable? Why? Which was least lovable? Why? How does Asimov
manage to make a hunk of metal lovable (or unlovable)?
15. How would the collection have changed if it were titled Mind and Iron (as Asimov wanted
to call it originally)? What does the title, I, Robot, communicate that the title, Mind and Iron,
doesn’t? Similarly, how would the first story change if it were titled “Strange Playfellow”
instead of “Robbie”? What does “Strange Playfellow” set up that “Robbie” doesn’t? Come up
with other titles that Asimov might have considered for the individual stories and the whole
collection.
16. I, Robot has been turned into a major motion picture starring Will Smith. How does the
movie compare with your book-reading experience? What do you think of the adjustments
made and liberties taken when converting this collection of stories to one seamless film
adaptation?
suggested activities
1. Have your students invent their own philosophical puzzle involving the Three Laws of Robotics
using Asimov’s human characters, but new robots. They might, for example, imagine a story
where Powell and Donovan meet the Star Wars character R2-D2, who is pulled in three directions
by an order to destroy himself, the knowledge that destroying himself will kill a human, and the
knowledge that not destroying himself will kill another human.
2. Asimov was deathly afraid of flying, but many of his stories involve travel across Earth, to
other planets, and to distant galaxies. Have your students choose something they fear, and
encourage them to write a science fiction story that involves that fear indirectly. For example,
if someone is afraid of heights, s/he might write about a society that lives in the treetops.
If someone is afraid of spiders, s/he might write about a society based on the pattern of
a spider’s web. After they write the story, have them consider how fear factored into their
composition. Did they tend to write less about what they were afraid of? More? Did they write
about their fear less directly? Return to Asimov’s stories and see if you can identify the marks
of fear when he writes about flight. (He was also afraid of other things that they might look for
in a biography, or his memoir.)
3. Let your students pick a story by another author that involves robots and compare it to
Asimov’s. What similar concerns do they have? How human are the robots? What contrasts do
they find between themes that interest Asimov and the other author?
4. What role do machines play in our lives today? Have your students keep a journal that lists
every machine that helps them live their lives. A list might start, for example, with the alarm
clock that wakes them up, the refrigerator that keeps the milk cold, the water heater that keeps
the water hot, the computer that transmits email and stores their homework, the vehicle that
drives them to school, the phones that deliver messages and pictures, and so on. What would
life be like without these machines? In discussion, or writing, have them imagine a world
where one by one, all these machines vanish. How would we eat, communicate, travel, etc.?
Turning what they learn to the past, have your students research the history of a machine that
has become indispensable to us today. What did people do before a particular machine was
invented (e.g., clocks)? What changes happened when the machine was invented? Perfected?
Turning toward the future, ask your students to think of machines that have yet to be invented.
What things will become necessary to future generations that we do not have?