Does not compute

From Wikipedia, the free encyclopedia

"Does not compute", and variations on it, was a phrase often spoken by computers, robots and other artificial intelligences in science fiction works of the 1960s to 1980s. The phrase indicated cognitive dissonance on the part of the device, conventionally leading to its self-destruction.

According to The Random House Historical Dictionary of American Slang, the phrase was first used as a catchphrase by the television show My Living Doll in 1964. It was then popularized by Lost in Space (1965), along with "Affirmative!", "Warning! Warning!" and "Danger, Will Robinson!"

The phrase was often present in stories which carried a theme of the superiority of human emotion over cold machine logic. Despite computers' superior ability at calculation and information processing, their lack of emotion or randomness made them unable to resolve cognitive dissonance, which they expressed with the phrase "Does not compute." It was usually the computer's response to information which it had received but could not reconcile with other information it already held to be true. The trope also depicted the machine's fixed programming as a fundamental flaw: unable to adapt to circumstances beyond the scope of its programming, the machine compared unfavorably with humans, whose minds have no such limitation and can adjust to unforeseen events. In some cases, such as the finale of Logan's Run and several episodes of the original series of Star Trek (e.g. "I, Mudd", "Requiem for Methuselah", "The Return of the Archons" and "The Changeling"), presenting a computer or robot with such a contradiction caused it to violently self-destruct.

Such depictions reflected the state of real computers at the time, which usually lacked the resources to provide a friendly user interface. It was not uncommon for a computer to respond to bad input with an error message roughly as unhelpful as "does not compute" (though actual self-destruction was unlikely; see also killer poke and Halt and Catch Fire).
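
For illustration only, the following minimal C sketch mimics the kind of terse, uninformative rejection of unparseable input that was typical of early interactive systems; the accepted commands and the exact wording of the message are hypothetical and not drawn from any particular historical system.

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  /* A toy command loop that, like many early interactive systems,
   * rejects anything it cannot parse with a terse, unhelpful message.
   * The commands ("ADD", "QUIT") and the error text are hypothetical. */
  int main(void) {
      char line[128];

      while (fgets(line, sizeof line, stdin) != NULL) {
          int a, b;

          if (sscanf(line, "ADD %d %d", &a, &b) == 2) {
              printf("%d\n", a + b);        /* recognized input: compute the sum */
          } else if (strncmp(line, "QUIT", 4) == 0) {
              return EXIT_SUCCESS;          /* recognized input: exit */
          } else {
              puts("DOES NOT COMPUTE");     /* anything else: no hint of what went wrong */
          }
      }
      return EXIT_SUCCESS;
  }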

Although it does not use the phrase "does not compute", the short story "Liar!" (1941) by Isaac Asimov is a striking early example of cognitive dissonance leading to a robot's self-destruction: the robot realizes that whether it lies, tells the truth or stays silent, it will cause humans harm, leaving it unable to avoid breaking Asimov's First Law of Robotics: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." This is a more sophisticated treatment of cognitive dissonance leading to self-destruction than most examples from later television science fiction; Asimov explored the theme at length in his robot stories.

By the 1990s, with the rise of personal computers and the graphical user interface, the public conception of computers had become friendlier and more sophisticated, and the image of a computer intelligence unable to respond gracefully to unexpected input gradually faded from fiction, though the phrase did appear in Star Wars Episode I: The Phantom Menace in 1999.

The phrase "does not compute" and robots who self-destruct when considering emotions is frequently satirized in popular culture, such as the Futurama episode in which Leela's attempt to thwart Robot Santa with a paradox was stopped by his "paradox-absorbing crumple zones". In addition, the robot character Bender once referred to the phrase as "an old robot saying".

References

  • Does not compute (Jesse Sheidlower, American Dialect Society mailing list, 2001-09-15) — cites The Random House Historical Dictionary of American Slang