Isaac Asimov, a prolific American science fiction writer who was also a professor of biochemistry, saw technophobia - a fear of technology - as an understandable but regrettable suspicion of whatever was new. He himself harboured an optimistic faith in technological progress, and admitted that he was a technophile. He defined one kind of technophobia as 'The Frankenstein Complex', by which he referred to 'the unreasoning human fear of robots' and - since robots are computer-controlled - of computers (Asimov, Warrick & Greenberg 1985, pp. 1, 6). Asimov did not share this fear and disapproved of those who did.
He argued that the story of Frankenstein - both Mary Shelley's novel (1818) and subsequent film and stage versions - has had an enormous influence on the popular imagination. This was partly because the story articulated a deep-seated human fear of dangerous knowledge. Indeed, Mary Shelley explicitly declared: 'how dangerous is the acquirement of knowledge' (in Kranzberg & Davenport 1972, p. 227).
It is worth noting that in Mary Shelley's novel, Frankenstein's technological creation does not run amok because it is intrinsically evil; it is Dr Frankenstein who eventually sees his act of creation as evil. The popular film versions of the Frankenstein story encourage us to see the creature as intrinsically evil, and many later literary creations are clearly meant to be malevolent. But in the novel Frankenstein's creature only becomes malignant and uncontrollable because it is spurned by an egotistical and irresponsible maker who has given no thought to the consequences of his actions. Langdon Winner offers a basic plot summary of the Hollywood versions of Frankenstein:
Such a plot has little in common with the novel. In the original novel there was no demented assistant, no criminal brain, no random rampage and no final extermination of the nameless creature to bring us reassurance. And we are never presented with that key feature of film versions, Dr Frankenstein in his laboratory throwing a switch.
Mary Shelley's novel is a serious treatment of the themes of creation, responsibility, neglect and its consequences. In the novel, Victor Frankenstein is driven by curiosity about the natural world, turning first to alchemy and the occult and later to modern science, until he falls upon the secret of creating life. One evening he succeeds in bringing an artificial man to life. When it opens its eyes and begins to breathe, Frankenstein is filled with misgivings about what he has done, and rushes away, leaving the creature to cope alone. Even when the creature finds its way to Frankenstein and tries to speak to him, Frankenstein flees again. He does not see the creature for two years. So in the novel it is Frankenstein who flees, not the creature. It is a flight from responsibility. The creature is left to make its own way in the world: travelling alone, it learns to speak and write, and eventually confronts Victor Frankenstein.
In the novel the creature says:
Having experienced rejection by both Frankenstein and those it has met, the creature explains that it wants to be made part of the human community. Frankenstein initially tries to send the creature away again, but it reasons with him, and in this interchange it is the creature who seems most human. An agreement is reached that it is probably too late for the creature to join human society, but as a compromise Victor promises to create a female partner for the creature, so that the two might dwell together in some place remote from humankind. Thus Victor decides that the problems caused by technology require a 'technical fix'.
After much procrastination Frankenstein sets to work on the new creature. However, he suddenly fears that she might not be so compliant, and that the two creatures might give birth to others. Declaring a sense of responsibility to humankind, Victor violently destroys his unfinished creation before the eyes of the first creature. That creature, acting out of despair and revenge, then destroys Frankenstein's best friend and Frankenstein's new bride. Frankenstein pursues the creature, but dies before reaching it. Finally, the creature seeks only suicide. All this - written by a nineteen-year-old - is a far cry from Boris Karloff's rather more familiar film portrayal of the creature as a threatening mechanical automaton. And its most distinctive theme would seem to be the need for responsibility: for thinking about the likely consequences of the artifacts we create, and for doing something about them afterwards. It takes a disaster before Victor accepts the responsibility as his own, by which time it is too late (Winner 1977, p. 343).
Another moral of the tale is that there are some things that we are not meant to know, some things that should be left alone. Mary Shelley herself wrote that 'supremely frightful would be the effect of any human endeavour to mock the stupendous mechanism of the Creator of the world' (in Warrick 1980, p. 36). Only God could create a living soul. Soulless intelligence is evil, and those who seek to rival God as creator deserve to die. Frankenstein was another Faust, and the tale was similarly one of Mephistophelean nemesis. It was to foreshadow countless popular stories in which humans create artificial beings that then destroy their creators.
But Frankenstein's creation was the first artificial being to be an appliance of science. Dr Frankenstein was a scientist (although the word didn't exist in 1816), and the story as developed by Hollywood featured the theme of the mad scientist, which was later to be linked with a lust for power. We fear those who seem to play with dangerous knowledge.
We fear knowing too much, and may perhaps even fear intelligence. Knowledge carries risks. We need only recall the Christian story of the Original Sin of eating of the tree of knowledge. The message to man in Genesis is that 'Of the tree of knowledge of good and evil, thou shalt not eat of it: for in the day that thou eatest thereof thou shalt surely die.' There are many legends of the destruction of those who saw what they should not have seen. And as the popular saying has it, 'If God had meant us to fly he would have given us wings'.
Zeus had withheld fire from man, but Prometheus, benefactor of mankind, defied Zeus by bringing fire to man as a gift. Gaston Bachelard interpreted fire as symbolizing the imagination. Through the fire of the imagination man attempts to ascend to the world from which he fell. Bachelard argued that man is driven by a need to understand, which he called 'the Prometheus complex'. He saw this as the Oedipus complex of the intellect, arguing that men want to know as much as or more than their fathers (1938, cited in Warrick 1980, pp. 28-9).
Asimov's reaction to the fear of knowledge was this:
Knowledge has its dangers, yes, but is the response to be a retreat from knowledge? Are we prepared then to return to the ape and forfeit the very essence of humanity? Or is knowledge to be used as itself a barrier against the danger it brings? In other words, Faust must indeed face Mephistopheles, but Faust does not have to be defeated!
Knives are manufactured with hilts so that they may be grasped safely, stairs possess banisters, electric wiring is insulated, pressure cookers have safety valves - in every artifact, thought is put into minimizing danger... Consider a robot, then, as simply another artifact. It is not a sacrilegious invasion of the domain of the Almighty, any more (or any less) than any other artifact is. (Asimov 1968b, pp. 13-14)
Asimov notes that 'Beginning in 1939, I wrote a series of influential robot stories that self-consciously combated the "Frankenstein complex" and made of the robots the servants, friends and allies of humanity' (Asimov 1981, p. 137). His robots were designed with safety features built into their programming. He called these 'The Three Laws of Robotics', and they first appeared explicitly in a story in 1942.
The Laws reflect Asimov's belief that because technology can be used ethically and responsibly, there is no reason to fear it. He believed that handing over to machines labour that was dehumanizing and degrading allowed humans to become more human, and that letting computers do repetitive and dull mental tasks freed humans to undertake creative work (Warrick, in Asimov, Warrick & Greenberg 1985, p. 209, and Warrick 1980, p. 55).