As computers have become more capable, the Eliza effect has only grown stronger. Take the way many people relate to ChatGPT. Inside the chatbot is a “large language model”, a mathematical system trained to predict the next word, or fragment of a word, in a sequence of text. What distinguishes ChatGPT is not only the complexity of the large language model that underlies it, but its eerily conversational voice. As Colin Fraser, a data scientist at Meta, has put it, the application is “designed to trick you, to make you think you’re talking to someone who’s not actually there”.
But the Eliza effect is far from the only reason to return to Weizenbaum. His experience with the software was the beginning of a remarkable journey. As an MIT professor with a prestigious career, he was, in his words, a “high priest, if not a bishop, in the cathedral to modern science”. But by the 1970s, Weizenbaum had become a heretic, publishing articles and books that condemned the worldview of his colleagues and warned of the dangers posed by their work. Artificial intelligence, he came to believe, was an “index of the insanity of our world”.
Today, the view that artificial intelligence poses some kind of threat is no longer a minority position among those working on it. There are different opinions on which risks we should be most worried about, but many prominent researchers, from Timnit Gebru to Geoffrey Hinton – both ex-Google computer scientists – share the basic view that the technology can be toxic. Weizenbaum’s pessimism made him a lonely figure among computer scientists during the last three decades of his life; he would be less lonely in 2023.
There is so much in Weizenbaum’s thinking that is urgently relevant now. Perhaps his most fundamental heresy was the belief that the computer revolution, which Weizenbaum not only lived through but centrally participated in, was actually a counter-revolution. It strengthened repressive power structures instead of upending them. It constricted rather than enlarged our humanity, prompting people to think of themselves as little more than machines. By ceding so many decisions to computers, he thought, we had created a world that was more unequal and less rational, in which the richness of human reason had been flattened into the senseless routines of code.
Weizenbaum liked to say that every person is the product of a particular history. His ideas bear the imprint of his own particular history, which was shaped above all by the atrocities of the 20th century and the demands of his personal demons. Computers came naturally to him. The hard part, he said, was life.