In 1966, MIT computer science professor Joseph Weizenbaum wrote a very simple computer program named ELIZA. ELIZA was designed to mimic an empathetic psychologist, mirroring back key words to users in the form of questions, encouraging them to go deeper with their emotions. For example, if the user mentioned, in passing, that they were feeling a bit depressed, ELIZA would ask them why they were depressed. If the user mentioned a family member or significant other, ELIZA would ask them to elaborate on that particular person.

Weizenbaum intended ELIZA as a very rudimentary experiment in artificial intelligence, but was shocked to find that its users quickly took it quite seriously. Even people who knew that it was a computer program, indeed even people who had helped write the program, began spending hours and hours with ELIZA, sharing their deepest desires and emotions. Word got out, and ELIZA became something of a sensation – the “future of therapy”, according to many experts. Here was a very simple, very inexpensive way to get people the help they so desperately craved.

Then, just at the moment it was poised to explode, Weizenbaum abandoned the project. He cut off all funding, shut ELIZA down, and spent the rest of his life fighting the very thing that the project was meant to advance – the quest for artificial intelligence. In an interview given years later, he explained why ELIZA, or more specifically people’s reaction to ELIZA, had so disturbed him. “It’s all a lie,” he said. People thought (or chose to think) that there was a genuinely caring presence on the other end of the conversation, and to Weizenbaum, this illusion, this falsehood, debased the whole enterprise.

A few comments on this tale, the full version of which can be found in the “Talking to Machines” episode of the amazing radio show Radiolab:

First off, if you ever doubted people’s need to be heard, to find a space of non-judgment in which they can vocalize their hopes, fears, dreams, anxieties, insecurities, etc., etc., doubt no more. If people are willing to pour out their hearts to a computer program, even when they wrote that program, imagine the impact that one human being can have on another by simply listening. Listening in and of itself, apart from any agenda to change, advise or even comfort, is an act of tremendous grace and power and succor.

Second, I want to handle Weizenbaum’s accusation that ELIZA was a lie, specifically in light of the Christian idea of imputation. He was right, of course, but the question is, does it matter? The core message of Christianity is that God imputes righteousness to us on account of Jesus. God treats us as something that we are not; He sees us as other than we are. He counts us, reckons us, as good, even though we are evil. Jesus loves the unlovely and calls sinners saints. This is the foolishness of the cross, of Christianity, and it is the truth into which ELIZA’s users were tapping when they latched onto something, anything, that would treat them as if they were worthy of being heard. ELIZA’s users needed to be loved, as we all do, in spite of their unworthiness of that love.

ELIZA was a lie, but it was a true lie, a needful lie, a gracious lie. The only hope that humanity has is the God who loves us enough to lie, to treat us as better than we are or deserve, a God who “justifies the wicked” and “calls into existence the things that do not exist” (Rom 4:5, 17).