Case File
d-28744 · House Oversight · Other

Historical description of ELIZA chatbot and early AI conversation


Date
November 11, 2025
Source
House Oversight
Reference
House Oversight #018353
Pages
1
Persons
0
Integrity
No Hash Available

Summary

The passage merely recounts the development of the ELIZA program in the 1960s and includes a sample dialogue. It contains no allegations, financial flows, or connections to current powerful actors. Describes Joseph Weizenbaum's ELIZA program and its psychological basis. Provides a sample 1965 conversation illustrating ELIZA's questioning style. Notes the program's spread across academic institutions.

Tags

history, eliza, house-oversight, artificial-intelligence, joseph-weizenbaum, technology



Extracted Text (OCR)

EFTA Disclosure
Text extracted via OCR from the original document. May contain errors from the scanning process.
system that could, in a very limited sense, converse. The computer would take typed input from a user and then, based on what it understood, ask a question back to continue the conversation. The approach was borrowed from the psychological theories of Carl Rogers, an American therapist who believed the best way to talk to patients was with open-ended, careful questions.

Weizenbaum developed two programs. The first was called DOCTOR, a script that determined what the computer might say. The natural language processor responsible for carrying on the human side of the conversation he called ELIZA, after the heroine of Pygmalion, a poor cockney girl who is trained by a phonetics professor to pass as a duchess. Here is a discussion between ELIZA and a user that Weizenbaum recorded in 1965:

Patient: "Men are all alike."
ELIZA: "IN WHAT WAY?"
Patient: "They're always bugging us about something or other."
ELIZA: "CAN YOU THINK OF A SPECIFIC EXAMPLE?"
Patient: "Well, my boyfriend made me come here."
ELIZA: "YOUR BOYFRIEND MADE YOU COME HERE?"
Patient: "He says I am depressed most of the time."
ELIZA: "I AM SORRY TO HEAR YOU ARE DEPRESSED."
Patient: "It's true, I'm depressed most of the time."
ELIZA: "DO YOU THINK COMING HERE WILL HELP YOU TO NOT BE UNHAPPY?"
Patient: "I need some help, that much seems certain."

Weizenbaum's program quickly became famous around MIT. (The joke was that it was a better conversationalist than most of the awkward shoegazers studying at the Cambridge campus.) ELIZA gave users a chance to have a taste of a machine with something like a mind of its own. The program spread to other campuses and to different research institutions, and with every step of its success, Weizenbaum found himself more distressed. He summarized his worries in his 1976 masterpiece Computer Power and Human Reason. What troubled him was not ELIZA but her users, the humans. Even when they were professional psychologists, many of them quickly considered it reasonable to think that one day the work of diagnosis and counseling might be turned over to machines. This felt like a natural next step in the ceaseless progress they were used to in their lives. Better refrigerators, stronger seat belts, faster jet planes, more plastic: why not a computer doing therapy?
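The reflection trick visible in the dialogue above ("my boyfriend made me come here" becomes "YOUR BOYFRIEND MADE YOU COME HERE?") can be illustrated in a few lines of Python. The sketch below is a minimal, hypothetical reconstruction of an ELIZA-style keyword-and-reflection responder; the patterns, pronoun table, and function names are illustrative assumptions, not Weizenbaum's actual DOCTOR script.

# Minimal sketch of an ELIZA-style responder, assuming a keyword/reflection
# scheme like the one the passage describes. Patterns and the pronoun table
# are hypothetical illustrations, not Weizenbaum's actual DOCTOR script.
import re

# Pronoun swaps turn the user's statement back into a question.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "i"}

# (pattern, response template) pairs; the first match wins.
RULES = [
    (re.compile(r"i am (.*)", re.I), "I AM SORRY TO HEAR YOU ARE {0}."),
    (re.compile(r"(.*) made me (.*)", re.I), "{r0} MADE YOU {r1}?"),
    (re.compile(r".*\balike\b.*", re.I), "IN WHAT WAY?"),
    (re.compile(r".*\balways\b.*", re.I), "CAN YOU THINK OF A SPECIFIC EXAMPLE?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the phrase can be echoed back."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(line: str) -> str:
    for pattern, template in RULES:
        m = pattern.match(line.strip().rstrip("."))
        if m:
            groups = m.groups()
            # {0} inserts a captured group verbatim; {r0} inserts it
            # pronoun-reflected, which produces the echo-as-question effect.
            return template.format(*[g.upper() for g in groups],
                                   **{f"r{i}": reflect(g).upper()
                                      for i, g in enumerate(groups)})
    return "PLEASE GO ON."  # default open-ended prompt, Rogerian style

print(respond("Men are all alike"))               # IN WHAT WAY?
print(respond("my boyfriend made me come here"))  # YOUR BOYFRIEND MADE YOU COME HERE?

The design mirrors the Rogerian approach the passage describes: the program asserts nothing of its own, it only turns the user's words back into an open-ended question, which is why such a small mechanism could feel like "a machine with something like a mind of its own."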

Technical Artifacts (1)


Email addresses, URLs, phone numbers, and other technical indicators extracted from this document.

Wire Ref: refrigerators


