CCM - Reading Assignment
The Ineradicable Eliza Effect and Its Dangers
Page 156
-
Surely the minimal prerequisite for us to feel comfortable in asserting that a computer made an analogy involving, say, water flow, is that the computer must know what water is - that it is a liquid, that it is wet and colorless...
Page 156-157
-
As a consequence of this lack of conceptual background, the computer is not really making an analogy. At best, it is constructing a correspondence between two sparse and meaningless data structures. Calling this "making an analogy between heat and water flow" simply because some of the alphanumeric strings inside those data structures have the same spelling as the English words "heat", "water", and so on is an extremely loose and overly charitable way of characterizing what has happened.
Page 157
-
... computers - at least some of them - understand water and coffee and so on; computers understand the physical world; computers make analogies; computers reason abstractly; computers make scientific discoveries; computers are insightful cohabiters of the world with us. This type of illusion is generally known as the "Eliza Effect",...
Page 157
-
This type of illusion is generally known as the "Eliza Effect", which could be defined as the susceptibility of people to read far more understanding than is warranted into string symbols - especially words - strung together by computers.
Page 157-158
-
That infamous program's purpose was to act like a nondirective Rogerian psychotherapist, responding to the typed lamentations of patients with very bland questions that echoed their own words back to them, most of the time simply urging them to continue typing along the same lines ("Please go on"), and occasionally suggesting a change of topic. The most superficial of syntactic tricks convinced some people who interacted with ELIZA that the program actually understood everything that they were saying, sympathized with them, and even empathized with them.
Page 158
-
Please understand that what I am saying is not meant as a criticism of the developers of SME, or even of Waldrop. It is meant as a critique of the whole mentality swirling around the complex intellectual endeavor called "AI" - a surprisingly unguarded mentality in which anthropomorphic characterizations of what computers do are accepted far too easily, both outside and within the field.
Page 161
-
The hard targets are the cases where even cognitive science professionals seem unable or unwilling to distinguish between what some program has done and what people do, provided there is some minimal degree of surface-level resemblance. Claims that a program has read and understood a newspaper article about various economic experts' prognoses about interest rates...typify what I am talking about.
Page 161-162
-
In one chapter on artistic and literary achievements by computers, for instance, she attributes almost uncanny powers of understanding to what is actually quite a simple program called "ACME", developed by psychologist Keith Holyoak and philosopher Paul Thagard.
Page 165
-
Among the complex real-world analogies they cite it as having made are a political one (involving alleged terrorists in Nicaragua, Hungary, and Israel), numerous scientific analogies (including the water-flow and heat-flow case), a number of "jealous animal stories", and some that link vastly different domains of knowledge. In fact, giving their program the ability to make cross-domain analogies is clearly one of the achievements of which Holyoak and Thagard are most proud.
Page 167-168
-
... our domains are deliberately so stripped-down that the claims made cannot be very grandiose. In fact, that's exactly the problem... Whereas many research groups appear to be tackling domains of such complexity that even human experts are cowed - thermodynamics, international terrorism, atomic physics, computer-system configuration, VLSI chip design, economic forecasting, and on and on - we in our group are dealing with such microscopic domains that our programs' achievements appear to be nearly trivial.