Searle

For his original paper, which presents the Chinese Room argument, see Minds, Brains and Programs, in Behavioral and Brain Sciences, volume 3, 1980. Reprinted in The Mind's I edited by Hofstadter and Dennett (PSY AA:H 067; Philosophy; Hooke). Also reprinted in Mind Design edited by Haugeland (PSY KH:H029).

The BBS paper is followed by twenty-eight replies from Searle's critics, which are also worth reading. They have been omitted from The Mind's I and Mind Design, but the editors discuss a few in the Reflections section following The Mind's I reprint. Searle himself discusses a few of the possible replies in his original paper; apparently he circulated a first draft among several AI people before publishing the final version. There's a summary of his argument on pages 269-271 of Crevier (PSY KH:C 086).

Main argument: Assume, says Searle, that some A.I. researchers have devised a program that speaks Chinese. You ask it questions (in Chinese), and it replies with appropriate answers (also in Chinese), drawing information where necessary from some memory of imaginary events that its programmers have implanted. So if it had been given memories of dining in a Chinese restaurant, you might ask ``What did you think of last night's meal?'', and it might reply ``The crispy duck was fine, but I'm not very good with chopsticks, so I didn't get as much as I wanted.''

Now, this program is essentially a set of symbol-manipulating rules encoded in Prolog, Pascal, or whatever. Take these rules, and write them down on paper. Put them in a small lockable windowless room, together with a supply of paper, filing cabinets, pencils, and rubbers. To one outside wall of this room, affix a TV camera and scanner. This camera can be shown Chinese characters. It digitises them, and sends the results along a cable to a VDU inside the room. To another outside wall, affix a ticker-tape machine. It can print in Chinese, and is controlled by commands coming along another cable from a keyboard inside the room. Finally, find a person to do the clerical work - a graduate accountancy reject perhaps. This clerk is to be locked in the room; his job is to copy numbers off the internal screen, process them strictly according to the rules you've provided, and type out the results on the keyboard. They'll be sent to the ticker-tape machine, and printed out as Chinese.
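To make the ``formal symbol manipulation'' point concrete, here is a minimal sketch in Python (rather than the Prolog or Pascal mentioned above) of the kind of rule-following the clerk performs. The rule book and the tokens are invented purely for illustration - to the clerk, S1, S2 and so on are meaningless shapes standing in for the digitised Chinese characters.

    # A minimal, purely illustrative sketch of the clerk's job: match the
    # incoming symbols against a rule book and copy out whatever the
    # matching rule dictates. The rules and tokens here are invented.
    RULE_BOOK = {
        ("S1", "S2", "S3"): ("S7", "S8"),        # hypothetical rule
        ("S4", "S5"):       ("S9", "S2", "S6"),  # hypothetical rule
    }

    def clerk(input_symbols):
        """Mechanically look up the incoming symbols in the rule book and
        copy out the matching rule's output. No step in this procedure
        depends on what any symbol means."""
        key = tuple(input_symbols)
        return list(RULE_BOOK.get(key, ("S0",)))   # "S0": default output

    # From outside the room this looks like question answering; inside,
    # it is only comparison and copying.
    print(clerk(["S1", "S2", "S3"]))    # prints ['S7', 'S8']

Of course, the real rule book would have to be vastly larger and driven by something far more sophisticated than exact lookup, but nothing in it would give the clerk any access to what the symbols are about - which is exactly Searle's point.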

You now have a person playing the part of your Chinese-speaking program. To an outsider, the combination of room, ticker-tape printer, and TV camera behaves just like a native Chinese speaker. But a native Chinese speaker feels that he understands the questions he's asked. The clerk feels no such thing. All he's doing is manipulating symbols that to him are meaningless. You may ask a question about woks and bean sprouts, or Beijing and student riots, but he doesn't know it's about these - all he knows is a string of meaningless symbols to be turned into other equally meaningless symbols. He is an example of a system that works by formal symbol manipulation yet doesn't understand the meaning behind the symbols; hence, Searle concludes, no other formal symbol-manipulation system can understand either.

One reply is the systems reply, which is functionalist in nature. What matters is the formal identity between the room and the brain. Understanding resides in the entirety of the system (the room plus the rules plus all the pencils and paper), not in one little part of it, the man. Searle tries to disguise this by making the rest of the system appear much smaller than it is; in reality, the ``room'', with its filing cabinets and rulebooks, would cover the whole of the Earth's surface, and the man would play as insignificant a role as does one neuron in a brain. (See also the Churchlands' reply: Crevier page 270.)

Searle's attitude to functionalism: formal identity of structure is not sufficient. The brain (or other thinking system) must also have the ``right causal powers''. I'm not clear on exactly what he means by this, but it seems to require that the system be made out of the right kind of material. In the same way, a simulation of photosynthesis or of an ice-cream factory may have the right formal structure, but it still won't emit oxygen or ice creams.


Jocelyn Ireson-Paine
Wed Feb 14 23:51:11 GMT 1996