
Responding to Searle

One possible response to Searle is to insist that the operator in the Chinese Room does understand Chinese after all, despite the operator's most insistent assurances to the contrary. Suppose we were playing chess with someone who, as it turned out, had no conscious knowledge of the rules of chess, but was operating under post-hypnotic suggestion. Would there not, perhaps, be a case for saying that this person did know how to play chess, even though the knowledge was not possessed in the standard fashion? It has, after all, been known for people under post-hypnotic suggestion to deny most vehemently that they have a certain ability or motivational pattern, and then to go on to exhibit that very ability or motivation. Might our Chinese Room operator not be in a similar position? We could perhaps say that the operator understood Chinese, but that this understanding was embodied in a non-standard set of operational dispositions. The trouble with this reply, however, is that it is self-defeating. The aim of the original Turing Test simulation was to show how human psychological characteristics could be embodied in a digital computer. What we wanted to be able to conclude was that the computer could understand Chinese in the straightforward fashion in which normal Chinese speakers understand the language. But if all we could conclude about the human simulator of the program was that the simulator understood Chinese in this non-standard way, then surely we would have the right to ascribe, at best, only this non-standard form of understanding to the computer -- a hollow victory.

One popular response to Searle's argument -- dubbed by him the `Systems Response' -- has been to concede that the human symbol operator inside the Chinese Room does not understand Chinese, but to claim that the operator is merely part of a larger system, and that it is `the system as a whole' which understands Chinese. One version of this response was made by Dennett, in conjunction with Douglas Hofstadter, in their joint compilation The Mind's I (Hofstadter and Dennett, 1981, pp. 373-382). The nub of their argument is that Searle has failed to take account of the different levels of a computational system. It may be correct to say, of one level of the system, that it is operating merely upon the formal or syntactic properties of the symbols it is manipulating. But it may also be appropriate to apply a more full-bloodedly intentional or semantic description of its operations when considering the system at a higher level. Bearing that in mind, the total computational system, of which the human symbol operator is just a part, may possess higher-level psychological or cognitive properties that are not attributable directly to the symbol operator in the operator's own right. This, at least, is something Searle would have to show independently to be false before his argument could be accepted without question.
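
The levels point is easiest to see in a concrete computational case. The following Python fragment is purely our own illustration -- every name and rule in it is invented here, and nothing in it is drawn from Dennett, Hofstadter, or Searle. Its low-level function does nothing but match input strings against a rule table, a purely syntactic operation; yet the behaviour of the program as a whole invites the higher-level description `it answers simple questions in Chinese'.

    # Low level: a rule book pairing input strings with output strings.
    # The engine consults only the shapes of the symbols, never their
    # meanings -- just as the operator in the Room does.
    RULE_BOOK = {
        "你好吗": "我很好",        # 'How are you?' -> 'I am fine'
        "现在几点": "现在三点",    # 'What time is it?' -> 'It is three o'clock'
    }

    def shuffle_symbols(squiggles):
        # Purely formal step: look the string up and copy out whatever
        # the rule book happens to pair it with.
        return RULE_BOOK.get(squiggles, "对不起")   # fallback: 'sorry'

    # Higher level: the very same behaviour under an intentional,
    # semantic description -- the system 'answers questions in Chinese'.
    def answer_question(question):
        return shuffle_symbols(question)

    print(answer_question("你好吗"))   # prints: 我很好

Nothing in this sketch settles whether the semantic description is literally true of such a system; the point is only that one and the same process can support both descriptions at different levels, which is precisely what Dennett and Hofstadter accuse Searle of overlooking.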

Searle dismisses the `Systems Response' in forthright terms, but it is not clear that he has very much in the way of sound argument, rather than rhetoric, with which to reject it (which is not to say that the Systems Response is correct). One consideration he brings to bear is this: what could the `system' be, other than the person, the various pieces of paper being shuffled about inside the Chinese Room, and the room itself? If the symbol operator does not understand Chinese, how can the symbol operator plus bits of paper plus room understand Chinese? But this is to assume that the only way to characterize a system is in terms of the physical entities that make it up, rather than in terms of the functional relationships between those entities. After all, what is a human being other than a whole load of carbon and oxygen molecules, and a few other odds and ends? Described in such terms, it would seem pretty unconvincing to suppose that a human being could understand things, or indeed have any other mental states! The claim at issue is just that the person in the room, operating the symbols in the appropriate ways, produces a functional system whose properties cannot be described purely in terms of the physical constituents in their own right. Searle may dislike the claim that there is such a functional system, but he cannot simply dismiss the idea without begging the very question at issue.

