The Mind Wins!

The Rediscovery of the Mind

by John Searle
MIT Press, 270 pp., $22.50

According to a widely held view, the brain is a giant computer and the relation of the human mind to the human brain is like that of a computer program to the electronic hardware on which it runs. The philosopher John Searle, a dragon-slayer by temperament, has set out to show that this claim, together with the materialist tradition underlying it, is nonsense, for reasons some of which are obvious and some more subtle. Elaborating arguments that he and others have made over the past twenty years, he attacks most of the cognitive science establishment and then offers a theory of his own about the nature of mind and its relation to the physical world. If this pungent book is right, the computer model of the mind is not just doubtful or imperfect, but totally and glaringly absurd.

His main reasons are two. First, the essence of the mind is consciousness: all mental phenomena are either actually or potentially conscious. And none of the familiar materialist analyses of mind can deal with conscious experience: they leave it out, either by not talking about it or by identifying it with something else that has nothing to do with consciousness. Second, computers, which do not have minds, can be described as running programs, processing information, manipulating symbols, answering questions, and so on only because they are so constructed that people, who do have minds, can interpret their physical operations in those ways. To ascribe a computer program to the brain implies a mind that can interpret what the brain does; so the idea of explaining the mind in terms of such a program is incoherent.

1.

Searle’s book begins with a lucid critical survey of the different views now circulating about the relation of the mind to the body. The mind-body problem was posed in its modern form only in the seventeenth century, with the emergence of the scientific conception of the physical world on which we are now all brought up. According to that conception, the physical world is in itself colorless, odorless, and tasteless, and can be described mathematically by laws governing the behavior of particles and fields of force in space and time. Certain physical phenomena cause us to have perceptual experience—we see color and hear sound—but the qualities we experience do not belong to the light and sound waves described by physics. We get at the physical reality by “peeling off” the subjective effects on our senses and the way things appear from a human point of view, consigning those to the mind, and trying to construct an objective theory of the world outside our minds that will systematically explain the experimental observations and measurements on which all scrupulous observers agree. However radically the content of contemporary physics and its conception of the role of the observer may differ from that of classical physics, it is still in search of a theory of the external world in this sense.

But having produced such a conception by removing the appearances from the physical world and lodging them in the mind, science is faced with the problem of how to complete the picture by finding a place in the world for our minds themselves, with their perceptual experiences, thoughts, desires, scientific theory-construction, and much else that is not described by physics. The reason this is called the mind-body problem is that what goes on in our minds evidently depends on what happens to and in our bodies, especially our brains; yet our bodies are part of the “external” world—i.e., the world external to our minds—which physical science describes. Our bodies are elaborate physical structures built of molecules, and physics and chemistry would presumably give the most accurate description of everything they do or undergo.

Descartes famously thought that if you considered carefully the nature of outer physical reality and the nature of inner mental reality (as exemplified by your own mind), you could not help seeing that these had to be two different kinds of things, however closely they might be bound together: a mind and its thoughts and experiences just couldn’t be constructed out of physical parts like molecules in the way that the heart or the brain evidently can be. Descartes’s conclusion that mental life goes on in a nonphysical entity, the soul, is known as dualism—sometimes “substance” dualism, to distinguish it from “property” dualism, which is the view that though there is no soul distinct from the body, mental phenomena (like tasting salt or feeling thirsty) involve properties of the person or his brain that are not physical.

The power of Descartes’s intuitive argument is considerable, but dualism of either kind is now a rare view among philosophers,1 most of whom accept some kind of materialism. They believe that everything there is and everything that happens in the world must be capable of description by physical science. Moreover they find direct evidence that this can be done even for the mind in the intimate dependence of mental on neurophysiological processes, about which much has been learned since the seventeenth century. And they find indirect evidence, from the remarkable success of the application of physics and chemistry to other aspects of life, from digestion to heredity. Consequently most efforts to complete the scientific world view in a materialist form have proceeded by some sort of reduction of the mental to the physical—where the physical, by definition, is that which can be described in nonmental terms.

A reduction is the analysis of something identified at one level of description in the terms of another, more fundamental level of description—allowing us to say that the first really is nothing but the second: water can be described as consisting of H2O molecules, heat as molecular motion, light as electromagnetic radiation. These are reductions of the macroscopic physical to the microscopic physical, and they have the following noteworthy features: 1) They provide not just external information about the causes or conditions of the reduced phenomenon, but an internal account of what water, heat, and light really are. 2) They work only because we have distinguished the perceptual appearances of the macroscopic phenomena—the way water and heat feel, the way light looks—from the properties that are being reduced. When we say heat consists of molecular motion, we mean that heat as an intrinsic property of hot objects is nothing but the motion of their molecules. Such objects produce the feeling of heat in us when we touch them, but we have expressly not identified that feeling with molecular motion—indeed the reduction depends on our having left it out.

Now how could mental phenomena be reduced to something described entirely in physical, nonmental terms? In this case, obviously, we cannot leave out all effects on the mind, since that is precisely what is to be reduced. What is needed to complete the materialist world picture is some scheme of the form, “Mental phenomena—thoughts, feelings, sensations, desires, perceptions, etc.—are nothing but…,” where the blank is to be filled in by a description that is either explicitly physical or uses only terms that can apply to what is entirely physical.2 The various attempts to carry out this apparently impossible task, and the arguments to show that they have failed, make up the history of the philosophy of mind during the past fifty years.

Searle’s account of that history begins with behaviorism, the view that mental concepts do not refer to anything inside us and that each type of mental state can be identified with a disposition of the organism to behave observably in certain ways under certain physical conditions. When this view began to look too much like a bald denial of the existence of the mind, some philosophers put forward identity theories, according to which mental processes are identical with brain processes in the same way that light is identical with electromagnetic radiation. But identity theories were left with the problem of explaining in nonmental terms what it means to say of a particular brain process that it is a thought or a sensation. After all, this can’t mean only that it is a certain kind of neurophysiological process. And given the aim of these theories, it couldn’t mean that the brain process has some mental effect. The proposed solution was a revival of behaviorism in a new form: Thirst, for example, was identified not with a disposition to drink, but with a brain state; but that particular brain state’s being identical with thirst was now said to consist simply in the fact that it was typically caused by dehydration and that it typically caused a disposition to drink. In this way it was thought that the identification of mental states with brain states could avoid all reference to non-physical features.

These “causal behaviorist” analyses were eventually succeeded by a more technical theory called functionalism, according to which mental concepts cannot be linked to behavior and circumstances individually but only as part of a single interconnected network. The behavior caused by thirst, for example, depends on the rest of a person’s mental condition—his beliefs about where water is to be found and whether it is safe to drink, the strength of his desires to live or die, and so forth. Each mental state is a part of an integrated system which controls the organism’s interaction with its environment; it is only by analyzing the role played by such states as thirst, pain, other kinds of sensation, belief, emotion, and desire, within the total system, that one can accurately describe their connection to behavior and external circumstances. Such a system may still permit mental states to be identified with brain states, provided the latter have causal or functional roles of the kind specified by the theory (still to be constructed) of how the integrated system works. Finally, functionalism led to what Searle calls Strong AI (Strong Artificial Intelligence)—the identification of mental states with computational states of a computer program which controls the organism’s behavior—a program which is physically realized in the hardware (or wetware) of the brain.3

All these theories attempt to reduce the mind to one or another aspect of a world that can be fully described by physics—the world of particles and fields. They have not been worked out in detail; they are just hopeful descriptions of the kind of thing a theory of the mind would have to be, together with some extremely sketchy examples. While each new proposal has been criticized by defenders of alternative reductionist accounts, Searle argues that there is one big thing wrong with all of them: they leave out consciousness.

2.

No theory that leaves out consciousness can claim to be a theory of the mind, and no analysis of consciousness in nonmental terms is possible; therefore no materialistic reduction of the mental can succeed. Searle contends that none of these theories could possibly provide an account of what pain, hunger, belief, vision, etc. really are, because all they talk about is what is externally observable—the organism’s behavior and its causal structure—and a description of something exclusively in those terms fails to guarantee that it has any consciousness at all: each of these behaviorist, functionalist, or computational theories could be satisfied by an unconscious zombie of sufficient physical complexity.

1. But see Geoffrey Madell, Mind and Materialism (Edinburgh University Press, 1988).

2. Another reductionist strategy, which I haven’t the space to discuss here, is to substitute for a theory of what mental states are a theory of the externally observable grounds on which we ascribe mental states to people, and to claim that this system of “assertability conditions” is all the analysis the concepts need. One doesn’t identify mental phenomena with anything physical, because one doesn’t identify them with anything. But the conditions of applicability of mental concepts are, on this view, compatible with the world’s being nothing but a material system. This is essentially Daniel Dennett’s strategy in Consciousness Explained (Little, Brown, 1991).

3. Behaviorism is more or less represented by Gilbert Ryle, identity theory by J.J.C. Smart, and functionalism by Hilary Putnam—but there are many writers and the literature is very large. See one of the excellent recent collections on the philosophy of mind: Ned Block, editor, Readings in Philosophy of Psychology, two volumes (Harvard University Press, 1980); W.G. Lycan, editor, Mind and Cognition (Blackwell, 1990); David Rosenthal, editor, The Nature of Mind (Oxford University Press, 1991). Putnam has now abandoned functionalism; see Representation and Reality (MIT Press, 1988).
