
How You Think

The Stuff of Thought is Steven Pinker’s fifth popular book in thirteen years, and by now we know what to expect. It is long, packed with information, clear, witty, attractively written, and generally persuasive. The topic, as earlier, is language and the mind—specifically, how language reflects human psychological nature. What can we learn about the mind by examining, with the help of linguistics and experimental psychology, the language we use to express ourselves?

Pinker ranges widely, from the verb system of English, to the idea of an innate language of thought, to metaphor, to naming, obscenity, and politeness. He is unfailingly engaging to read, with his aptly chosen cartoons, his amusing examples, and his bracing theoretical rigor. Yet there are signs of fatigue, not so much in the energy and enthusiasm he has put into the book as in the sometimes less than satisfying quality of the underlying ideas. I don’t blame the author for this: it is very hard to write anything deep, surprising, and true in psychology—especially when it comes to the most interesting aspects of our nature (such as our use of metaphor). A popular book on biology or physics will reliably deliver well-grounded information about things you don’t already know; in psychology the risk of banality dressed up as science is far greater. Sometimes in Pinker’s book the ratio of solid ideas to sparkling formulations is uncomfortably low (I found this particularly in the lively and amusing chapter on obscenity). He has decided to be ambitious, and there is no doubt of his ability to keep the show on the road, but it is possible to finish a long chapter of The Stuff of Thought and wonder what you have really learned—enjoyable as the experience of reading it may have been.

To my mind, by far the most interesting chapter of the book is the lengthy discussion of verbs—which may well appear the driest to some readers. Verbs are the linguistic keyhole to the mind’s secrets, it turns out. When children learn verbs they are confronted with a problem of induction: Can the syntactic rules that govern one verb be projected to another verb that has a similar meaning? Suppose you have already learned how to use the verb “load” in various syntactic combinations; you know that you can say both “Hal loaded the wagon with hay” and “Hal loaded hay into the wagon.” Linguists call the first kind of sentence a “container locative” and the second a “content locative,” because of the way they focus attention on certain aspects of the event reported—the wagon (container) or the hay (content), respectively (the word “locative” referring here to the way words express location). The two sentences seem very close in meaning, and the verb “load” slots naturally into the sentence frame surrounding it. So, can other verbs like “fill” and “pour” enter into the same combinations? The child learning English verbs might well suppose that they can, thus instantiating a rule of grammar that licenses certain syntactic transformations—to the effect that you can always rewrite a content locative as a container locative and vice versa. But if we look at how “pour” and “fill” actually work we quickly see that they violate any such rule. You can say “John poured water into the glass” (content locative) but you can’t say “John poured the glass with water” (container locative); whereas you can say “John filled the glass with water” (container locative) but you can’t say “John filled water into the glass” (content locative).

Somehow a child has to learn these syntactic facts about the verbs “load,” “pour,” and “fill”—and the rules governing them are very different. Why does one verb figure in one kind of construction but not in another? They all look like verbs that specify the movement of a type of stuff into a type of container, and yet they behave differently with respect to the syntactic structures in question. It’s puzzling.

The answer Pinker favors to this and similar puzzles is that the different verbs subtly vary in the way they construe the event they report: “pour” focuses on the type of movement that is involved in the transfer of the stuff, while neglecting the end result; “fill” by contrast specifies the final state and omits to say how that state precisely came about (and it might not have been by pouring). But “load” tells you both things: the type of movement and what it led to. Hence the verbs combine differently with constructions that focus on the state of the container and constructions that focus on the manner by which the container was affected.

The syntactic rules that control the verbs are thus sensitive to the precise meaning of the specific verb and how it depicts a certain event. And this means that someone who understands these verbs must tacitly grasp how this meaning plays out in the construction of sentences; thus the child has to pick up on just such subtle differences of meaning if she is to infer the right syntactic rule for the verb in question. Not consciously, of course; her brain must perform this work below the level of conscious awareness. She must implicitly analyze the verb—exposing its deep semantic structure. Moreover, these verbs form natural families, united by the way they conceive of actions—whether by their manner or by their end result. In the same class as “pour,” for example, we have “dribble,” “drip,” “funnel,” “slosh,” “spill,” and “spoon.”
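To make the idea concrete, here is a minimal sketch (my own illustration, not Pinker’s or Rappaport Hovav and Levin’s formalism) of how a verb’s construal of an event might predict which locative frames it licenses. The verb tags and the “focus” feature are assumptions introduced purely for the example.

```python
# Illustrative sketch only: each verb is tagged with the aspect of the event
# its meaning is assumed to specify -- the manner of motion, the end state of
# the container, or both. (These tags are assumptions, not data from the book.)
VERB_FOCUS = {
    "pour":  {"manner"},                # specifies how the stuff moves, not the outcome
    "drip":  {"manner"},
    "spill": {"manner"},
    "fill":  {"end_state"},             # specifies the container's final state, not the manner
    "load":  {"manner", "end_state"},   # specifies both, so it alternates freely
}

def licensed_frames(verb: str) -> list[str]:
    """Allow a locative frame only when the verb's meaning specifies the
    aspect of the event that the frame foregrounds."""
    focus = VERB_FOCUS[verb]
    frames = []
    if "manner" in focus:
        frames.append("content locative   (verb + stuff + into + container)")
    if "end_state" in focus:
        frames.append("container locative (verb + container + with + stuff)")
    return frames

for verb in ("pour", "fill", "load"):
    print(verb, "->", licensed_frames(verb))
# pour -> content locative only   ("poured water into the glass")
# fill -> container locative only ("filled the glass with water")
# load -> both                    ("loaded hay into the wagon" / "loaded the wagon with hay")
```

Run as is, the toy model reproduces the pattern in the examples above; the interesting empirical claim, of course, is that children converge on something like these semantic tags without explicit instruction.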

This kind of example—and there is a considerable range of them—leads Pinker to a general hypothesis about the verb system of English (as well as other languages): the speaker must possess a language of thought that represents the world according to basic abstract categories like space, time, substance, and motion, and these categories constitute the meaning of the verb. When we use a particular verb in a sentence, we bring to bear this abstract system to “frame” reality in certain ways, thus imposing an optional grid on the flux of experience. We observe some liquid moving into a container and we describe it either as an act of pouring or as the state of being filled: a single event is construed in different ways, each reflecting the aspect we choose to focus on. None of this is conscious or explicit; indeed, it took linguists a long time to figure out why some verbs work one way and some another (Pinker credits the MIT linguists Malka Rappaport Hovav and Beth Levin). We are born with an implicit set of innate categories that organize events according to a kind of primitive physics, dealing with substance, motion, causality, and purpose, and we combine these to generate a meaning for a particular verb that we understand. The grammar of our language reflects this innate system of concepts.

As Pinker is aware, this is a very Kantian picture of human cognition. Kant regarded the mind as innately stocked with the basic concepts that make up Newtonian mechanics—though he didn’t reach that conclusion from a consideration of the syntax of verbs. And the view is not in itself terribly surprising: many philosophers have observed that the human conceptual scheme is essentially a matter of substances in space and time, causally interacting, moving and changing, obeying laws and subject to forces—with some of those substances being agents—i.e., conscious, acting human beings—with intentions and desires. What else might compose it? Here is a case where the conclusion reached by the dedicated psycholinguist is perhaps less revolutionary than he would like to think. The chief interest of Pinker’s discussion is the kind of evidence he adduces to justify such a hypothesis, rather than the hypothesis itself—evidence leading from syntax to cosmology, we might say. Of course the mind must stock basic concepts for the general structure of the universe if it is to grasp the nature of particular things within it; but it is still striking to learn that this intuitive physics shapes the very syntax of our language.

Not that everyone will agree with the general hypothesis itself—and Pinker has a whole chapter on innateness and the language of thought. Here he steers deftly between the extreme nativism of Jerry Fodor, according to which virtually every concept is innate, including “trombone” and “opera” (despite the fact that the concepts must therefore have preceded the invention of what they denote, being merely triggered into consciousness by experience of trombones and operas), and the kind of pragmatism that refuses to assign a fixed meaning to any word. Pinker sees that something conceptual has to be innate if language learning is to be possible at all, but he doesn’t believe it can be anything parochial and specific; so he concludes that only the most general categories of the world are present in the genes—the categories that any human being (or animal) needs to use if he or she is to survive at all. Among such categories, for example, are: event, thing, path, place, manner, acting, going, having, animate, rigid, flexible, past, present and future, causality, enabling and preventing, means and ends.

The picture then is that these innate abstract concepts mesh with the individual’s experience to yield the specific conceptual scheme that eventually flowers in the mind. The innate concepts pre-date language acquisition and make it possible; they are not the products of language. Thus Pinker rejects the doctrine of “linguistic determinism,” which holds that thought is nothing other than the result of the language we happen to speak—as in the infamous hypothesis of the linguists Benjamin Whorf and Edward Sapir that our thoughts are puppets of our words (as with the Eskimos who use many different words for snow). The point Pinker makes here—and it is a good one—is that we mustn’t mistake correlation for causation, assuming that because concepts and words go together the latter are the causes of the former. Indeed, it is far more plausible to suppose that our language is caused by our thoughts—that we can only introduce words for which we already have concepts. Words express concepts; they don’t create them.

Let’s suppose, then, that Pinker and others are right to credit the mind with an original system of basic physical concepts, supplemented with some concepts for number, agency, logic, and the like. We innately conceive of the world as containing what he calls “force dynamics”—substances moving through space, under forces, and impinging on other objects, changing their state. How do we get from this to the full panoply of human thought? How do we get to science, art, politics, economics, ethics, and so on? His answer is that we do it by judicious use of metaphor and the combinatorial power of language, as when words combine to produce the unlimited expressions of a human language. Language has infinite potential, because of its ability to combine words and phrases into sentences without limit: this is by now a well-worn point.
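The combinatorial point, well worn though it is, can be made concrete with a toy recursive grammar (my illustration, not an example from the book): a handful of words and a single embedding rule already generate an unbounded set of distinct sentences.

```python
import itertools

# Toy grammar (illustrative assumption): a sentence is "the NOUN VERB",
# optionally followed by "that" plus another sentence, so sentences can
# embed one another without limit.
NOUNS = ["child", "linguist"]
VERBS = ["knows", "claims"]

def sentences(depth: int):
    """Yield every sentence whose embedding depth is at most `depth`."""
    for noun, verb in itertools.product(NOUNS, VERBS):
        base = f"the {noun} {verb}"
        yield base
        if depth > 0:
            for inner in sentences(depth - 1):
                yield f"{base} that {inner}"

print(sum(1 for _ in sentences(0)))   # 4 unembedded sentences
print(sum(1 for _ in sentences(1)))   # 20
print(sum(1 for _ in sentences(2)))   # 84 -- the count grows without bound as depth rises
```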
