Immodest proclamation justly accompanies great discovery; who would gainsay Archimedes shouting “Eureka” through the streets of Syracuse, or announcing that his lever would move the earth if only he could find a place to stand? More often than not, however, immodest proclamation is a cover-up, conscious or not, for failure. When conscious, the tactic can be stunning in its audacity: let us simply declare victory and get out, as Senator Aiken proposed in the best potential solution I ever heard for the morass of Vietnam. When unconscious, it is hollow.

Both the title and the subtitle of Lumsden and Wilson’s book—and its content—record unconscious failure. They have discovered, they claim, the Promethean fire of our evolution, the key to an understanding of both the origin and the subsequent history of the human mind. This key, they proclaim, is a “largely unknown evolutionary process we have called gene-culture coevolution: it is a complicated, fascinating interaction in which culture is generated and shaped by biological imperatives while biological traits are simultaneously altered by genetic evolution in response to cultural innovation.”

In responding to criticisms that human sociobiology, in its debut as the last chapter of Wilson’s Sociobiology (1975), ignored culture for a crude form of genetic determinism, Lumsden and Wilson have now discovered culture and use it as half of a positive feedback loop to explain, with genetics as the other half, all the essentials of our mental evolution. Lumsden and Wilson summarize their concept of gene-culture coevolution in the following way:

The main postulate is that certain unique and remarkable properties of the human mind result in a tight linkage between genetic evolution and cultural history. The human genes affect the way that the mind is formed—which stimuli are perceived and which missed, how information is processed, the kinds of memories most easily recalled, the emotions they are most likely to evoke, and so forth. The processes that create such effects are called the epigenetic rules. The rules are rooted in the particularities of human biology, and they influence the way culture is formed….

This translation from mind to culture is half of gene-culture co-evolution. The other half is the effect that culture has on the underlying genes. Certain epigenetic rules—that is, certain ways in which the mind develops or is most likely to develop—cause individuals to adopt cultural choices that enable them to survive and reproduce more successfully. Over many generations these rules, and also the genes prescribing them, tend to increase in the population. Hence, culture affects genetic evolution, just as the genes affect cultural evolution.

Promethean Fire is essentially a long argument that this unexceptional, and scarcely new, style of evolution can explain what may be the three most important aspects of our own history and current status (see page 84, for example):

  1. Gene-culture coevolution was the trigger for the historical origin of mind in human evolution. It propelled the evolution of increased brain size at a rate perhaps never exceeded for major events in the history of life.
  2. Many important universal aspects of human behavior have a genetic basis and set the epigenetic rules of mind that constrain culture.
  3. Differences among human cultures, though recent in origin and often deemed superficial, are not free of genetic influence and are usually shaped, or at least strongly influenced, by the efficient process of gene-culture coevolution.

Unfortunately for the exaggerated claims made by Lumsden and Wilson, the first point, while undoubtedly just, is scarcely original with them and has formed the core of speculations about the evolutionary origin of mind ever since Darwin; the second point, also uncontroversial, is trivial, at least for the examples now available; while the third, controversial and even revolutionary if it could be established, is almost surely false as a general, or even as a common phenomenon.

The Evolutionary Origin of Mind

Lumsden and Wilson begin their book by staking a claim for discovering the origin of mind:

What was the origin of mind, the essence of humankind? We will suggest that a very special form of evolution, the melding of genetic change with cultural history, both created the mind and drove the growth of the brain and the human intellect forward at a rate perhaps unprecedented for any organ in the history of life….

For the first time we also link research on gene-culture coevolution to other, primarily anatomical studies of human evolution, and use the combined information to reconstruct the actual steps of mental evolution.

The evolution of the human brain did indeed follow a peculiar pattern strongly implicating some phenomenon like gene-culture coevolution in its increase in size from ape to human level. When we first encounter our ancestors, the australopithecines, in Africa some three million to four million years ago, they had already undergone a major anatomical transformation to upright posture without a concomitant change in their brains, which remained at an ape’s characteristic size. Why did these two essential features of our evolution—upright walk and large brains—evolve in such a detached manner, and in this particular sequence? Why did the brain evolve later, after so much of essential human anatomy was already in place?

We have had empirical knowledge of this pattern since the 1920s, when australopithecines were first discovered in South Africa. But the theme of upright walk first, brains second had been correctly surmised, in a speculative way, by many thinkers about human evolution, in part by Darwin himself, but particularly—and with remarkable perspicacity—by his German champion Ernst Haeckel.

Lumsden and Wilson, in utter disregard of this history, stake their own claim for discovery: our brains enlarged and our minds took off only when we entered the positive feedback loop of their newly discovered process: gene-culture coevolution. The speed of our brain’s increase then records the accelerative power of positive feedback.

I don’t doubt that something like gene-culture coevolution was involved in the evolution of our brain. But then Darwin and Haeckel, and all other major thinkers about human evolution, have made the same argument. In fact, I don’t know that any serious theory other than gene-culture coevolution has ever been proposed to explain the sequence of upright posture first, brains later and quickly. The standard account argues that upright posture freed the hands for development of tools and weapons. This evolving culture of artifacts and their attendant institutions of hunting, food gathering, or whatever, then fed back upon our biological (genetic) evolution by setting selection pressures for an enlarged brain capable of advancing culture still further—in short, gene-culture coevolution.
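The accelerating character of such a loop can be made concrete with a deliberately crude numerical sketch—my illustration, not anything taken from Lumsden and Wilson, with every quantity and parameter invented for the purpose. It simply couples a stand-in for cultural complexity to a stand-in for genetically based brain capacity, so that each feeds the growth of the other:

```python
# Toy sketch of a gene-culture positive feedback loop.
# All variables and parameters are illustrative; none come from Lumsden and Wilson.

def simulate(generations=50, coupling=0.05):
    brain = 1.0     # stand-in for genetically based brain capacity
    culture = 1.0   # stand-in for accumulated cultural complexity
    history = []
    for g in range(generations):
        # Culture grows in proportion to what current brains can support...
        culture += coupling * brain
        # ...and a richer culture raises the selective premium on larger brains.
        brain += coupling * culture
        history.append((g, brain, culture))
    return history

if __name__ == "__main__":
    for g, brain, culture in simulate()[::10]:
        print(f"generation {g:2d}: brain={brain:7.3f}  culture={culture:7.3f}")
```

Run for a few dozen “generations,” both quantities grow faster than either could alone—which is all the appeal to positive feedback amounts to when it is invoked to explain the rapid enlargement of the brain.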

Darwin put it this way in The Descent of Man (1871):

If some one man in a tribe, more sagacious than the others, invented a new snare or weapon, or other means of attack or defence, the plainest self-interest, without the assistance of much reasoning power, would prompt the other members to imitate him; and all would thus profit…. If the new invention were an important one, the tribe would increase in number, spread and supplant other tribes. In a tribe thus rendered more numerous there would always be a rather better chance of the birth of other superior and inventive members. If such men left children to inherit their mental superiority, the chance of the birth of still more ingenious members would be somewhat better, and in a very small tribe, decidedly better.

Ironically, for the man’s work is anathema to Wilson, who senses the evil influence of Marxism behind all radical criticism of his sociobiology, the best nineteenth-century case for gene-culture coevolution was probably made by Friedrich Engels in his remarkable essay of 1876 (posthumously published in the Dialectics of Nature), “The part played by labor in the transition from ape to man.”

Engels, following Haeckel’s outline as his guide, argues that upright posture must precede the brain’s enlargement because major mental improvement requires an impetus provided by evolving culture. Thus, freeing the hands for inventing tools (“labor” in Engels’s committed terminology) came first, then selective pressures for articulate speech, since, with tools, “men in the making arrived at the point where they had something to say to one another,” and finally sufficient impetus for a notable (and genetically based) enlargement of the brain:

First labor, after it, and then with it, articulate speech—these were the two most essential stimuli under the influence of which the brain of the ape gradually changed into that of man.

An enlarging brain (biology, or genes in later parlance) then fed back upon tools and language (culture), improving them in turn and setting the basis for further growth of the brain—the positive feedback loop of gene-culture coevolution:

The reaction on labor and speech of the development of the brain and its attendant senses, of the increasing clarity of consciousness, power of abstraction and of judgment, gave an ever-renewed impulse to the further development of both labor and speech.

Those ignorant of history do, after all, repeat it—especially when there is virtually no other way to go.

Genetic Universals

In Lumsden and Wilson’s version of gene-culture coevolution, genetic predispositions common to all normal humans act in the positive feedback loop by setting epigenetic rules—or biases in learning—that constrain and channel culture. To choose their favorite example, avoidance of incest is a biological imperative of great importance, since the frequency of birth defects rises sharply with the closeness of relationship between marriage partners (and reaches a maximum for unions between siblings). Thus, any mechanism discouraging incest would be strongly favored by natural selection and should increase within populations.
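The population-genetic arithmetic behind this claim is standard textbook material and worth spelling out, though the gloss is mine rather than the authors’. For a rare deleterious recessive allele at frequency q, the chance that an offspring receives two copies rises with the inbreeding coefficient F of the mating:

```latex
% Standard population-genetic result (Wright's inbreeding coefficient),
% offered as my illustration, not as a formulation from Promethean Fire.
\[
  P(\text{affected offspring}) \;=\; q^{2}(1 - F) \;+\; qF ,
\]
% which exceeds the random-mating value q^2 whenever F > 0.
% F = 1/4 for the offspring of full siblings, 1/16 for first cousins.
% Example: with q = 0.01, random mating gives 0.0001,
% while a brother-sister union gives roughly 0.0026 -- about 25 times higher.
```

Summed over the many deleterious recessives that any genome carries, that multiplier is the selective penalty on which the argument for an evolved incest-avoidance rule rests.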

Of course, genes are not conscious agents and cannot “tell” their bearers, “Don’t copulate with close relatives or you’re in for trouble.” But “epigenetic” or learning rules with a genetic base might be selected for such an effect. We might, for example, be predisposed by our biology not to develop sexual feelings toward those individuals reared with us in early childhood—familiarity breeds contempt, and all that. Since, in ancestral societies with limited mobility and tight kinship bonds, close proximity usually meant close relationship, the epigenetic rule produced its desired biological result. Of course, we could “fool” this rule today by separating siblings or, as some societies do (thus providing the shaky basis in evidence for the form of the rule itself), by raising nonrelatives together and gauging their later sexual aversion toward one another.

I am supposed to be a “nurturist” in the great “nature-nurture” debate, but I find nothing upsetting in this notion of biological influence upon human behavior. I suppose I must also emphasize once again, and for the umpteenth time as we all do, that the categories are absurd and that there is no “nature-nurture” debate as such, Margaret Mead and Derek Freeman and the pleasant alliteration of the phrase notwithstanding. Every scientist, indeed every intelligent person, knows that human social behavior is a complex and indivisible mix of biological and social influences. The issue is not whether nature or nurture determines human behavior, for they are truly inextricable, but the degree, intensity, and nature of the constraint exerted by biology upon the possible forms of social organization.

Thus the issue is not whether biological universals exist. Of course they do. We must sleep, eat, and grow older, and we are not about to give up procreation; almost every social institution we possess is influenced by these imperatives. Therefore, the simple listing of imperatives by Lumsden and Wilson, and the specification of the epigenetic rules they establish, is no defense for a “naturist” bias and no vindication of sociobiology. The issue, rather, is how shaping and constraining are the universals that can be specified. The answer, at least from the list they provide, is not very much at all. I therefore find this particular invocation of genetics as a newly discovered determinant of social behavior to be both trivial and uncontroversial.

Consider Lumsden and Wilson’s entire list of seven items: avoidance of brother-sister incest; learning of color vocabularies; preference of infants for objects of particular shapes and arrangements corresponding to the abstract form of a human face versus various scrambled patterns; the universality of certain facial expressions; preference of newborns for sugar over plain water and for sugars in descending order of sucrose, fructose, lactose, and glucose; anxiety of very young children in the presence of strangers; and phobias, particularly those that respond to ancient dangers (like snakes, running water, and thunderstorms) that no longer threaten us in our modern world.

Item two on color provides a particularly good illustration of why I maintain that these genetic universals offer no threat to what is often and mistakenly called the “nurturist” position—that human biology is rarely sufficiently constraining to determine human culture directly and that biology usually permits a wide and flexible range of different cultural possibilities. (The two positions should be called biological determinism and biological potentiality, not naturist and nurturist. We might instead refer to determinists and potentialists.)

A series of fascinating studies have shown that, although light comes to us in a continuously varying spectrum of wavelengths, people in all cultures tend to parse it into four basic colors: blue, yellow, red, and green. “This beautiful illusion,” Lumsden and Wilson write, “is genetically programmed into the visual apparatus and brain.” In part, we already know the physiological basis of this bias in learning. Visually active nerve cells in the lateral geniculate body of the thalamus, an important “relay station” between our eyes and the brain’s visual cortex, are divided into four types and probably code the light we see according to these four major colors. Now why should any potentialist (or even an old-fashioned, caricatured, exaggerated, nonexistent, tabula rasa nurturist) feel threatened by such a discovery, which I find fascinating? Brain and eye are physical objects with complex properties, not neutral filters. Why, to ensure a potentialist position, should they, by some a priori fiat, have to record color exactly as it comes to us from physics without imposing some constraint evolved from their own biological structure?

In a previous work, On Human Nature, Wilson devised an apt metaphor to characterize the issue between determinists and potentialists: the genes hold culture on a leash and the debate centers upon the length and tightness of that leash. Nor is the matter merely quantitative, with both sides sharing the same assumptions and differing only in their guesses about one or ten feet in the continuum of leash lengths; for a culture held on a taut one-foot leash (and therefore determined in its manifest properties directly by biology) is a qualitatively and fundamentally different thing from one suspended by a loose cord that cannot specify a particular institution but only a broad range of possibilities. That we see a spectral continuum as four colors, or that babies prefer sugar to water and human faces to scrambled designs, does not seem to dictate cultural patterns with great specificity. Thus the character of the biological universals that we can identify (and we have no reason to think that further research will alter the form of such examples, though it will obviously augment the list) suggests that the leash is loose and nonconstraining, though well worth our continued examination. The study of these universals, the one aspect of human sociobiology that does have some direct evidence going for it, therefore offers no solace for the determinist bias that is the soul of sociobiology, and the essence of its claim to be a new and revolutionary science.

Differences Among Human Cultures

To establish its importance and to fulfill its essentially reductionist research program (see below), human sociobiology cannot rest upon nonconstraining universals. It must demonstrate that differences among cultures, and historical change within cultures, the sources of our most interesting arguments about human nature (are the Chinese really…?), are also genetically driven. The crux of the determinist-potentialist debate lies squarely here, for genetics (and therefore Darwinian biology strictly conceived) will have little to say about human cultural diversity and change if it only underlies some universal patterns and exerts little constraint upon the incredible richness of detail that so fascinates us and is the subject matter of the social sciences. Lumsden and Wilson know that they must deliver here, or the revolution of human sociobiology dies aborning. They also know that no concrete information supports their hope. Consequently, they devoted their earlier, technical book (Genes, Mind, Culture), of which Promethean Fire is a popularized abstract, to constructing a mathematical model showing that it could happen in principle under a set of dubious assumptions (see below). Their models fared poorly under critical review,1 and Promethean Fire relies wholly on verbal appeals to plausibility in this crucial matter.

The problem that Lumsden and Wilson face is a deep one indeed. Adjacent cultures differ sharply in their basic beliefs and institutions; historical change within cultures may drive them from heights of power to depths of impotence on a time scale of hundreds of years, or even generations (consider the history of Islam from its days of glory late in our first millennium, to its days of despair during the height of Western colonialism, to its current rebirth). How, given this potential (and often realized) speed and depth of change, can we invoke the slow process of Darwinian natural selection as an important contributing influence?

Lumsden and Wilson’s answer, of course, is the accelerative force of positive feedback operating in gene-culture coevolution. With culture as a boost, genetics can potentially play a role in so short a time. But abstract models are one thing, and their realization is another. All manner of implausibilities can be modeled when assumptions are chosen to guarantee the result. It seems unlikely that biological change can play an important role in cultural diversity when we know that the nongenetic forces of climate, conquest, and the invention and spread of new technologies exert so controlling an influence. History rather than genetics must be the ground for our search to understand cultural diversity and change.

Where evidence does exist, Lumsden and Wilson must admit that it does not well suit their hopes. They write, for example:

Even the caste system of India, which is the most rigid and elaborate on Earth and has persisted for two thousand years, is maintained largely by cultural conventions. So far as is known (although the matter has never been thoroughly studied), members of different castes differ from one another only slightly in blood type and other measurable anatomical and physiological traits.

Lumsden and Wilson end their chapter on rules of mental development with these words:

The theory of human nature that prevails in the end will be the one that aligns social behavior and history with all that is known about human biology. It will correctly and uniquely characterize the known operations of the human mind and the patterns of cultural diversity. That is the grail towards which many scholars toil, despite frustration and not infrequent humiliation. At issue are the very limits of the natural sciences. Is the quest a fool’s errand?

I find nothing in this somewhat grandiloquent statement to offend any potentialist, though I have personal doubts about “correct and unique” characterization, given the vagaries of history. The quest is no fool’s errand and we all seek to gain the final understanding of Parsifal. We simply differ in our views about the relative importance of biology in this coming alignment of social behavior, history, and genetics. I suspect that biology will not have an important part in explaining “patterns of cultural diversity”; to that extent, the natural sciences do meet their limit. But why should this be sad or disappointing? It does not mean that knowledge must remain fragmentary; it simply holds that the correct empirical equation will grant a large coefficient to history and a very small one to genetics.

The Assumptions of Sociobiology

In the absence of evidence for their most important claim, and in the face of some information and several strong prima facie arguments against it, we must ask what infuses Lumsden and Wilson with such confidence about the promise of sociobiology as a key to explaining human cultural diversity. Here we encounter the methodological premises of the sociobiological research program in Wilson’s version. Sociobiology has engendered a great deal of debate in several disparate fields since Wilson published his book of the same name in 1975. Political discussion about the uses of biological determinism has often masked the deeper methodological objections that call the whole enterprise into question, whatever its implications. (I have pursued the political debate myself, and certainly do not abjure it, but we must recognize that a more fundamental criticism questions the very style of argument as an appropriate application of evolutionary theory.)

To substitute biology for history in the absence of evidence requires an a priori faith that genetic explanations are, in some ultimate sense, preferable. Such a position emerges from the old-fashioned reductionism espoused with a vengeance by Lumsden and Wilson. On this view, a hierarchy of sciences runs from hard to soft, quantitative to qualitative, firm to squishy, from physics through biology to the loose domain of the social sciences; any time we can jack an explanation up from the realm of a soft science to a harder one, we have done something intrinsically good, and genetics really is better than history as a scientific explanation. At times, Lumsden and Wilson’s reductionism can become downright militant:

The bridge between biology and psychology is still something of an article of faith, in the process of being redeemed by neurobiology and the brain sciences. Connections beyond, to the social sciences, are being resisted as resolutely as ever. The newest villain of the piece, the embattled spearhead of the natural-science advance, is sociobiology.

I have attacked this style of reductionism recently in these pages2 and will not rehearse the arguments here—except to say that if no important genetic differences underlie cultural diversity and change, one’s view of intrinsic preference for a style of argument must bow to the constraints of information.

In a classic error of reductionism, Lumsden and Wilson write:

To many of the wisest of contemporary scholars, the mind and culture still seem so elusive as to defeat evolutionary theory and perhaps even to transcend biology. This pessimism is understandable but, we believe, can no longer be justified. The mind and culture are living phenomena like any other, sprung from genetics, and their phylogeny can be traced.

But historical origin and current function are different things. Of course the mind is sprung from genetics (or at least its origin involved fundamental genetic changes in the evolution of the brain), but such a statement about history does not guarantee a biological basis for current cultural diversity, since the springboard, once installed, may set a common genetic level, while cultural diversity then develops as a historical overlay. I am also amused by the giveaway admission of reductionist bias—the claim that any denial of evolutionary theory as a proper locus for the explanation of culture must involve pessimism. After all, the “wise scholars” of this statement are not intoning “ignorabimus” (we shall never know), but merely claiming that culture will achieve its primary explanation from disciplines other than biology.

Another aspect of reductionism underlies Lumsden and Wilson’s application of sociobiology to culture and helps to explain why so many evolutionary biologists, who have no political feelings about the matter and who (as professionals) share Wilson’s hope for the advancing hegemony of biology, reject the specific mode of argument used in this book. Wilson is our leading student of the behavior of social insects, those marvelous creatures of little brain whose behavior can be atomized into a set of individual traits, each treated independently. Ants behave, in many essential respects, as automata, but human beings do not and the same methods of study will not suffice. We cannot usefully reduce the human behavioral repertoire to a series of unitary traits and hope to build it up again by analyzing the adaptive purpose of each individual item.

First of all, we can’t come close to agreement on a proper atomization, probably because behavioral wholes are not simple aggregates of any set of small parts (is “xenophobia” really a “thing” and is it the proper category for analyzing both a young infant’s aversion to strangers and an adult’s professed hatred of other races?). Second, and perhaps more important, we have no reason to believe that each item maintains its particular form (or set of alternative states) as an adaptation engendered by natural selection. Yet Lumsden and Wilson are committed to a strict version of Darwinism that equates genetic change with adaptation and therefore must analyze all important behavioral differences according to their advantages in the varying environments occupied (or once occupied when the trait arose) by the cultures under analysis. Thus, for example, religion, despite all its complexities, is labeled “a powerful device by which people are absorbed into a tribe and psychically strengthened.”

Lumsden and Wilson have even constructed a terminology for their atomization. They propose to call each unit of human behavior a “culturgen,” derived from the Latin for “creating culture,” but clearly recalling both elements of the positive feedback loop of gene and culture. Their analysis then traces the spread of each culturgen, considered independently, through populations by way of the adaptive force of natural selection. Consider these words on the supposedly self-evident nature of culturgens. (Even these examples are dubious reifications, except perhaps for the third, and they are chosen as best illustrative cases. Imagine what happens when we come to more culture-laden concepts like stereotyping and sexual behavior.)

Many culturgens are naturally distinct and would stand with or without theory. The preference for incest, for example, exists as a clear alternative to the preference for outbreeding. Women tend to carry infants on their left side close to the heart, a practice easily distinguished from other modes of infant transport. To raise the eyebrow in greeting is a gesture distinct from other facial signals.

Lumsden and Wilson then confuse historical development with their preferred methodology by claiming that since genetics began with Mendel’s single traits, the study of human behavior must also start with unitary culturgens and build up the entire repertoire sequentially: “By detecting and analyzing large numbers of such single-gene differences, great and small, a picture of the full genetic blueprint can gradually be assembled. This is the way genetics has proceeded from the garden plots of Mendel to the mighty enterprise it is today.”

The problem of adaptation is probably even greater than this dilemma of atomization because a strong argument can be advanced for asserting that most “things” done today by the brain could not have evolved originally as direct adaptations connected with its evolutionary increase in size.

The debate is an old one and goes back to a bitter disagreement between Darwin and the codiscoverer of natural selection, Alfred Russel Wallace. Wallace, a true pan-selectionist in the Wilsonian manner (as Darwin was not), advanced the curious argument that natural selection could account directly for every trait in the evolution of all organisms except for the human brain, which required a divine assist. He was accused of lacking courage, of failing to extend his system to the final and most important step of all. I cannot analyze the psychology of his reluctance; but, ironically, the logic of his argument springs not from his religion or spiritualism, but from his pan-selectionism itself.

Wallace, in an uncommon attitude for nineteenth-century white scientists, was a nonracist who truly believed in the equal mental capacities of all people. But he was a cultural chauvinist who did not doubt the overwhelming superiority of Western European institutions. Now, if natural selection constructs organs for immediate use and if brains of all people are equal, how could natural selection have built the original “savage’s” brain (his terminology)? After all, savages have capacities equal to ours, but they do not use them fully in devising their cultures. Therefore, natural selection, which constructs only for immediate utility, cannot have fashioned the human brain.

Darwin was flabbergasted. He wrote to Wallace: “I hope you have not murdered too completely your own and my child.” His simple counterargument, born from his pluralistic attitude toward the variable power of natural selection, held (if I may put it in modern terminology): the brain is a very complex computer. I have no doubt that natural selection was the cause of its increase in size and mental power. Selection probably built our large brain for a complex series of reasons, now imperfectly understood. But whatever the immediate reasons, the enlarged brain could perform (as a consequence of its improved structure) all manner of operations bearing no direct relation to the original impetus for its increase in size. I may put a computer in my factory only to issue paychecks and keep accounts, but the device can (as a consequence of its structure) also calculate pi to 10,000 places and perform a factor analysis on the correlation matrix of human culturgens.

Historical origin and current function are different properties of biological traits. This is a general principle in evolutionary theory. Features evolved for one reason can always, by virtue of their structure, perform other functions as well. Often the principle is of minor importance, for the directly selected function may overwhelm any side consequence. But the situation must be reversed for the brain. Here, surely, the side consequences must overwhelm the original reasons—for there are so vastly more consequences (surely by orders of magnitude) than original purposes.

Consider only, for example, our knowledge of personal mortality. Nothing that our large brain has allowed us to learn has proved more frightening and weighty in import. I doubt anyone would argue that our brains increased in order to teach us this unpleasant truth. Yet consider the impact of this knowledge upon a diverse range of human institutions, from religion to kingship and its divine right. The specific forms of religion need not be seen as direct adaptations “for” tribal cohesion; much of culture may arise as responses to the curious and unpredictable side consequences of a large brain. Thus, and particularly for the human brain (that key organ of human sociobiology), the adaptive analysis of culturgens is probably an inappropriate methodology.

The message of this assertion is not, as adaptationists often charge, the pessimistic statement that reasons for concepts and institutions of culture must therefore always remain elusive. We must simply shift our focus from guesses about adaptation to direct study of structure. Several years ago, Francis Crick said to me, after a talk I had given at the Salk Institute, “The trouble with you evolutionary biologists is that you are always asking ‘why’ before you understand ‘how.’” Ten years before, I would have dismissed this comment as the misunderstanding of a reductionist molecular biologist who simply didn’t comprehend the distinctive character of evolutionary reasoning. But now I understand and agree with Crick—for we have to know more about the “hows” of structure before we can even judge properly whether or not there is a direct “why” to seek.

Despite these fatal flaws of atomization and adaptationism, Lumsden and Wilson remain confident of their future as sociobiologists. They write:

Human sociobiology is in approximately the same position as molecular biology in its earliest days. That is, several key mechanisms have been identified, enough to explain elementary phenomena in a new and more precise way. The subject is still rudimentary, but if both biology and culture are to be taken into account, it seems the only way to go.

Of course, the success of one field after a painfully slow beginning does not guarantee that all disciplines in a similar state have a rosy future; as with species and businesses, most never get very far. I think that Lumsden and Wilson have missed the crucial difference between molecular biology and human sociobiology—they regard the two as methodologically similar, and therein lies their dilemma. For molecular biology, the reductionistic research program really did work, triumphantly (though it has now reached its limits in considering certain aspects of the cohesion of entire “genomes,” or complete sets of chromosomes). After all, molecular biology is, to a large extent, chemistry. But the same reductionistic strategy will not work for human culture. When we talk of falling in love as a form of chemistry, we speak only in metaphor. And that is a profound difference, both for poets and scientists.

June 30, 1983