At its founding, in 1807, the Geological Society of London vowed in its charter to eschew the older speculative tradition of grandiose “theories of the earth” and to concentrate instead on the collection of stratigraphic facts in order to build a geological time scale, literally stone by stone. This strategy proved brilliantly successful: by mid-century, a worldwide sequence had been established as an alphabet and foundation for our planet’s history. Nonetheless, this extreme position on the dialectic between fact and theory in science provoked a legitimate reaction from thoughtful scholars who recognized the ultimate sterility of a methodology without context and without aim beyond the noble, but unattainable, goal of piling pristine fact upon fact in the hope that generality would somehow mysteriously emerge at the end.

One of Charles Darwin’s most famous statements, made in 1861 at the height of his productivity, is a comment upon this overly empirical tradition in geology (though this context has rarely been appreciated by citationists). He wrote in a letter to Henry Fawcett:

About thirty years ago there was much talk that geologists ought only to observe and not theorize; and I well remember someone saying that at this rate a man might as well go into a gravel-pit and count the pebbles and describe the colors. How odd it is that anyone should not see that all observation must be for or against some view if it is to be of any service.

Darwin’s view on the need for theory both to suggest and to coordinate observations has been widely acknowledged by scientists as both desirable and inevitable (despite the semiofficial persistence of a public myth about absolutely objective impartiality). This interplay of theory and empirical documentation has both positive and negative implications for the elusive notion of scientific “progress.” Theory can prod, suggest, integrate, and direct in fruitful ways; I doubt that Darwin would ever have been able to formulate the theme of natural selection without the available context of Adam Smith’s nearly identical causal system for economics (Darwin, in any case, surely did not “see” natural selection in the finches and tortoises of the Galapagos). But theory can also stifle, mislead, and restrict; conceptual locks are usually more important than factual lacks as impediments to scientific breakthroughs.

I am an outsider, though a near neighbor, to the subject matter of Donald O. Henry’s From Foraging to Agriculture, a largely archaeological study of late and immediately post-ice-age cultures in the Levant—a particularly important time and place featuring an instance of the key event in human civilization: the origin of agriculture (defined as the cultivation of plants for food). I cannot judge the empirical validity of Henry’s arguments; but working as I do in the collateral field of paleontology and evolutionary theory, a domain that employs a similar conceptual apparatus (with intriguing differences), I became fascinated with the hold of theory and fashion upon the interpretations here offered. The view from the next town may give enough distance for independent evaluation, yet provide sufficient proximity for understanding shared concerns. The double-edged sword of a conceptual scheme as both liberator and incarcerator has never been more clear to me.

The persistence of a gradualist and progressivist iconography, inherent in the pictorial metaphor of the “march from ape to man” (thus folding biases of gender into the broader conceptual lock), makes it difficult for us to appreciate concepts of long stability and structural breakpoints of disruption—key themes of a more adequate context for grasping the crucial role of agriculture in human civilization. The argument for the centrality of agriculture is venerable and probably basically correct. The pre-agricultural systems of foraging, hunting and gathering usually involve impermanent settlement (for resources must be tracked and followed). The size of populations stabilizes at relatively low levels of an environment’s “carrying capacity” (to use the ecological jargon). Resources are not stockpiled; wealth, therefore, does not accumulate; social classes and permanent differences of status, based on the commandeering of such surpluses, do not arise to any marked degree.

Agriculture, or so the venerable argument goes, permits people to accumulate general surpluses for the first time (though we all appreciate that seven lean years often follow seven good seasons); it elicits social stratification, and encourages permanent settlements, including the establishment of the civis (citizen) in the civitas (community), hence the origin of our word civilization.

Agriculture has been appreciated as central in this sense but still, following the old iconography, merely as a stage in the predictably progressive advance both of cultural complexity and of the underlying biological evolution of mental improvement that must serve as a substrate. But consider a different viewpoint, embedded in its own conceptual rigidities no doubt, but still salutary as a prod to a different way of thinking, and as more consonant with some crucial information, recently obtained.


The human lineage, defined by the split of a common ancestor into branches for African great apes and for us, dates back some six to eight million years as estimated from molecular similarities and inferred rates of change. But the origin of our own species, Homo sapiens, by the same criteria may extend back only some 250,000 years or so. We have no direct evidence about the physical appearance of these earliest Homo sapiens, but by the time we meet our first European forebears, in the classic Cro-Magnon faunas some 20,000 to 40,000 years old, they are us. The people who painted the caves of Lascaux and Altamira, who carved the Venus figurines and the reindeer bas reliefs, are physically indistinguishable from modern humans. Agriculture, with its cascade of effects leading to all that we call “civilization,” occurred without any recognizable change in either the brains or the bodies of those undergoing the transition.

Moreover, agriculture is no quirk of one particularly fortunate or enlightened group (though one might say benighted or imperiled, given the consequences); it arose several times, and apparently independently, not only in the Levant, but also in the Americas and in the Far East. What prompted its origin in the absence of any trigger based on evolutionary change? And why are the separate origins so closely spaced, if they were uncoordinated and not provoked by biological changes in our evolution? Does agriculture have some cultural necessity or predictability (unlikely since most cultures never took the step); or does the power of its result simply overwhelm other peoples when a culture, always mentally capable, wanders into this solution from which, structurally, there is no exit except further expansion or destruction?

In view of these, and so many other, vital questions, any empirical honing of the direct evidence for cultural transitions in and around an origin of agriculture must be of inestimable value. Donald Henry presents a coherent and interesting account of the best documented origin of agriculture in the Levant. It may be trapped—but so are all theories—in certain conceptual locks, some transported from my profession, but it presents a coherent mode of analysis for recognition and critique; and science advances as much by exploring a variety of theoretical approaches as by accumulating information.

Henry organizes his material by focusing on a transitional step in the origin of agriculture in the Levant: the peoples and cultures associated with the Natufian archaeological “complex”—or distinctive assemblage of stone artifacts. (The book is largely a technical monograph, filled with pages of description and pictures of artifacts; this is not airplane or bedtime reading.) The archaeological record of the complexes that precede the Natufians shows a life of nomadic hunting and gathering; the post-Natufian sites include evidence of transitions to full agriculture. Radiocarbon dating suggests a range of Natufian localities from nearly 12,500 years BP (before the present) to about 10,000 years BP. (We no longer speak of a lockstep progression, implying full transition of all people from one stage to another. We must recognize a complex mosaic of numerous groups—partly to completely independent of one another—each interacting with a varied and changing climate. Some environments within a region may facilitate a transition to a different stage, while other groups may remain well adapted by maintaining previous ways of living in adjacent environments. Thus, while people of some subregions adopt the practices of the Natufians, other contemporaries retain their hunting and gathering style of life.)

We know nothing of the ethnic identity of these Natufian peoples, though continuity of habitation in the Levant suggests that they may include ancestors of modern Semitic groups. The Natufian people developed a new mode of life that functioned like the fulcrum on a teeter-totter that must either fall back on the ways of hunting and gathering or forward into an agricultural world of no return.

The Natufians were “complex foragers,” in contrast with the “simple foraging” of earlier nomadic hunters and gatherers. That is, they found sufficient richness of exploitable resources—mainly gazelles for hunting, and nuts and wild cereals for plant foods—that they could store food, remain in one place, establish permanent settlements, grow in population, and produce surpluses with all the familiar consequences of accumulated wealth and social rankings.

In short, they could experience, if at somewhat tenuous levels, all the traditional and fateful consequences of agriculture, while not growing plant food by cultivation and still relying on gathering natural products. Such a mode of life seems inherently unstable in the long run. Agriculture is tenuous enough, but it at least works by explicit cultivation. How long can a society last that gathers and exploits, but does not husband and redeploy? All the moas of New Zealand fell victim to early Maori settlers; and the gods of the great stone statues could not save Easter Island once all the timber had been cut.


What then triggered the Levantine transition from simple to complex foraging; and what caused the later propulsion, in part through inherent instability in the long run, of at least some complex foraging communities to agriculture? Older explanations for this and similar transitions favored an internally driven propulsion based on some notion of gradual and progressive improvement—people sufficiently smart to learn enough botany for successful foraging must eventually sense the advantages of a more secure supply through cultivation.

In presenting an alternative view, Henry works with the same dichotomy between internal and external causes that has permeated nearly two hundred years of debate in my own field of evolutionary theory—an interesting example of how interdisciplinary transfer can either illuminate by expansion or restrict by inappropriate copying. Evolutionary theories of change are fundamentally internalist or externalist. The classical internalist theories propose a push from within the organism—a necessary impetus from the chemical nature of genetic change or (in popular pre-Darwinian versions) from an inherent tendency in vital matter to greater complexity of organization. Darwin’s theory of natural selection, by contrast, is basically externalist: environments change, setting new pressures of natural selection, and organisms adapt. Darwinian evolution is adaptation to changing local environments, not inherent progress or inherent anything.

Henry has flowed with the Darwinian spirit and adopted an externalist view of archaeological transition. While not denying that some cultural practices may restrict the extent of future change for structural and internal reasons, Henry views changes in climate as the primary impetus for both key transitions—from simple to complex foraging, and from the instability of complex foraging to agriculture (at least in some settlements). Substantial regional change in climate characterized our planet at this labile period surrounding the termination of the last great episode of continental glaciation. As the glaciers retreated, the climate became warmer and wetter. This first climatic change brought extensive food resources of wild grains and nuts into broad regions of the Levant, and made complex foraging possible by the potential bounty thus provided. According to Henry, a second climatic change then restricted the natural growth of such abundant foodstuffs and either drove populations back to the world of nomadic foraging, or forced them to a more active contest with nature by growing their own food in appropriate pockets of a drier climate.

This scheme raises an obvious conceptual dilemma, faced squarely by Henry. Why bother to postulate a second climatic change for the transition to agriculture when the Natufian system of complex foraging was, for its own internal reasons, inherently unstable, because growing populations in enlarging settlements cannot support themselves forever on nature’s unvarnished bounty alone? Once the crucial straw has been added to its burden, you don’t need to whack the poor camel as well. Henry senses the conceptual tension involved in postulating external pushes for internally collapsing systems:

Quite apart from the impact of the drastic decline in resources associated with the deteriorating climate, the Natufian would have inevitably collapsed as a result of the combined tensions between intensified collection, population growth, and a fixed ceiling on natural resources.

Henry’s conclusion, with irrefutable justice, rests on the empirical record: the causal mechanism needs no climatic push in theory, but such an environmental change happened to occur in the Levant at the crucial time: “A complex foraging strategy was especially vulnerable to the kinds of environmental perturbations that are so common to the region and to which simple foragers were virtually immune.”

This ultimate, and proper, reliance upon unique occurrences of a complex history raises another question about conceptual locks in the overly willing acceptance of procedures from other disciplines (crossing to the wrong side of the fine line between imported insight and inappropriate fashion). Henry describes his scenario for the origin of agriculture in the Levant as a “model”—borrowing from the more prestigious “hard sciences” the terminology of a general theoretical structure that renders a particular outcome as a deducible consequence of natural laws. But agriculture in the Levant is a historical entity, a complex and singular event, not an instantiation of general laws.

You don’t need to produce a model to understand why Lee sent his men across a mile of open territory (the misnamed Pickett’s Charge), leading to the loss of the Battle of Gettysburg and ultimate defeat for the Confederacy. I do not claim either that unique historical events have no rigorous explanations, or that general principles of sociology and military logistics are irrelevant to complex particulars. I only hold that narrative explanations in history work differently from lawlike deductions in celestial mechanics, that the jargon of “modeling” belongs more to the latter, and that agriculture in the Levant is more a thing than an instance.

Henry continually traps himself in borrowed conceptual schemes, escaping by good sense, but only with difficulty because language proves to be so restrictive. For example, the sequence of simple foraging to complex foraging to agriculture becomes a progressive continuum because tradition dictates this form of speech and iconography. Yet all Henry’s descriptions cry out for a completely different concept—a scheme in which complex foraging is a break point, or a crux, not an intermediate point or stage. Complex foraging is inherently unstable, for it implies expanding demands in a system with a firm limit on exploitable resources. It must either break and revert to simple foraging or remove the limit by inventing agriculture. We must reject a picture of stages following one another in an arrow of progress. We need a different iconography—complex foraging as the fulcrum of a seesaw, or the apex of a triangle.

Henry’s descriptions seem to cry out for the noncontinuationist scheme that he never really embraces. He writes, for example, that while simple foraging grades into complex foraging in reasonably smooth continuity across broad regions, complex foraging did not pass to agriculture in the same way, or really at all. Some Natufian groups reverted to simple foraging at the second climatic shift. But most Natufian hamlets were decimated, not enlarged. By five hundred years after the onset of the second climatic shift, eighteen of the twenty-three known Natufian hamlets had been abandoned. Agriculture was the lucky invention or desperation move of a few Natufian settlements, not the next predictable stage in a lockstep of human progress.

As a second example of linguistic trapping, Henry maintains a lavish commitment to the Darwinian icon of “adaptation.” He trots out the word so frequently, and often so vacuously, that he reminds me of my old shipmates who used a certain adjective beginning with “f” before every noun, thus doubling the length of their sentences, and adding no meaning whatever. Everything that people do in Henry’s Levant is part of an adaptive system. Of course, I will not dispute the claim when limited to some trivial, vernacular sense. The systems worked, at least for a time, so they must have been adaptive in this almost tautological meaning. But by calling everything part of an adaptive something-or-other, Henry misses the exquisite tension that seems to pervade his Levantine world.

Adaptation, in the vernacular sense of working or fitting, characterizes entities at various levels of an inclusive hierarchy—and the various levels are often in overt and ultimately tragic conflict. What’s good (“adaptive”) for the organism may destroy the group in the long run. Baseball may die because individual players pursue their own advantages so single-mindedly. What’s good (“adaptive”) for the group may require the suppression of personal benefit. America may die because Japan understands the balance of individual and corporate good far better than we do. Adaptation is not a unitary harmony.

What else is complex foraging but a classic case of immediate adaptation for individuals leading to sure long-term destruction for the group? So long as the nuts, cereals, and gazelles abound, why not take them, store them, and prosper (an “adaptive system,” in Henry’s terminology)? But the Natufians paid the price—and in a direct manner that must have involved much emotional wrenching in people who were, after all, much like us. An analysis of sex ratios in burial sites indicates that late Natufian communities introduced female infanticide on an extensive scale. This is a classic, oft-repeated, method of population control—probably the only effective mode then available. I can spout cultural relativism as well as the next person, but I can’t imagine that such a solution has ever been adopted without great pain, at least for mothers. Presumably, the Natufians practiced infanticide because their populations, burgeoning in the adaptive system of their complex foraging, had outstripped resources. What had been good for individuals became bitter for the group.

As Odin gave an eye for knowledge, I would give one to be a fly on the wall when the Natufian elders recognized what their people had wrought and debated the means of restoration. Who proposed agriculture as the way out? Was he murdered as a madman (at least in eighteen of twenty-three hamlets), or was she acclaimed as a prophetess? Was the foundation of our civilization born in desperation, and adopted as a curse of hard work (much like Adam and Eve after their expulsion from Eden), because our forebears were no better than us in balancing the benefits of groups and individuals?

January 18, 1990