
Greg Wood/AFP/Getty Images

A chimpanzee mother and daughter eating pieces of a hand-carved pumpkin at the Taronga Zoo, Sydney, Australia, October 31, 2005.

Who’s cooking your dinner? Who’s looking after your kids? If you are a man, it is probably the woman—or women—in your life. You know that women mainly do the daily domestic grind. And so it has been, not just throughout history but also throughout the last two million years of human evolution, according to Richard Wrangham, author of Catching Fire, and Dean Falk, author of Finding Our Tongues.

There was once a time—not too long ago—when men could wallow with pride in the Stone Age accomplishments of our sex. It was slaying beasts, making tools, and fighting each other that transformed a Stone Age primate, physically and mentally little different from a chimpanzee, into the big-brained language-using primate that strode out of Africa to dominate the world. How lucky for women that their Stone Age menfolk were so brave and clever.

It is now men who need to nod in acknowledgment at the accomplishments of the Stone Age women who undertook the cooking and childcare. For according to Wrangham and Falk, it is those activities that provided the causes and conditions for the evolution of large brains and language. Women should particularly appreciate these two fascinating books about our evolutionary past.

The evolutionary history of our species is by far the best story ever to be told. It is one that needs continual rewriting and retelling as our knowledge of the fossil and archaeological records improves, as the genomes of humans, apes, and monkeys are revealed and compared, as neuroscience penetrates the working of the brain, and as we appreciate the evolutionary significance of activities that have previously been neglected, cooking and childcare being the two cases in point. While the details remain under debate, astonishing progress has been made in our understanding of human origins ever since Darwin explained how natural selection works and the first human fossils were found 150 years ago.

The common ancestor of humans and chimpanzees lived in Africa between seven and five million years ago, a date established by DNA studies rather than the discovery of fossil remains. We have found fossilized fragments of pre-human bone substantial enough for meaningful interpretation that can be dated only as far back as 4.5 million years ago. Anthropologists have found the remains of a diverse range of hominid species from after that date, all slightly different in shape and size, testifying to multiple adaptations to different niches in the African landscapes. Some became highly specialized, exploiting dry seeds and other plant foods, which resulted in species that had a physiology not unlike that of gorillas today but with chimpanzee-sized brains (circa 450cc).

Natural selection took others in a different direction so that after 2.5 million years ago they were adept at making stone tools, reliant on meat eating, and committed to walking on two legs. These species are referred to as habilines (after one of their members, Homo habilis, with a brain size of up to circa 650cc). From that group one species emerged at around two million years ago that looked and behaved far more human-like than all others, one that we now refer to as Homo ergaster, with a brain size approaching 1,000cc. It was this species (we think) that first left the African continent, dispersing into Southeast Asia and Europe while human evolutionary processes were ongoing both within and outside of the African continent. Further dispersals from Africa followed.

By 500,000 years ago, brain size had reached modern-day capacities (on average between 1,300cc and 1,500cc, with significant variation for body size), although there remains no evidence that these species engaged in those peculiarly human activities of art and religion. Homo neanderthalensis—Neanderthal man—evolved in Europe soon after 350,000 years ago, while in Africa Homo sapiens—our species—appeared around 200,000 years ago, these species probably sharing Homo heidelbergensis, a descendant of Homo ergaster, as a common ancestor. The first traces of unambiguous use of symbols appeared in South Africa around 100,000 years ago, and after 70,000 years ago a great diaspora of Homo sapiens began that took our species to the ends of the earth—Tasmania by 30,000 years ago and Tierra del Fuego by 11,000 years ago.

By 25,000 years ago we had become the lonely species, the only member of our genus to survive on the planet except for the “hobbits,” Homo floresiensis, on Flores Island, who went the way of the Neanderthals once modern humans arrived on their island around 12,000 years ago. Why are we the sole surviving member of our genus? My guess is that it was Homo sapiens alone that evolved fully modern language.

The peak of the last ice age came 21,000 years ago, further inspiring remarkable artistic achievements, such as cave paintings at Lascaux, in several parts of the globe from Europe to Australia. Dramatic climate change 11,550 years ago left modern humans in a world that was warmer, wetter, and climatically more stable than they had ever experienced. The cultivation of plants and domestication of animals arose quite independently in several parts of the world; agricultural lifestyles with towns and trade followed, as did civilizations and empires. History began while human evolution continued, as it does today.


The human story is so far without an end, but is probably heading for inevitable global catastrophe. The key question is how our denouement will come about: Will it be by nuclear devastation, man-made global warming, or biological epidemic? What a truly remarkable species we are to have provided ourselves with such options. According to Wrangham we should blame this predicament on our ancestral diet; according to Falk, we can blame our ancestors for having been too kind-hearted to their babies.

Documenting this evolutionary story is challenging enough. It is a much greater task to try to explain how we began as just one of several run-of-the-savannah primates in Africa yet became the only creature introspective about its own past and future. Anthropologists have looked toward modern-day hunter-gatherers, once thought to be direct relicts of the Stone Age, and sought to find the single evolutionary shift that differentiated our species from our ape and monkey relatives. During the 1950s and 1960s “man the hunter” and “man the tool-maker” became popular ideas—the word “man” being significant beyond the colloquial designation of our species. Hunting and tool-making were espoused as activities that are unique to humans and were responsible for all things good: bipedal walking, big brains, pair-bonding. Then along came Jane Goodall, who took the time to actually find out how chimpanzees lived. She discovered that they also make tools, hunt animals, and eat meat. She found that chimpanzees also brutally kill members of their own species.

During the last two decades evolutionary anthropologists have shifted their attention away from the challenges that faced our ancestors when interacting with the physical world to those of the social world. Rather than worrying about how the habilines of some 2.5 million years ago managed to locate ripe fruit and swollen tubers, ambush antelope, and produce stone flakes to butcher a carcass, anthropologists have become concerned with how they negotiated the complexities of living in relatively large social groups. This increase in group size compared to that of their forest-dwelling ancestors appears to have been a prerequisite for the habilines, who needed to find a means of defense against predators in their increasingly open savannah habitats without the protection that trees provide. But who to mate with and who to trust? Who to back in a fight and to whom should one submit? Who to share food with? Who to avoid and who to cheat?

The complex politics of chimpanzee societies, as revealed by the studies of Frans de Waal and others, has lent substantial credence to the claim that it would have been those individuals who could think rather than fight their way through the social maze that were ultimately rewarded with reproductive success. It was they who set our species on course for a bigger brain. Such thought is likely to have required enhanced abilities of understanding different minds, or at least knowing that the beliefs and desires of another individual are different from one's own. This does indeed appear to be something largely lacking among chimpanzees.

But one can’t just think one’s way to a bigger brain: one has to eat, and big brains are particularly hungry. While complex social life may have provided the selective pressure for the evolution of bigger brains, this could be realized only if sufficient food could be consumed to provide the required energy. How could that have been achieved? By catching fire and cooking food is Richard Wrangham’s answer.

He tells a simple but convincing story. It draws not only on what we know about the fossil record for changing anatomy, observations of modern-day primates, and the behavior of hunter-gatherers but also on the science of cooking, digestion, and nutrition. The essence of his argument is that the evolutionary path from ape to human, to big brains and pair-bonding, could only have begun once our ancestors began to cook their food. The simple reason is that cooked food, whether vegetables or meat, provides significantly more energy than the same foodstuffs in their raw state. Cooked food is easier and quicker to digest; it can be digested with a smaller gut, releasing metabolic energy for a larger brain; it opens up a more diverse natural larder than is available to those reliant on raw foods; the reduced time and strength required for chewing allows the size of the jaw and teeth to be reduced, freeing the oral capacity for a greater range of vocalizations. Cooking literally provided the meat for the male–female pair-bonded sandwich on which human social organization is based.


Wrangham writes with the authority of someone who has personally conducted many of the studies of chimpanzees and hunter-gatherers to which he refers and with the pleasure of having drawn an assortment of scientific studies and anecdotes from academic obscurity to public attention. So he recounts his meetings with raw-foodists, those dedicated to eating their food raw in the mistaken belief that this provides a more natural and healthier diet than cooked food. He describes having dinner with two members of their militant wing, who style themselves as “instinctotherapists.” They mimic chimpanzee feeding habits by taking only one type of vegetable at a time and eating their meat raw, often in the form of marrow taken straight from the bone.

Wrangham finds that raw-foodists, including those who have been forced into this diet by being shipwrecked or lost in the jungle, can be perfectly healthy. But they are always thin and frequently seem tired: eating raw food is an ideal way to lose weight because relatively little energy can be extracted from it during digestion compared to cooked food. A raw food diet is just about feasible for modern-day, urban-living people who can drive to work and stroll to their supermarket to find the very best fruit and vegetables. But such a diet is likely to be impossible for anyone else: there are no accounts of “traditional people” who survived on raw foods alone.


Marc Riboud/Magnum Photos

Julia Child, Provence, 1969

The human body is adapted to eating cooked food: small mouths, weak jaws, small teeth, small stomachs, small colons—less than 60 percent of the mass expected for a primate of our size. Small guts overall. The same certainly goes for the chewing equipment of the habilines and ergaster, while the shape of the ergaster pelvis and rib cage indicates that its guts were also small, quite different from those of apes that are reliant on raw food. Wrangham stresses that these many physiological adaptations are not merely a consequence of eating a higher proportion of raw meat in the diet. It is just too tough, requiring too much chewing for small mouths and teeth to cope with. Moreover, even if raw meat provided half of the Stone Age diet, the quantity of plant foods additionally required could not have been digested by the small stomachs and colons of our two-million-year-old ancestors: they just had to have been cooking. Wrangham argues that this not only had an impact on their physiology, it also led to—perhaps required—a fundamental change in the nature of social relationships.

Wrangham imagines the habiline and ergaster past as one in which males engaged in hunting and females in plant gathering and cooking. This is not unreasonable, partly because it is a universal pattern among all known hunter-gatherers and partly because it makes evolutionary ecological sense. Hunting and scavenging on African savannahs demanded energy, and required walking and running over long distances, often with low rates of success. We know from the archaeological sites of butchered animal bones and stone tools that such hunting and/or scavenging did indeed take place. Without cooked food, whether meat or vegetable, the energy and time required for such activity would simply not have been available. Once cooked food was on the menu, those individuals born by chance with smaller guts and bigger brains could thrive, using their greater intelligence to outthink others in their socially competitive world, and passing on their genes to the next generation.

What was in it for the women? Were they not only gathering plants all day, along with grubs, lizards, and the like, but also tending the fire and cooking for when the hunters returned? Wrangham’s answer is intriguing, but not wholly convincing to me: they received protection. Once food is cooked, its value is so enhanced that it becomes liable to theft. The waft of sizzling antelope steak across the savannah would have been seductive to stray Homo ergaster males. So in exchange for the provision of cooked food, the males agreed to provide protection for the females, or rather the particular female providing the cooked dinner. In this regard, the basis of the pair-bonding relationship is not about the exchange of meat for sex, as traditionally assumed, but of security for cooked food. As Wrangham describes early human life, sex was far easier to find than a good woman prepared to cook your dinner.

Possibly, but there is one ingredient that seems to be too sparse in Wrangham’s recipe for the origin of human society: children. Although Wrangham acknowledges how providing soft, cooked foods would have enabled early weaning, recovery of the mothers, and return to childbirth sooner than otherwise, he sprinkles the infant factor through his book as a sort of garnish rather than part of the staple ingredients. If cooked food has all of the nutritional benefits that Wrangham describes, then surely a motivation—perhaps the prime one—for cooking by women would have been to feed the kids. Getting them strong and healthy as quickly as possible after weaning would have been reward enough for cooking food; feeding the man in your life would have been an afterthought. But maybe I am too influenced here by my own experience of married life.

Dean Falk places the mother–infant relationship at the center of her argument about the evolution of language. Hers is also a simple and largely convincing narrative, written with a similar mix of academic authority, anecdotes, and flair: walking on two legs—bipedalism—required humans to have a narrow pelvis. As a consequence, childbirth and parenting became more challenging because babies had to be born relatively immature to fit through the narrow birth canal.

To compound matters, the evolution of bipedalism (and/or use of fire) coincided with the loss of body hair, depriving those immature babies of anything to hold onto when clambering over their mothers’ bodies. Mothers therefore had to carry their babies. As the newborns grew into infants, remaining effectively helpless, they became heavier and so it was no longer feasible for mothers to carry them around all day—especially before the invention of baby slings. Habiline and ergaster mothers had no choice but to put babies down on the ground while gathering their plants, making their tools, and—according to Wrangham—cooking the dinner. Babies then, as now, cried. Mothers coo-cooed at them, a verbal surrogate for the physical contact the babies desired.

So emerged “motherese,” the singsong-like baby-talk that is universally present today, which should be referred to as “infant-directed speech” so as not to upset all of those good fathers who also coo-coo to their babies. Motherese, Falk argues, lies at the root of spoken language—language that has a lexicon of words and complex grammatical rules. So just as Wrangham argues that natural selection favored those adults with relatively small guts and large brains, so Falk argues that it favored the slow developers, those babies whose heads remained relatively small until after they had been born. It wasn’t only the large-headed babies who died in childbirth: in Falk’s view, natural selection ensured that their genes would not survive into the next generation by taking their would-be mothers with them.

Fire is at the center of both of these evolutionary scenarios. Both Wrangham and Falk recognize that fire was needed for protection from predators and for warmth when sleeping on the ground during the night, as habilines are likely to have done after having lost their anatomical adaptations for efficient tree climbing. But if fire was so routine and important for our ancestors, why should traces of it be so rare in the archaeological record? It is only with the late Neanderthals in Europe and the Near East (after 60,000 years ago) and the first modern humans in Africa that clearly defined hearths are found.

Admittedly fires can often leave no more than a pile of ash that could blow away in an instant. But when we have near perfectly preserved surfaces at locations that were evidently used on multiple occasions for butchering animals and making stone tools, such as at 500,000-year-old Boxgrove in Sussex, UK, why is there no trace of a built hearth, burnt sand, or soil? In light of all of the advantages of fire that Wrangham describes—keeping warm, deterring predators, cooking food—it seems inconceivable that fire would not have been used at such locations. But there isn’t a trace of it.

Fireplaces built of stone and cooking pits are pervasive in the archaeological record of modern human hunter-gatherers; when they are not found, their nearby presence (beyond the edge of the excavation) or previous existence is quite evident from burnt artifacts, burnt bones, and fragments of charcoal. Why is such evidence so extraordinarily rare prior to 100,000 years ago? If pre-modern humans as long ago as two million years were using fire and cooking as Wrangham suggests, they may have been doing so in a quite different manner from modern humans, which would then challenge his use of ethnographic accounts of living hunter-gatherers as models for the distant past. It all depends on how much attention one pays to the archaeological evidence, or the lack of it. I worry that Wrangham gives it short shrift.

Falk does much worse by casually citing Robert Bednarik’s claims that a few scratches made on bones some 300,000 years ago constitute symbolic art. The first half of her book, and all of her previous publications, show that Falk is an astute academic, one able to make careful evaluations of conflicting claims about the fossil record. And yet she seems to lose all of her critical faculties when faced with the extravagant claims from Bednarik about a few scratches on pieces of bone, his work being distinguished by a lack of academic rigor. There is, in fact, no evidence to support her claim that visual art evolved hand in hand with language, if indeed language did show a gradual evolutionary history from its two-million-year-old beginnings as motherese to having words and grammar by the time of the modern human diaspora after 70,000 years ago.

My own view is that the archaeological evidence indicates otherwise: we don’t have any evidence for activities mediated by language—i.e., with words and grammar—until after the emergence of modern humans 200,000 years ago. This isn’t to say that ergaster, the Neanderthals, floresiensis, and other pre-modern humans lacked complex sociality and communication but just that their vocal utterances and gestures were of a quite different type from the language we are familiar with.

Ultimately it is the overall significance of motherese in language evolution that must be questioned. Falk treats motherese as a “magic bullet”: once such emotionally charged communication was present, not only did the rest of language inevitably evolve but so did all other higher cognitive functions. In making this argument she draws heavily on language development in the modern-day child, promoting what she describes as a “Haeckel Lite” account of recapitulation—a modified version of the idea originally proposed by the nineteenth-century German zoologist Ernst Haeckel, that the study of developmental change in individuals can shed light on evolutionary change. While Falk acknowledges that some developmental linguists dispute the idea that motherese is wholly, or even partially, about language acquisition, she does not tackle the issue that unlike modern babies, those of habilines and ergaster were not developing within a community that was already using language.

Cooking risks becoming a magic bullet for Wrangham. Having used it to explain the anatomical and behavioral evolution of the habilines, he sticks with it for our more recent ancestors and relatives, suggesting that advances in food preparation may have contributed to the continuing rise in brain size through two million years of human evolution. Yes, it must have contributed, but my gut feeling—and in light of Wrangham’s explanation that our brains depend on our guts that term takes on a new significance—is that after cooking made its contribution to the evolution of ergaster, we need to look to other factors to understand the latter stages of human evolution.

Both of these books argue that we can only understand ourselves today in the context of our Stone Age past. Wrangham is explicit, arguing that the provision of meat and more significantly protection in return for cooked food created a male–female bond that was “so critical for the successful feeding of both sexes that they generated a particular kind of evolutionary psychology in our ancestors that shaped female-male relationships and continues to affect us today.” Elsewhere he is more frank:

Cooking freed women’s time and fed their children, but it also trapped women into a newly subservient role enforced by male-dominated culture. Cooking created and perpetuated a novel system of male cultural superiority. It is not a pretty picture.

Well, maybe. But I would have liked to read an account of the impact of settled farming lifestyles, with diets dominated by cereals and legumes, on our evolved, and presumably still-evolving, psychology and social relationships. Perhaps the male–female relationships just got a lot uglier. Falk’s message is that studies of primates and inferences from the fossil record demonstrate an overarching need of infants for physical contact: “human parents everywhere would do well to pay more attention to this need.” And by implication, if you can’t provide the cuddles because you are attending to the cooking or to some other task, then provide the best alternative by talking, and even better by singing, to your baby.

The failure to provide such verbal, gestural, or physical contact results in unhappy and potentially socially maladjusted children inhibited in their language capabilities. The failure to take account of what we eat results in obesity, heart disease, and other ailments that are so prominent today in affluent societies. But as Wrangham explains in his final chapter, the blame lies only partly with the consumer. Nutritional science simply has not fully taken into account the fact that cooking and other forms of processing make food easier to digest. We can extract more calories from the highly processed food that is readily available today. But the current system of food labeling misleads consumers into “thinking that they will get the same number of calories from a given weight of macronutrients regardless of how it has been prepared.”

Both books provide delicious food for reflection. They show how understanding ourselves today—our bodies, our minds, our behavior—can be achieved only by understanding our evolutionary past. I am fully persuaded by Wrangham and Falk that cooking and childcare did indeed play key parts in making us what we are today. But these don’t provide the whole story. According to their arguments, all those other bipedal, large-bodied, and large-brained ancestors and relatives—Homo heidelbergensis and Homo neanderthalensis among them—would have also been cooking their food and coo-cooing to their infants. What made Homo sapiens different? Why are we the only human species alive on the planet today? As with any really good meal, one is soon hungry for more.

This Issue

October 22, 2009