
Freedom Through Cooking

Marc Riboud/Magnum Photos
Julia Child, Provence, 1969

The human body is adapted to eating cooked food: small mouths, weak jaws, small teeth, small stomachs, small colons—less than 60 percent of the mass expected for a primate of our size. Small guts overall. The same certainly goes for the chewing equipment of the habilines and ergaster, while the shape of the ergaster pelvis and rib cage indicates that its guts were also small, quite different from those of apes that are reliant on raw food. Wrangham stresses that these many physiological adaptations are not merely a consequence of eating a higher proportion of raw meat in the diet. Raw meat is just too tough, requiring too much chewing for small mouths and teeth to cope with. Moreover, even if raw meat provided half of the Stone Age diet, the quantity of plant foods additionally required could not have been digested by the small stomachs and colons of our two-million-year-old ancestors: they just had to have been cooking. Wrangham argues that this not only had an impact on their physiology, it also led to—perhaps required—a fundamental change in the nature of social relationships.

Wrangham imagines the habiline and ergaster past as one in which males engaged in hunting and females in plant gathering and cooking. This is not unreasonable, partly because it is a universal pattern among all known hunter-gatherers and partly because it makes evolutionary ecological sense. Hunting and scavenging on African savannahs demanded energy, and required walking and running over long distances, often with low rates of success. We know from the archaeological sites of butchered animal bones and stone tools that such hunting and/or scavenging did indeed take place. Without cooked food, whether meat or vegetable, the energy and time required for such activity would simply not have been available. Once cooked food was on the menu, those individuals born by chance with smaller guts and bigger brains could thrive, using their greater intelligence to outthink others in their socially competitive world, and passing on their genes to the next generation.

What was in it for the women? Were they not only gathering plants all day, along with grubs, lizards, and the like, but also tending the fire and cooking for when the hunters returned? Wrangham’s answer is intriguing, but not wholly convincing to me: they received protection. Once food is cooked its value is so enhanced that it becomes liable to theft. The waft of sizzling antelope steak across the savannah would have been seductive to stray Homo ergaster males. So in exchange for the provision of cooked food, the males agreed to provide protection for the females, or rather the particular female providing the cooked dinner. In this regard, the basis of the pair-bonding relationship is not about the exchange of meat for sex, as traditionally assumed, but of security for cooked food. As Wrangham describes early human life, sex was far easier to find than a good woman prepared to cook your dinner.

Possibly, but there is one ingredient that seems to be too sparse in Wrangham’s recipe for the origin of human society: children. Although Wrangham acknowledges how providing soft, cooked foods would have enabled early weaning, recovery of the mothers, and return to childbirth sooner than otherwise, he sprinkles the infant factor through his book as a sort of garnish rather than part of the staple ingredients. If cooked food has all of the nutritional benefits that Wrangham describes, then surely a motivation—perhaps the prime one—for cooking by women would have been to feed the kids. Getting them strong and healthy as quickly as possible after weaning would have been reward enough for cooking food; feeding the man in your life would have been an afterthought. But maybe I am too influenced here by my own experience of married life.

Dean Falk places the mother–infant relationship at the center of her argument about the evolution of language. Hers is also a simple and largely convincing narrative, written with a similar mix of academic authority, anecdotes, and flair: walking on two legs—bipedalism—required humans to have a narrow pelvis. As a consequence, childbirth and parenting became more challenging because babies had to be born relatively immature to fit through the narrow birth canal.

To compound matters, the evolution of bipedalism (and/or use of fire) coincided with the loss of body hair, depriving those immature babies of anything to hold onto when clambering over their mothers’ bodies. Instead, mothers had to carry their babies around. As the newborns grew into infants, remaining effectively helpless, they became heavier, and so it was no longer feasible for mothers to carry them around all day—especially before the invention of baby slings. Habiline and ergaster mothers had no choice but to put babies down on the ground while gathering their plants, making their tools, and—according to Wrangham—cooking the dinner. Babies then, as now, cried. Mothers coo-cooed at them, a verbal surrogate for the physical contact the babies desired.

So emerged “motherese,” the singsong-like baby-talk that is universally present today, which should be referred to as “infant-directed speech” so as not to upset all of those good fathers who also coo-coo to their babies. Motherese, Falk argues, lies at the root of spoken language—language that has a lexicon of words and complex grammatical rules. So just as Wrangham argues that natural selection favored those adults with relatively small guts and large brains, so Falk argues that it favored the slow developers, those babies whose heads remained relatively small until after they had been born. It wasn’t only the large-headed babies themselves that died in childbirth: natural selection, in Falk’s view, made sure their genes would not survive into the next generation by taking their would-be mothers with them.

Fire is at the center of both of these evolutionary scenarios. Both Wrangham and Falk recognize that fire was needed for protection from predators and for warmth when sleeping on the ground during the night, as habilines are likely to have done after having lost their anatomical adaptations for efficient tree climbing. But if fire was so routine and important for our ancestors, why should traces of it be so rare in the archaeological record? It is only with the late Neanderthals in Europe and the Near East (after 60,000 years ago) and the first modern humans in Africa that clearly defined hearths are found.

Admittedly fires can often leave no more than a pile of ash that could blow away in an instant. But when we have near perfectly preserved surfaces at locations that were evidently used on multiple occasions for butchering animals and making stone tools, such as at 500,000-year-old Boxgrove in Sussex, UK, why is there no trace of a built hearth, or of burnt sand or soil? In light of all of the advantages of fire that Wrangham describes—keeping warm, deterring predators, cooking food—it seems inconceivable that fire would not have been used at such locations. But there isn’t a trace of it.

Fireplaces built of stone and cooking pits are pervasive in the archaeological record of modern human hunter-gatherers; when they are not found, their nearby presence (beyond the edge of the excavation) or previous existence is quite evident from burnt artifacts, burnt bones, and fragments of charcoal. Why is such evidence so extraordinarily rare prior to 100,000 years ago? If pre-modern humans as long ago as two million years were using fire and cooking as Wrangham suggests, they may have been doing so in a quite different manner from modern humans, which would then challenge his use of ethnographic accounts of living hunter-gatherers as models for the distant past. It all depends on how much attention one pays to the archaeological evidence or lack of it. I worry that Wrangham gives it short shrift.

Falk does much worse by casually citing Robert Bednarik’s claims that a few scratches made on bones some 300,000 years ago constitute symbolic art. The first half of her book, and all of her previous publications, show that Falk is an astute academic, one able to make careful evaluations of conflicting claims about the fossil record. And yet she seems to lose all of her critical faculties when faced with the extravagant claims from Bednarik about a few scratches on pieces of bone, his work being distinguished by a lack of academic rigor. There is, in fact, no evidence to support her claim that visual art evolved hand in hand with language, if indeed language did show a gradual evolutionary history from its two-million-year-old beginnings as motherese to having words and grammar by the time of the modern human diaspora after 70,000 years ago.

My own view is that the archaeological evidence indicates otherwise: we don’t have any evidence for activities mediated by language—i.e., with words and grammar—until after the emergence of modern humans 200,000 years ago. This isn’t to say that ergaster, the Neanderthals, floresiensis, and other pre-modern humans lacked complex sociality and communication but just that their vocal utterances and gestures were of a quite different type from the language we are familiar with.

Ultimately it is the overall significance of motherese in language evolution that must be questioned. Falk treats motherese as a “magic bullet”: once such emotionally charged communication was present, not only did the rest of language inevitably evolve but so did all other higher cognitive functions. In making this argument she draws heavily on language development in the modern-day child, promoting what she describes as a “Haeckel Lite” account of recapitulation—a modified version of the idea originally proposed by the nineteenth-century German zoologist Ernst Haeckel, that the study of developmental change in individuals can shed light on evolutionary change. While Falk acknowledges that some developmental linguists dispute the idea that motherese is wholly, or even partially, about language acquisition, she does not tackle the issue that, unlike modern babies, those of habilines and ergaster were not developing within a community that was already using language.

Cooking risks becoming a magic bullet for Wrangham. Having used it to explain the anatomical and behavioral evolution of the habilines, he sticks with it for our more recent ancestors and relatives, suggesting that advances in food preparation may have contributed to the continuing rise in brain size through two million years of human evolution. Yes, it must have contributed, but my gut feeling—and in light of Wrangham’s explanation that our brains depend on our guts, that term takes on a new significance—is that after cooking made its contribution to the evolution of ergaster, we need to look to other factors to understand the latter stages of human evolution.

Both of these books argue that we can only understand ourselves today in the context of our Stone Age past. Wrangham is explicit, arguing that the provision of meat and, more significantly, protection in return for cooked food created a male–female bond that was “so critical for the successful feeding of both sexes that they generated a particular kind of evolutionary psychology in our ancestors that shaped female-male relationships and continues to affect us today.” Elsewhere he is more frank:

Cooking freed women’s time and fed their children, but it also trapped women into a newly subservient role enforced by male-dominated culture. Cooking created and perpetuated a novel system of male cultural superiority. It is not a pretty picture.

Well, maybe. But I would really have liked to read an account of the impact of settled farming lifestyles, with diets dominated by cereals and legumes, on our evolved, and presumably still-evolving, psychology and social relationships. Perhaps male–female relationships just got a lot uglier. Falk’s message is that studies of primates and inferences from the fossil record demonstrate an overarching need of infants for physical contact: “human parents everywhere would do well to pay more attention to this need.” And by implication, if you can’t provide the cuddles because you are attending to the cooking or to some other task, then provide the best alternative by talking, and even better by singing, to your baby.

The failure to provide such verbal, gestural, or physical contact results in unhappy and potentially socially maladjusted children, inhibited in their language capabilities. The failure to take account of what we eat results in obesity, heart disease, and other ailments that are so prominent today in affluent societies. But as Wrangham explains in his final chapter, the blame lies only partly with the consumer. Nutritional science simply has not fully taken into account the fact that cooking and other forms of processing make food easier to digest. We can extract more calories from the highly processed food that is readily available today. But the current system of food labeling misleads consumers into “thinking that they will get the same number of calories from a given weight of macronutrients regardless of how it has been prepared.”

Both books provide delicious food for reflection. They show how understanding ourselves today—our bodies, our minds, our behavior—can be achieved only by understanding our evolutionary past. I am fully persuaded by Wrangham and Falk that cooking and childcare did indeed play key parts in making us what we are today. But these don’t provide the whole story. According to their arguments, all those other bipedal, large-bodied, and large-brained ancestors and relatives—Homo heidelbergensis and Homo neanderthalensis among them—would have also been cooking their food and coo-cooing to their infants. What made Homo sapiens different? Why are we the only human species alive on the planet today? As with any really good meal, one is soon hungry for more.
