Children in Amsterdam during the Dutch Hunger Winter, 1944–1945 (photograph by Cas Oorthuys/Nederlands Fotomuseum)

Around the turn of the nineteenth century, the French naturalist Jean-Baptiste Lamarck noted that life on earth had evolved over long periods of time into a striking variety of organisms. He sought to explain how they had become more and more complex. Living organisms not only evolved, Lamarck argued; they did so very slowly, “little by little and successively.” In Lamarckian theory, animals became more diverse as each creature strove toward its own “perfection,” hence the enormous variety of living things on earth. Man, the most complex and therefore the most perfect of life forms, was in Lamarck’s view still evolving.

In Lamarck’s view, the evolution of life depends on variation and the accumulation of small, gradual changes. These are also at the center of Darwin’s theory of evolution, yet Darwin wrote that Lamarck’s ideas were “veritable rubbish.” Darwinian evolution is driven by genetic variation combined with natural selection—the process whereby some variations give their bearers better reproductive success in a given environment than other organisms have.1 Lamarckian evolution, on the other hand, depends on the inheritance of acquired characteristics. Giraffes, for example, got their long necks by stretching to eat leaves from tall trees, and stretched necks were inherited by their offspring, though Lamarck did not explain how this might be possible.
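The contrast between the two mechanisms can be made concrete with a toy simulation—a minimal sketch, not a biological model, with arbitrary numbers and an invented “neck length” trait. In the Darwinian version, necks lengthen because random inborn variation is filtered by selection; in the Lamarckian version, the stretching acquired during life is itself inherited.

```python
import random

# Toy contrast (illustrative only): how neck length changes under
# Darwinian selection versus Lamarckian inheritance of acquired traits.

random.seed(0)

def darwinian_generation(necks):
    """Random inborn variation, then selection: longer necks reach more leaves."""
    varied = [n + random.gauss(0, 1) for n in necks]              # heritable variation
    survivors = sorted(varied, reverse=True)[:len(varied) // 2]   # selection
    return survivors * 2                                          # survivors reproduce

def lamarckian_generation(necks, stretch=0.5):
    """Each animal strives upward, and the stretched neck itself is inherited."""
    return [n + stretch for n in necks]

darwin = [10.0] * 100
lamarck = [10.0] * 100
for _ in range(20):
    darwin = darwinian_generation(darwin)
    lamarck = lamarckian_generation(lamarck)

print(f"mean neck after 20 generations, Darwinian:  {sum(darwin) / len(darwin):.1f}")
print(f"mean neck after 20 generations, Lamarckian: {sum(lamarck) / len(lamarck):.1f}")
```

Both toy populations end up with longer necks; the difference lies in where the heritable change comes from—chance variation culled by the environment, or the environment written directly into what is inherited.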

When the molecular structure of DNA was discovered in 1953, it became dogma in the teaching of biology that DNA and its coded information could not be altered in any way by the environment or a person’s way of life. The environment, it was known, could stimulate the expression of a gene. Having a light shone in one’s eyes or suffering pain, for instance, stimulates the activity of neurons and in doing so changes the activity of genes those neurons contain, producing instructions for making proteins or other molecules that play a central part in our bodies.

The structure of the DNA neighboring the gene provides a list of instructions—a gene program—that determines under what circumstances the gene is expressed. And it was held that these instructions could not be altered by the environment. Only mutations, which are errors introduced at random, could change the instructions or the information encoded in the gene itself and drive evolution through natural selection. Scientists discredited any Lamarckian claims that the environment can make lasting, perhaps heritable alterations in gene structure or function.
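That old dogma can be summarized in a deliberately crude sketch (every name here is invented for illustration): regulatory instructions beside the gene decide when it is read; the environment can switch expression on or off; but only a random copying error can rewrite the instructions themselves.

```python
import random

# A crude sketch of the mid-twentieth-century dogma (all names invented).

gene_program = {"express_when": "neuron_active"}    # instructions beside the gene

def express(program, environment):
    """The environment can trigger expression of the gene..."""
    return environment == program["express_when"]

def mutate(program):
    """...but only a random copying error alters the gene program itself."""
    new_program = dict(program)
    new_program["express_when"] = random.choice(["neuron_active", "resting", "never"])
    return new_program

print(express(gene_program, "neuron_active"))     # light or pain: True, the gene is read
print(express(gene_program, "resting"))           # no stimulus: False, the gene stays silent
print(mutate(gene_program)["express_when"])       # a random error may change when it is read
```

What epigenetics added, as the rest of this article describes, is a third possibility: the environment leaving the instructions intact but blocking access to them.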

But new ideas closely related to Lamarck’s two-hundred-year-old views have become central to our understanding of genetics. In the past fifteen years these ideas—which belong to a developing field of study called epigenetics—have been discussed in numerous articles and several books, including Nessa Carey’s 2012 study The Epigenetic Revolution2 and The Deepest Well, a recent work on childhood trauma by the physician Nadine Burke Harris.3

The growing literature on epigenetics has forced biologists to consider the possibility that gene expression can be influenced by environmental factors, such as stress or deprivation, previously believed to have no effect on it—and that some of these influences may be heritable. “The DNA blueprint,” Carey writes,

isn’t a sufficient explanation for all the sometimes wonderful, sometimes awful, complexity of life. If the DNA sequence was all that mattered, identical twins would always be absolutely identical in every way. Babies born to malnourished mothers would gain weight as easily as other babies who had a healthier start in life.

That might seem a commonsensical view. But it runs counter to decades of scientific thought about the independence of the genetic program from environmental influence. What findings have made this shift in thinking possible?

In 1975, two English biologists, Robin Holliday and John Pugh, and an American biologist, Arthur Riggs, independently suggested that methylation, a chemical modification of DNA that is heritable and can be induced by environmental influences, had an important part in controlling gene expression. How it did this was not understood, but the idea that through methylation the environment could, in fact, alter not only gene expression but also the genetic program rapidly took root in the scientific community.

As scientists came to better understand the function of methylation in altering gene expression, they realized that extreme environmental stress—the results of which had earlier seemed self-explanatory—could have additional biological effects on the organisms that suffered it. Experiments with laboratory animals have now shown that these outcomes are based on the transmission of acquired changes in genetic function. Childhood abuse, trauma, famine, and ethnic prejudice may, it turns out, have long-term consequences for the functioning of our genes.

These effects arise from a newly recognized genetic mechanism called epigenesis, which enables the environment to make long-lasting changes in the way genes are expressed. Epigenesis does not change the information coded in the genes or a person’s genetic makeup—the genes themselves are not affected—but instead alters the manner in which they are “read” by blocking access to certain genes and preventing their expression. This mechanism can be a hidden cause of depression, anxiety, or paranoia. Perhaps most surprising of all, the alteration can in some cases be passed on to future generations that have never directly experienced the stresses that caused their forebears’ depression or ill health.


Numerous clinical studies have shown that childhood trauma—arising from parental death or divorce, neglect, violence, abuse, lack of nutrition or shelter, or other stressful circumstances—can give rise to a variety of health problems in adults: heart disease, cancer, mood and eating disorders, alcohol and drug abuse, infertility, suicidal behavior, learning deficits, and sleep disorders. Since the publication in 2003 of an influential paper by Rudolf Jaenisch and Adrian Bird, we have started to understand the genetic mechanisms that explain why this is the case. The body and the brain normally respond to danger and frightening experiences by releasing a stress-controlling hormone—a glucocorticoid, chiefly cortisol in humans. This hormone prepares us for various challenges by adjusting heart rate, energy production, and brain function; it binds to a protein called the glucocorticoid receptor in nerve cells of the brain.

Normally, this binding shuts off further glucocorticoid production, so that when one no longer perceives a danger, the stress response abates. However, as Gustavo Turecki and Michael Meaney note in a 2016 paper surveying more than a decade’s worth of findings about epigenetics, the gene for the receptor is inactive in people who have experienced childhood stress; as a result, they produce few receptors. Without receptors to bind to, glucocorticoids cannot shut off their own production, so the hormone keeps being released and the stress response continues, even after the threat has subsided. “The term for this is disruption of feedback inhibition,” Harris writes. It is as if “the body’s stress thermostat is broken. Instead of shutting off this supply of ‘heat’ when a certain point is reached, it just keeps on blasting cortisol through your system.”
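The logic of this “broken thermostat” can be sketched in a few lines of code. The simulation below is a minimal sketch, not a physiological model—the constants and receptor numbers are invented—but it shows why ample receptors let the hormone fall back once a threat passes, while scarce receptors leave it elevated.

```python
# Minimal sketch of feedback inhibition (illustrative constants, not physiology).

def simulate_stress_response(n_receptors, steps=50, threat_steps=10):
    """Track hormone level during a threat (first threat_steps) and after it ends."""
    hormone = 0.0
    levels = []
    for t in range(steps):
        stimulus = 1.0 if t < threat_steps else 0.0        # danger present early on
        feedback = n_receptors * hormone                   # bound hormone shuts off production
        production = max(0.0, stimulus + 0.5 - feedback)   # 0.5 = basal drive
        hormone += production - 0.2 * hormone              # 0.2 = clearance rate
        levels.append(hormone)
    return levels

healthy = simulate_stress_response(n_receptors=1.0)    # feedback intact
stressed = simulate_stress_response(n_receptors=0.05)  # few receptors: feedback disrupted

print(f"hormone long after the threat, intact feedback: {healthy[-1]:.2f}")
print(f"hormone long after the threat, weak feedback:   {stressed[-1]:.2f}")
```

In this toy model, the well-supplied system settles back near its low baseline once the threat ends; the receptor-poor system remains at a level nearly five times higher—cortisol “keeps on blasting,” in Harris’s phrase.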

It is now known that childhood stress can deactivate the receptor gene by an epigenetic mechanism—namely, by creating a physical barrier to the information for which the gene codes. What creates this barrier is DNA methylation, by which methyl groups known as methyl marks (composed of one carbon and three hydrogen atoms) are added to DNA. DNA methylation is long-lasting and keeps chromatin—the DNA-protein complex that makes up the chromosomes containing the genes—in a highly folded structure that blocks access to select genes by the gene expression machinery, effectively shutting the genes down. The long-term consequences can include chronic inflammation, diabetes, heart disease, obesity, schizophrenia, and major depressive disorder.
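The relation just described—more methyl marks, tighter chromatin folding, less expression—can be caricatured as a simple gate (the numbers are made up; this is not a biophysical model):

```python
# Toy gate (made-up numbers): methyl marks fold chromatin and block expression.

def expression_level(methylated_sites, total_sites=10, threshold=0.5):
    """Relative expression: 1.0 when the promoter is open, 0.0 once the
    density of methyl marks folds the chromatin shut."""
    density = methylated_sites / total_sites
    if density >= threshold:
        return 0.0                       # folded chromatin: machinery blocked
    return 1.0 - density / threshold     # partial access below the threshold

for marks in (0, 2, 4, 5, 8):
    print(f"{marks} methyl marks -> relative expression {expression_level(marks):.2f}")
```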

Such epigenetic effects have been demonstrated in experiments with laboratory animals. In a typical experiment, rat or mouse pups are subjected to early-life stress, such as repeated maternal separation. Their behavior as adults is then examined for evidence of depression, and their genomes are analyzed for epigenetic modifications. Likewise, pregnant rats or mice can be exposed to stress or nutritional deprivation, and their offspring examined for behavioral and epigenetic consequences.

Experiments like these have shown that even animals not directly exposed to traumatic circumstances—those still in the womb when their mothers were put under stress—can have blocked receptor genes. It is probably the transmission of glucocorticoids from mother to fetus via the placenta that alters the fetus in this way. In humans, prenatal stress affects each stage of the child’s maturation: in the fetus, a greater risk of preterm delivery, decreased birth weight, and miscarriage; in infancy, problems of temperament, attention, and mental development; in childhood, hyperactivity and emotional problems; and in adulthood, illnesses such as schizophrenia and depression.

What is the significance of these findings? Until the mid-1970s, no one suspected that the way in which DNA is “read” could be altered by environmental factors, or that the nervous systems of people who grew up in stress-free environments would develop differently from those of people who did not. One’s development, it was thought, was guided only by one’s genetic makeup. As a result of epigenesis, a child deprived of nourishment may continue to crave and consume large amounts of food as an adult, even when he or she is properly nourished, leading to obesity and diabetes. A child who loses a parent or is neglected or abused may acquire an epigenetic basis for anxiety and depression and possibly schizophrenia. Formerly, it had been widely believed that Darwinian evolutionary mechanisms—variation and natural selection—were the only means of introducing such long-lasting changes in brain function, a process that took place over generations. We now know that epigenetic mechanisms can do so as well, within the lifetime of a single person.


It is by now well established that people who suffer trauma directly during childhood or who experience their mother’s trauma indirectly as a fetus may have epigenetically based illnesses as adults. More controversial is whether epigenetic changes can be passed on from parent to child. Methyl marks are stable when DNA is not replicating, but when it replicates, the methyl marks must be introduced into the newly replicated DNA strands to be preserved in the new cells. Researchers agree that this takes place when cells of the body divide, a process called mitosis, but it is not yet fully established under which circumstances marks are preserved when cell division yields sperm and egg—a process called meiosis—or when mitotic divisions of the fertilized egg form the embryo. Transmission at these two latter steps would be necessary for epigenetic changes to be transmitted in full across generations.
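The replication question can be sketched schematically (the copy-and-erasure probabilities below are invented for illustration; real maintenance and reprogramming are far more intricate): marks reliably copied at mitosis persist in body cells, while erasure at meiosis and again in the early embryo forms the bottleneck for transmission across generations.

```python
import random

# Schematic sketch: do methyl marks survive DNA replication? (probabilities invented)

def divide(marks, copy_prob):
    """Keep each mark on the newly replicated DNA with probability copy_prob."""
    return [site for site in marks if random.random() < copy_prob]

random.seed(1)
marks = list(range(20))                  # 20 marked sites near a gene

soma = divide(marks, copy_prob=0.99)     # mitosis: maintenance is efficient
germ = divide(marks, copy_prob=0.2)      # meiosis: most marks are erased
embryo = divide(germ, copy_prob=0.2)     # early embryonic divisions: erased again

print(len(soma), "of 20 marks kept through mitosis (body cells)")
print(len(embryo), "of 20 marks reaching the next generation")
```

For an epigenetic change to be inherited in full, the marks would have to survive both of the last two steps—which is precisely what remains unsettled.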

The most revealing instances for studies of intergenerational transmission have been natural disasters, famines, and atrocities of war, during which large groups have undergone trauma at the same time. These studies have shown that when women are exposed to stress in the early stages of pregnancy, they give birth to children whose stress-response systems malfunction. Among the most widely studied of such traumatic events is the Dutch Hunger Winter of 1944–1945, when the Germans blocked food shipments to the parts of the Netherlands still under occupation. The Dutch resorted to eating tulip bulbs to ease their hunger. Women who were pregnant during this period, Carey notes, gave birth to a higher proportion of obese and schizophrenic children than one would normally expect. These children also exhibited epigenetic changes not observed in similar children, such as siblings, who had not experienced famine at the prenatal stage.

During the Great Chinese Famine (1958–1961), millions of people died, and children born to young women who experienced the famine were more likely to become schizophrenic, to have impaired cognitive function, and to suffer from diabetes and hypertension as adults. Similar studies of the 1932–1933 Ukrainian famine, in which many millions died, revealed an elevated risk of type II diabetes in people who were in the prenatal stage of development at the time. Although prenatal and early-childhood stress both induce epigenetic effects and adult illnesses, it is not known if the mechanism is the same in both cases.

Whether epigenetic effects of stress can be transmitted over generations needs more research, both in humans and in laboratory animals. But recent comprehensive studies by several groups using advanced genetic techniques have indicated that epigenetic modifications are not restricted to the glucocorticoid receptor gene. They are much more extensive than had been realized, and their consequences for our development, health, and behavior may also be great.

It is as though nature employs epigenesis to make long-lasting adjustments to an individual’s genetic program to suit his or her personal circumstances, much as in Lamarck’s notion of “striving for perfection.” In this view, the ill health arising from famine or other forms of chronic, extreme stress would constitute an epigenetic miscalculation on the part of the nervous system. Because the brain prepares us for adult adversity that matches the level of stress we suffer in early life, psychological disease and ill health persist even when we move to an environment with a lower stress level.

Once we recognize that there is an epigenetic basis for diseases caused by famine, economic deprivation, war-related trauma, and other forms of stress, it might be possible to treat some of them by reversing those epigenetic changes. “When we understand that the source of so many of our society’s problems is exposure to childhood adversity,” Harris writes,

the solutions are as simple as reducing the dose of adversity for kids and enhancing the ability of caregivers to be buffers. From there, we keep working our way up, translating that understanding into the creation of things like more effective educational curricula and the development of blood tests that identify biomarkers for toxic stress—things that will lead to a wide range of solutions and innovations, reducing harm bit by bit, and then leap by leap.

Epigenetics has also made clear that the stress caused by war, prejudice, poverty, and other forms of childhood adversity may have consequences both for the persons affected and for their future—unborn—children, not only for social and economic reasons but also for biological ones.