In early 2015 Alex Riley, then twenty-four years old, was working as a researcher at the Natural History Museum in London, using CT-scanning technology to study the teeth and skeletons of sharks and rays. A dinosaur fan since childhood, he was thrilled to be studying both living and extinct species in such detail, and he spent his lunch breaks wandering through the museum, contemplating the remains of giant sloths and prehistoric marine reptiles. But Riley felt unmoored: his parents had recently separated after three decades of marriage, and he had left a Ph.D. program in evolutionary development for a less certain professional path.

When Riley and his museum colleagues published a scientific paper, he didn’t feel worthy of being credited as an author, even though he had worked hard on it; instead, he believed he was a failure. He took a leave of absence that became permanent, and toward the end of the year, he sought treatment for depression. He joined a weekly cognitive behavioral therapy (CBT) group at a local psychiatric hospital, during which a therapist led participants through exercises designed to identify and disrupt their negative thoughts. While Riley found the sessions helpful at times, they brought him no sustained relief, and after four weeks he decided to begin taking citalopram, an antidepressant prescribed by his doctor. The initial side effects made him feel as if he had the flu, and even after his body became accustomed to the drug, he suffered from persistent nausea. When he realized that his depression was worsening, he stopped taking the medication. The withdrawal symptoms left him rocking on the floor of his bedroom with his head in his hands.

“One of the most intolerable aspects of depression,” Daphne Merkin observes in her memoir This Close to Happy (2017), “is the way it insinuates itself everywhere in your life, casting a pall not only over the present but the past and the future as well, suggesting nothing but its own inevitability.” Its general shape and form have been familiar to students of the mind and body for millennia: a condition well beyond sadness, marked by sluggishness, overwhelming feelings of guilt and self-loathing, and, in severe cases, suicidal thoughts and behavior. But as Riley emphasizes in A Cure for Darkness: The Story of Depression and How We Treat It, individual experiences of depression are as diverse as the animal kingdom he once studied.

“Depression is a product of upbringing, trauma, financial uncertainty, loneliness, social bonds, diet, behavior, sedentary lifestyles, neurotransmitters, and genetics that cannot be encapsulated in a word,” he writes.

It can be mild or severe, recurrent or unremitting. It can emerge once and never appear again or it can cast a dark shadow throughout adulthood. Some people sleep too much, others suffer from insomnia. Some eat too much while others shrink toward starvation. Depression can emerge alongside cancer, heart disease, diabetes, and dementia and can make these diseases more lethal; it is a catalyst of mortality.

It is also possible to be depressed and not know it: until a postpartum crisis forced me to find treatment, I assumed that the baseless loneliness and self-contempt that had hounded me since adolescence were universal features of the human experience.

In the summer of 2017, Riley began to investigate the history of his condition. He wanted to understand how CBT and antidepressant medications had become standard treatments for depression in much of the world, and whether there were alternatives that might prove more effective for him. As he soon discovered, the meaning of “treating” or “curing” depression varies widely by patient, practitioner, and era. How many symptoms can be banished and for how long? Which side effects are tolerable and which are worse than the disorder? It is not uncommon for sufferers to cycle through different treatments and medications for years, enduring unpredictable physical and emotional reactions and repeated disappointment. For Riley, as for me and countless others, depression is a condition that can be fended off but never left behind.

Donald Antrim, in his recent memoir One Friday in April, takes exception to the term “depression” altogether, arguing—as William Styron did in Darkness Visible (1990)—that it is inadequate to describe the condition he lives with. “A depression is a concavity, a sloping downward and a return,” Antrim writes, whereas his own experience is one of extended sickness, only temporarily relieved by periods of recovery. “When telling the story of my illness, I try not to speak about depression. I prefer to call it suicide.”1

There is no known cure for depression; Riley’s title is ironic. In his book he surveys the development of medical treatments for the condition in the twentieth and early twenty-first centuries, tracking how each approach was conceived, deployed, and—in most cases—discarded. His perspective is that of a patient and a journalist, not a medical expert, but his experience as a researcher makes him alert to the human side of science and skeptical of its fads. He is particularly interested in reducing the stigma that accompanies many treatments even today, including talk therapy, antidepressants, and electroconvulsive therapy (ECT)—and in efforts to expand the reach of treatments to people without access to mental health care.


Riley begins his story in the late nineteenth century, with the philosophical and professional cleavage that has defined the treatment of depression ever since. On one side was Sigmund Freud, who saw depression as the result of childhood trauma and maintained that it could be remedied only through psychoanalysis. On the other was Emil Kraepelin, the great classifier of mental disorders, who saw depression as a primarily physical ailment, to be treated with medical intervention. Freud and Kraepelin never met, Riley tells us, but they were born just three months apart, in 1856, and both began their scientific careers as anatomists.

In 1885, while Freud was studying in Paris with Jean-Martin Charcot, a physician who used hypnosis to treat his mentally disturbed patients, he developed an interest in the unconscious mind and became convinced that it governed much of human behavior. His theories soon dominated the nascent field of psychoanalysis. Karl Abraham, an admirer and frequent correspondent of Freud’s, developed the first psychoanalytic theory of depression in 1911, attributing the condition to a broken maternal bond and a subsequent hostility toward humanity, including the self. Both men argued that psychoanalysis could relieve depression by identifying a patient’s early, often forgotten tragedies and connecting them with his or her current state; Riley describes them as “paleontologists of the mind, digging through the unconscious strata of their depressed patients, hoping to uncover the old bones of a pathological monster.” Freud’s 1917 essay “Mourning and Melancholia,” which characterizes depression as a response to personal loss, is heavily indebted to Abraham’s ideas.

Kraepelin, meanwhile, came to believe that the symptoms of mental disorders, depression included, could be traced to anatomical aberrations in the brain. Instead of plumbing the unconscious for causes, Kraepelin began, as a professor at the University of Heidelberg in the 1890s, to keep detailed records of individual patients’ symptoms, tracking their conditions over time. In his thousands of patient files, he discerned two broad categories: “dementia praecox,” later known as schizophrenia, and “manic-depressive insanity,” which has since been subdivided into depression, mania, and bipolar disorder. (The distinction he drew between psychotic and mood disorders endures to this day.) While Kraepelin saw that patients could spontaneously recover from depression and other mood disorders, he maintained that, like psychotic disorders, they were primarily biological—not psychological—and as such would not be curable until major advances in microscopes and other technologies allowed researchers to study the physical brain.

And so the line was drawn between biological and psychological approaches to treatment—a division that still affects the way depression is treated. Psychoanalysis, Kraepelin thought, was shamefully unscientific, guilty of “the representation of arbitrary assumptions and conjectures as scientific facts” and “generalization beyond measure from single observations.” Freud, for his part, saw Kraepelin as the leader of an enemy faction and denounced his pessimism about prospective cures, accusing him of offering patients “a condemnation instead of an explanation.”

Among the dangers faced by those living with depression are specialists who fail to recognize the complexity of the condition. As Riley’s book makes clear, the generations of practitioners after Freud and Kraepelin included many who dogmatically swore allegiance to a single approach—whether biological or psychological—and refused to consider that different patients might benefit from different remedies. Desperate patients and families sometimes agreed to invasive treatments that not only failed to relieve depression but debilitated or even killed the sufferer.

In the early twentieth century, adherents of the biological theory of depression tried treating patients with nitrous oxide, opium, testosterone, X-rays, and even tooth extraction, all without success. Then, encouraged by reports of individuals who displayed dramatic changes in temperament and behavior after their frontal lobes were damaged in accidents or by surgery, some practitioners began trying to physically excise depression and other mental afflictions from the brain. In 1935 the Portuguese neurologist Egas Moniz supervised the first prefrontal leucotomy, an operation that cut the connection between the frontal lobes and the rest of the brain. Riley writes that while one early examiner reported that leucotomy patients “were severely ‘diminished’ and had exhibited a ‘degradation of personality’ after the surgery,” Moniz claimed that it was a successful “clinical cure” and had alleviated symptoms of depression and other disorders in more than half of his initial patients.

Walter Freeman, an American psychiatrist, took Moniz’s methods several steps further by pioneering the frontal lobotomy, which not only severed the frontal lobes from the rest of the brain but cut out parts of the lobes themselves. Many of his fellow psychiatrists objected to the practice, but the idea that a malfunctioning brain could be fixed as easily as an automobile was popular. “The brain has ceased to be sacred,” The Saturday Evening Post announced in a celebratory article in 1941. The same year, Freeman and a collaborator performed a lobotomy on twenty-three-year-old Rosemary Kennedy, John F. Kennedy’s younger sister, after suggesting to her family that the operation could relieve her intense mood swings. (Not until the 1970s did informed-consent laws and regulations begin to give patients control of their own treatment.) The botched operation reduced her intellectual capacity to that of a two-year-old, and she was institutionalized for the rest of her life.


Undeterred by such tragedies, Freeman introduced a method that he insisted needed no specialized training to perform: he hammered a metal ice pick into the brain cavity through the back of an eye socket, then used it to crudely brutalize the brain tissue. “Why not use a shotgun? It would be quicker!” an outraged colleague wrote to Freeman. Of the estimated 50,000 patients who were lobotomized in the US between 1949 and 1952, about 20 percent were subjected to these transorbital or “icepick” lobotomies. While reliable data are scarce, a British study of 10,000 standard lobotomies performed between 1943 and 1954 found that 6 percent of the operations had been fatal.

Lobotomies continued for as long as they did not only because of Freeman’s zeal but because the operation, in some cases, delivered a dismal kind of relief. The British study, published in 1961, found that 70 percent of patients reported some improvement, and 18 percent no longer required institutionalization. Though the patients who survived lobotomies were profoundly altered, those who had been tortured by delusions or prone to violence before the operation often emerged calmer and more compliant, sometimes enough so that they could live at home with their families. Until other treatments emerged, the lobotomy could look like the best of bad options.

Electroconvulsive therapy was one of the treatments that replaced lobotomies. As Riley notes, philosophers and physicians had observed since at least the 1700s that epileptic seizures seemed to have a therapeutic effect on mentally ill patients. In 1937 the Italian psychiatrist Ugo Cerletti learned that electrical shocks could induce convulsions in pigs, and he boldly applied this method to humans, finding that after induced seizures, patients reported shorter or less frequent depressive episodes. Cerletti’s ECT encountered resistance when it reached the United States, not only because it reminded people of the electric chair but because its side effects at the time included memory loss and broken bones resulting from powerful seizures. Yet it appeared to be remarkably effective in treating certain forms of depression, particularly those that included delusions or psychotic episodes. Between the mid-1940s and 1960s it was a staple of psychiatric treatment.

Today ECT is considered by many specialists to be among the safest and most effective treatments for severe forms of depression. It now employs briefer, more targeted electrical pulses and can succeed in cases where antidepressants do not. Antrim, who experienced significant relief after first undergoing ECT in the 2000s, writes that despite serious initial doubts, he came to see it as “a powerful measure against suicide”:

After ECT, the feeling in my body of immense weight went away; I felt a kind of physical lightness…. After months of waiting to get well, I regained my sense of time passing. These days, I think of ECT as clean power, good electricity added to a wet, saline medium in which electrical signaling has become chaotic and mistimed.

When ECT was falling out of favor in the 1960s, the first generation of antidepressants was on the rise. Pharmaceutical researchers, noting that tranquilizers decreased levels of neurotransmitters such as serotonin and norepinephrine, surmised that an increase in the activity of these molecules would reverse the effects of depression. Monoamine oxidase (MAO) inhibitors, which prevent the breakdown of serotonin and norepinephrine and thereby increase their levels in the body, became available in the late 1950s; tricyclic antidepressants also debuted in the late 1950s, followed by selective serotonin reuptake inhibitors (SSRIs) in the late 1980s.

Antidepressants have benefited from a succession of evangelists, the first of whom was Nathan Kline, the psychiatrist who wrote From Sad to Glad (1974)—at least one edition of which included the words “Depression: You Can Conquer It Without Analysis!” on the cover. Kline energetically popularized the theory that depression was caused by a chemical imbalance in the brain. As Riley points out, while this idea “isn’t categorically wrong, it is still a far cry from being right.” While all three classes of drugs can change brain chemistry within hours, researchers don’t understand why antidepressants usually take weeks to begin relieving symptoms. And though an increase in serotonin and norepinephrine does frequently lead to a decrease in depression symptoms, it doesn’t necessarily follow that serotonin scarcity is a root cause of the disorder.

Riley moves quickly through the long public and professional debates over the widespread use of antidepressants and their short- and long-term effects, noting that studies of antidepressant effectiveness have often yielded contradictory or inconclusive results. Yet as controversial and imperfect as these medications are, they remain among the best tools we have. Riley, who after experimentation found that the SSRI sertraline lifted his depression, describes himself as “overwhelmed with relief” by the existence of antidepressants. Still, he reflects, our dominant explanation for depression is in some ways simply a modification of that proposed by Hippocrates in the fifth century BCE. Hippocrates blamed depression on an excess of black bile; we blame it on a shortage of neurotransmitters.

Like many people with depression, Riley has also been willing to engage in other forms of low-risk experimentation. During his reporting on current research he adopts a Mediterranean diet and a moderate jogging routine. Both seem to reduce his symptoms, and though he acknowledges that the causal connections are unproven, he comes to think of both as “antidepressants that I prescribe.” He also explores the therapeutic potential of psychedelics, and while his own experiment with psilocybin mushrooms is uneventful, he notes that many present-day psychedelic therapies combine elements drawn from the past century of both psychological and biological treatments, a marriage of chemistry and counseling that creates another opportunity, he writes, to “push the two fields of psychiatry into closer union.”

As for that long-running division between biological and psychological treatments of depression, Riley notes that the successors of Freud and Abraham—in their work with patients using psychoanalysis to excavate the early experiences thought to be the primary cause of mental suffering—didn’t necessarily oppose medical interventions for depression, but they typically resorted to them only after the long process of analysis failed. Even today there is sometimes a bias against pharmacological treatments—a perpetuation of an old stigma that can get in the way of what could be a more collaborative approach.

In tracing the many forms of talk therapy that grew out of psychoanalysis, Riley discusses the psychiatrist Aaron Beck, who in the late 1950s proposed a new theory: depression was a product not, as Freud had thought, of “anger turned inward,” but rather of a negative view of one’s current circumstances. Beck’s “cognitive” approach to therapy proposed to help patients recast their perceptions, using one-on-one or group sessions with therapists to guide them toward more forgiving assessments of themselves and others.

This method was initially challenged both by psychiatrists who preferred to rely on their growing pharmacopeia and by behavioral psychologists such as Joseph Wolpe and B.F. Skinner, who believed that human behavior was essentially a collection of learned reactions to external stimuli. Despite this early competition, the cognitive and behavioral schools of psychology eventually merged to produce CBT, which is now so common that it is what most people mean by “therapy.” (Beck, who died last November at the age of one hundred, once joked that the lead character of The Sopranos, the mob boss and sometime psychoanalysis patient Tony Soprano, could have been cured of his panic attacks with two sessions of CBT.)

Among the people Riley interviews is Myrna Weissman, a professor of psychiatry and epidemiology at Columbia University, who with Gerald Klerman in the 1960s and 1970s pioneered the form of talk therapy known as interpersonal psychotherapy. Similar to cognitive therapy, it considers the quality of patients’ social relationships as well as their perceptions of themselves and others. “We were friends,” she says of Beck, whose photograph is displayed in her office. “We were not competing.” Her expression of collegiality feels like a radical departure from a hundred years of internecine rivalries.

“There shouldn’t be one psychotherapy,” Weissman tells Riley. “I think that people who say, ‘This is the psychotherapy,’ are doing a disservice to patients. It’s like saying there should be only Prozac.”

In the 1980s Weissman realized that while psychiatrists had often assumed that depression was a first-world problem—a side effect of modern urban life—there were few if any data on how common it was worldwide, or who was most likely to suffer from it. She and other researchers soon confirmed that depression was at least as common in poorer communities and nations, and that in some countries Western investigators hadn’t recognized it simply because it traveled under different names. Riley’s discussion of this research and its consequences is one of the most moving sections of his book.

Speakers of Luganda, the most common indigenous language in Uganda, don’t have a word for “depression.” They use the terms yo’kwekyawa and okwekubazida, which roughly translate as “self-loathing” and “self-pity” and describe two distinct conditions; the former, which can include thoughts of suicide, is considered more severe. In Zimbabwe in the 1990s, researchers learned that the local Shona language had one word for everyday sadness (suwa) and another for a persistent, ruminative state that fit the clinical description of depression. This term, kufungisisa, which literally translates to “thinking too much,” unlocked communication between practitioners and patients.

A friendship bench, Masvingo, Zimbabwe, January 2020 (photograph by Brent Stirton/Getty Images)

In the early 2000s the Zimbabwean psychiatrist Dixon Chibanda recognized that his rural patients, many of whom were severely stressed by poverty and the multifold impacts of the HIV epidemic, were dying from a lack of mental health care. He recruited a corps of rural community members, predominantly grandmothers, and trained them to conduct informal therapy sessions with their neighbors on open-air “friendship benches.” In clinical trials, the grandmothers and their benches proved to be so successful in relieving the most incapacitating symptoms of depression that the approach has since spread to Kenya, Botswana, the Caribbean, New York City, and elsewhere.

In the final section of his book Riley looks at emerging and reemerging treatments for depression, many of which combine biological and psychological strategies. He discusses the work of the neurologist Helen Mayberg, whose studies in the early 2000s revealed that brain activity patterns can help predict whether depression sufferers are most likely to respond to talk therapy, antidepressants, or more aggressive treatments such as ECT. He also speaks with researchers who, troubled by psychiatry’s preoccupation with low serotonin, are exploring the role of the microbiome and the immune system in both causing and relieving depressive episodes. (So far there is no conclusive evidence that the microbiome affects brain chemistry.) “Over the last decade,” Riley writes, “there has been a trend in psychiatry to do away with diagnoses” altogether. Will we, he wonders, continue to use the word “depression”?

A Cure for Darkness, with its combination of memoir, history, and reportage, calls to mind Andrew Solomon’s The Noonday Demon: An Atlas of Depression (2001). In that sprawling, wise, and empathetic book, which includes interviews with doctors, scientists, policymakers, and others, Solomon powerfully documents the individual experiences of an expansive range of people suffering from depression, including himself. These accounts are revealing and validating, showing both the isolation felt by people with depression and the global pervasiveness of their condition.

As Solomon puts it, depression is not sadness but “the aloneness within us made manifest.” His mordant comment to a friend as he started to descend into a breakdown, “I’m afraid of lamb chops again,” is still my warning to family members when I sense my own depression coming on. Like Solomon, Riley has responded to his depression by turning outward, seeking to place his experience and its purported remedies alongside the experiences of fellow sufferers past and present. This departure from the interior narrative typical of depression memoirs, it seems to me, has therapeutic value as well as formal significance. One of the great strengths of Riley’s book is that, like Solomon’s, it refutes the perception of aloneness that is perhaps the most distinguishing characteristic of depression, and among its most terrifying.

Depression remains a condition that can be treated but not cured—and its toll is increasing. According to Riley’s statistics, an estimated 322 million people worldwide live with depression today; of those with untreated depression, roughly 15 percent die by suicide. An analysis published in The Lancet last October shows that the pandemic has led to a nearly 30 percent jump in the prevalence of major depression. Two decades after the first edition of Solomon’s book, and seven years since its last update,2 Riley cannot report a cure for depression, but he does show that there have been some modest advances in how we think about and treat it—including an increased awareness of the value of collaborative approaches. “We can’t kill depression,” Riley writes. “But, with treatment, we can stop it from killing us.”