
Private Collection/Peter Ertl/Albertina, Vienna

Heinrich Kühn: Hans with Bureau, 1905; from Heinrich Kühn: The Perfect Photograph, the catalog of a recent exhibition organized by the Albertina, Vienna. The catalog, now out of print, was edited by Monika Faber and Astrid Mahler and published by Hatje Cantz.

In 1993, approaching my sixtieth birthday, I started to experience a curious phenomenon—the spontaneous, unsolicited rising of early memories into my mind, memories that had lain dormant for upward of fifty years. Not merely memories, but frames of mind, thoughts, atmospheres, and passions associated with them—memories, especially, of my boyhood in London before World War II. Moved by these, I wrote two short memoirs, one about the grand science museums in South Kensington, which were so much more important than school to me when I was growing up; the other about Humphry Davy, an early-nineteenth-century chemist who had been a hero of mine in those far-off days, and whose vividly described experiments excited me and inspired me to emulation. I think a more general autobiographical impulse was stimulated, rather than sated, by these brief writings, and late in 1997, I launched on a three-year project of writing a memoir of my boyhood, which I published in 2001 as Uncle Tungsten.1

I expected some deficiencies of memory—partly because the events I was writing about had occurred fifty or more years earlier, and most of those who might have shared their memories, or checked my facts, were now dead; and partly because, in writing about the first fifteen years of my life, I could not call on the letters and notebooks that I started to keep, assiduously, from the age of eighteen or so.

I accepted that I must have forgotten or lost a great deal, but assumed that the memories I did have—especially those that were very vivid, concrete, and circumstantial—were essentially valid and reliable; and it was a shock to me when I found that some of them were not.

A striking example of this, the first that came to my notice, arose in relation to the two bomb incidents that I described in Uncle Tungsten, both of which occurred in the winter of 1940–1941, when London was bombarded in the Blitz:

One night, a thousand-pound bomb fell into the garden next to ours, but fortunately it failed to explode. All of us, the entire street, it seemed, crept away that night (my family to a cousin’s flat)—many of us in our pajamas—walking as softly as we could (might vibration set the thing off?). The streets were pitch dark, for the blackout was in force, and we all carried electric torches dimmed with red crêpe paper. We had no idea if our houses would still be standing in the morning.

On another occasion, an incendiary bomb, a thermite bomb, fell behind our house and burned with a terrible, white-hot heat. My father had a stirrup pump, and my brothers carried pails of water to him, but water seemed useless against this infernal fire—indeed, made it burn even more furiously. There was a vicious hissing and sputtering when the water hit the white-hot metal, and meanwhile the bomb was melting its own casing and throwing blobs and jets of molten metal in all directions.

A few months after the book was published, I spoke of these bombing incidents to my brother Michael. Michael is five years my senior, and had been with me at Braefield, the boarding school to which we had been evacuated at the beginning of the war (and in which I was to spend four miserable years, beset by bullying schoolmates and a sadistic headmaster). My brother immediately confirmed the first bombing incident, saying, “I remember it exactly as you described it.” But regarding the second bombing, he said, “You never saw it. You weren’t there.”

I was staggered by Michael’s words. How could he dispute a memory I would not hesitate to swear on in a court of law, and had never doubted as real? “What do you mean?” I objected. “I can see the bomb in my mind’s eye now, Pa with his pump, and Marcus and David with their buckets of water. How could I see it so clearly if I wasn’t there?”

“You never saw it,” Michael repeated. “We were both away at Braefield at the time. But David [our older brother] wrote us a letter about it. A very vivid, dramatic letter. You were enthralled by it.” Clearly, I had not only been enthralled, but must have constructed the scene in my mind, from David’s words, and then appropriated it, and taken it for a memory of my own.

After Michael said this, I tried to compare the two memories—the primary one, on which the direct stamp of experience was not in doubt, with the constructed, or secondary, one. With the first incident, I could feel myself into the body of the little boy, shivering in his thin pajamas—it was December, and I was terrified—and because of my shortness compared to the big adults all around me, I had to crane my head upward to see their faces.


The second image, of the thermite bomb, was equally clear, it seemed to me—very vivid, detailed, and concrete. I tried to persuade myself that it had a different quality from the first, that it bore evidence of its appropriation from someone else’s experience, and its translation from verbal description into image. But although I now know, intellectually, that this memory was “false,” it still seems to me as real, as intensely my own, as before. Had it, I wondered, become as real, as personal, as strongly embedded in my psyche (and, presumably, my nervous system) as if it had been a genuine primary memory? Would psychoanalysis, or, for that matter, brain imaging, be able to tell the difference?

My “false” bomb experience was closely akin to the true one, and it could easily have been my own experience too. It was plausible that I might have been there; had it not been so, perhaps the description of it in my brother’s letter would not have affected me so. All of us “transfer” experiences to some extent, and at times we are not sure whether an experience was something we were told or read about, even dreamed about, or something that actually happened to us.

This is especially apt to happen with very early experiences, with one’s so-called “earliest memories.” I have a vivid memory from about the age of two of pulling the tail of our chow, Peter, while he was gnawing a bone under the hall table, of Peter leaping up and biting me in the cheek, and of my being carried, howling, into my father’s surgery in the house, where a couple of stitches were put in my cheek.

There is an objective reality here: I was bitten on the cheek by Peter when I was two, and still bear the scar of this. But do I actually remember it, or was I told about it, subsequently constructing a “memory” that became more and more firmly fixed in my mind by repetition? The memory seems intensely real to me, and the fear associated with it is certainly real, for I developed a fear of large animals after this incident—Peter was almost as large as I was at two—a fear that they would suddenly attack or bite me.

Daniel Schacter has written extensively on distortions of memory and the “source confusions” that go with them, and in his book Searching for Memory recounts a well-known story about Ronald Reagan:

In the 1980 presidential campaign, Ronald Reagan repeatedly told a heartbreaking story of a World War II bomber pilot who ordered his crew to bail out after his plane had been seriously damaged by an enemy hit. His young belly gunner was wounded so seriously that he was unable to evacuate the bomber. Reagan could barely hold back his tears as he uttered the pilot’s heroic response: “Never mind. We’ll ride it down together.” The press soon realized that this story was an almost exact duplicate of a scene in the 1944 film A Wing and a Prayer. Reagan had apparently retained the facts but forgotten their source.

Reagan was a vigorous sixty-nine-year-old at the time, was to be president for eight years, and only developed unmistakable dementia in the 1990s. But he had been given to acting and make-believe throughout his life, and he had displayed a vein of romantic fantasy and histrionism since he was young. Reagan was not simulating emotion when he recounted this story—his story, his reality, as he believed it to be—and had he taken a lie detector test (functional brain imaging had not yet been invented), there would have been none of the telltale reactions that go with conscious falsehood.

It is startling to realize that some of our most cherished memories may never have happened—or may have happened to someone else. I suspect that many of my enthusiasms and impulses, which seem entirely my own, have arisen from others’ suggestions, which have powerfully influenced me, consciously or unconsciously, and then been forgotten. In the same way, though I often give lectures on similar topics, I can never remember, for better or worse, exactly what I said on previous occasions; nor can I bear to look through my earlier notes. Losing conscious memory of what I have said before, and having no text, I discover my themes afresh each time, and they often seem to me brand-new. This type of forgetting may be necessary for a creative or healthy cryptomnesia, one that allows old thoughts to be reassembled, retranscribed, recategorized, given new and fresh implications.


Sometimes these forgettings extend to autoplagiarism, where I find myself reproducing entire phrases or sentences as if new, and this may be compounded, sometimes, by a genuine forgetfulness. Looking back through my old notebooks, I find that many of the thoughts sketched in them are forgotten for years, and then revived and reworked as new. I suspect that such forgettings occur for everyone, and they may be especially common in those who write or paint or compose, for creativity may require such forgettings, in order that one’s memories and ideas can be born again and seen in new contexts and perspectives.

Webster’s defines “plagiarize” as “to steal and pass off (the ideas or words of another) as one’s own: use (another’s production) without crediting the source … to commit literary theft: present as new and original an idea or product derived from an existing source.” There is a considerable overlap between this definition and that of “cryptomnesia.” The essential difference is that plagiarism, as commonly understood and reprobated, is conscious and intentional, whereas cryptomnesia is neither. Perhaps the term “cryptomnesia” needs to be better known, for though one may speak of “unconscious plagiarism,” the very word “plagiarism” is so morally charged, so suggestive of crime and deceit, that it retains a sting even if it is “unconscious.”

In 1970, George Harrison composed an enormously successful song, “My Sweet Lord,” which turned out to have great similarities to a song by Ronald Mack (“He’s So Fine”), recorded eight years earlier. When the matter went to trial, the judge found Harrison guilty of plagiarism, but showed psychological insight and sympathy in his summary of the case. He concluded:

Did Harrison deliberately use the music of “He’s So Fine”? I do not believe he did so deliberately. Nevertheless … this is, under the law, infringement of copyright, and is no less so even though subconsciously accomplished.


Private Collection

Georges Seurat: Night Stroll, 1887–1888

Helen Keller was accused of plagiarism when she was only twelve.2 Though deaf and blind from an early age, and indeed languageless before she met Annie Sullivan at the age of six, she became a prolific writer once she learned finger spelling and Braille. As a girl, she had written, among other things, a story called “The Frost King,” which she gave to a friend as a birthday gift. When the story found its way into print in a magazine, readers soon realized that it bore great similarities to “The Frost Fairies,” a children’s short story by Margaret Canby. Admiration for Keller now turned to accusation: she was charged with plagiarism and deliberate falsehood, even though she said that she had no recollection of reading Canby’s story, and thought she had made it up herself. The young Helen was subjected to a ruthless inquisition, which left its mark on her for the rest of her life.

But she had defenders, too, including the plagiarized Margaret Canby, who was amazed that a story spelled into Helen’s hand three years before could be remembered or reconstructed by her in such detail. “What a wonderfully active and retentive mind that gifted child must have!” Canby wrote. Alexander Graham Bell came to her defense, saying, “Our most original compositions are composed exclusively of expressions derived from others.”3

Indeed, Keller’s remarkable imagination and mind could not have developed and become as rich as they were without appropriating the language of others. Perhaps in a general sense we are all dependent on the thoughts and images of others.

Keller herself said of such appropriations that they were most apt to occur when books were spelled into her hands, their words passively received. Sometimes when this was done, she said, she could not identify or remember the source, or even, sometimes, whether it came from outside her or not. Such confusion rarely occurred if she read actively, using Braille, moving her finger across the pages.

The question of Coleridge’s plagiarisms, paraphrases, cryptomnesias, or borrowings has intrigued scholars and biographers for nearly two centuries, and is of special interest in view of his prodigious powers of memory, his imaginative genius, and his complex, multiform, sometimes tormented sense of identity. No one has described this more beautifully than Richard Holmes in his two-volume biography.

Coleridge was a voracious, omnivorous reader who seemed to retain all that he read. There are descriptions of him as a student reading The Times in a casual fashion, then being able to reproduce the entire paper, including its advertisements, verbatim. “In the youthful Coleridge,” writes Holmes,

this is really part of his gift: an enormous reading capacity, a retentive memory, a talker’s talent for conjuring and orchestrating other people’s ideas, and the natural instinct of a lecturer and preacher to harvest materials wherever he found them.

Literary borrowing was commonplace in the seventeenth century—Shakespeare borrowed freely from many of his contemporaries, as did Milton.4 Friendly borrowing remained common in the eighteenth century, and Coleridge, Wordsworth, and Southey all borrowed from one another, sometimes even, according to Holmes, publishing work under each other’s names.

But what was common, natural, and playful in Coleridge’s youth gradually took on a more disquieting form, especially in relation to the German philosophers (Friedrich Schelling above all) whom he “discovered,” venerated, translated, and finally came to use in the most extraordinary way. Whole pages of Coleridge’s Biographia Literaria consist of unacknowledged, verbatim passages from Schelling. While this unconcealed and damaging behavior has been readily (and reductively) categorized as “literary kleptomania,” what actually went on is complex and mysterious, as Holmes explores in the second volume of his biography, where he sees the most flagrant of Coleridge’s plagiarisms as occurring at a devastatingly difficult period of his life, when he had been abandoned by Wordsworth, was disabled by profound anxiety and intellectual self-doubt, and more deeply addicted to opium than ever. At this time, Holmes writes, “his German authors gave him support and comfort: in a metaphor he often used himself, he twined round them like ivy round an oak.”

Earlier, as Holmes describes, Coleridge had found another extraordinary affinity, for the German writer Jean-Paul Richter—an affinity that led him to translate and transcribe Richter’s writings, and then to take off from them, elaborating them in his own way and then, in his notebooks, conversing and communing with Richter. At times, the voices of the two men became so intermingled as to be hardly distinguishable from one another.

In 1996, I read a review of a new play, Molly Sweeney, by Brian Friel. It was, I read, about a massage therapist, born blind, who is given sight by an operation in middle life, but then finds this unprecedented ability to see profoundly confusing. Molly is unable to recognize anybody or anything, can make nothing of what she sees—and ultimately, gratefully, returns to her original state of blindness. I was startled by this, because I myself had written and published in The New Yorker, just three years earlier, the case history of a patient with an exceedingly similar story (“To See and Not See”). When I obtained a copy of Friel’s new play, I was not surprised to find it brilliant and original in conception and style, but I was surprised to find, over and above the thematic similarities, entire phrases and sentences from my own case history.

I wrote to Friel, and he responded that he had indeed read my piece, and had been much moved by it (the more so as he had feared he was losing his own vision). He had also read many other case histories of the restoration of vision. Friel concluded that he must have inadvertently used some phrases from my account, but that this was completely unconscious, and agreed to add to Molly Sweeney an acknowledgment of the sources of his inspiration.

Freud was fascinated by the slippages and errors of memory that occur in the course of daily life, and their relation to emotion, especially unconscious emotion; but he was also forced to consider the much grosser distortions of memory that some of his patients showed, especially when they gave him accounts of having been sexually seduced or abused in childhood. He at first took all these accounts literally, but eventually, when there seemed little evidence or plausibility in several cases, he started to wonder whether such recollections had been distorted by fantasy, and whether some, indeed, might be total fabulations, constructed unconsciously, but so convincingly that the patients themselves believed in them absolutely. The stories that patients told, and had told to themselves, could have a very powerful effect on their lives, and it seemed to Freud that their psychological reality might be the same whether they came from actual experience or from fantasy.

In our present age, descriptions and accusations of childhood abuse have reached almost epidemic proportions. Much is made of so-called recovered memories—memories of experiences so traumatic as to be defensively repressed, and then, with therapy, released from repression. Particularly dark and fantastic forms of this include descriptions of satanic rituals of one sort and another, accompanied often by coercive sexual practices. Lives, and families, have been ruined by such accusations. But it has been shown, in at least some cases, that such descriptions can be insinuated or planted by others. The frequent combination, here, of a suggestible witness (often a child) with an authority figure (perhaps a therapist, a teacher, a social worker, or an investigator) can be particularly powerful.

From the Inquisition and the Salem witch trials to the Soviet trials of the 1930s and Abu Ghraib, varieties of “extreme interrogation,” or outright physical and mental torture, have been used to extract political or religious “confessions.” While such interrogation may be intended to extract information in the first place, its deeper intentions may be to brainwash, to effect a genuine change of mind, to fill it with implanted, self-inculpatory memories, and in this it may be frighteningly successful.5

But it may not take coercive suggestion to affect a person’s memories. The testimony of eyewitnesses is notoriously subject to suggestion and to error, frequently with dire effects on the wrongfully accused.6 With the advent of DNA testing, it is now possible to find, in many cases, an objective corroboration or refutation of such testimony, and Schacter notes that “a recent analysis of forty cases in which DNA evidence established the innocence of wrongly imprisoned individuals revealed that thirty-six of them (90 percent) involved mistaken eyewitness identification.”

If the last thirty years have seen a surge or resurgence of ambiguous memory and identity syndromes, they have also led to important research—forensic, theoretical, and experimental—on the malleability of memory. Elizabeth Loftus, the psychologist and memory researcher, has documented a disquieting success in implanting false memories by simply suggesting to a subject that he has experienced a fictitious event. Such pseudo-events, invented by psychologists, may vary from mildly upsetting or comic incidents (that, for example, as a child, one was lost in a mall) to graver ones (that one was attacked by an animal, or assaulted by another child). After initial skepticism (“I was never lost in a shopping mall”), and then uncertainty, the subject may move to a conviction so profound that he will continue to insist on the truth of the implanted memory, even after the experimenter confesses that it never happened in the first place.

What is clear in all these cases—whether of imagined or real abuse in childhood, of genuine or experimentally implanted memories, of misled witnesses and brainwashed prisoners, of unconscious plagiarism, and of the false memories we probably all have based on misattribution or source confusion—is that, in the absence of outside confirmation, there is no easy way of distinguishing a genuine memory or inspiration, felt as such, from those that have been borrowed or suggested, between what the psychoanalyst Donald Spence calls “historical truth” and “narrative truth.”

Even if the underlying mechanism of a false memory is exposed, as I was able to do, with my brother’s help, in the incendiary bomb incident (or as Loftus would do when she confessed to her subjects that their memories were implanted), this may not alter the sense of actual lived experience or reality that such memories have. Nor, for that matter, may the obvious contradictions or absurdity of certain memories alter the sense of conviction or belief. For the most part the people who claim to be abducted by aliens are not lying when they speak of how they were taken into alien spaceships, any more than they are conscious of having invented a story—some truly believe that this is what happened.

Once such a story or memory is constructed, accompanied by vivid sensory imagery and strong emotion, there may be no inner, psychological way of distinguishing true from false—or any outer, neurological way. The physiological correlates of such memory can be examined using functional brain imaging, and these images show that vivid memories produce widespread activation in the brain involving sensory areas, emotional (limbic) areas, and executive (frontal lobe) areas—a pattern that is virtually identical whether the “memory” is based on experience or not.

There is, it seems, no mechanism in the mind or the brain for ensuring the truth, or at least the veridical character, of our recollections. We have no direct access to historical truth, and what we feel or assert to be true (as Helen Keller was in a very good position to note) depends as much on our imagination as our senses. There is no way by which the events of the world can be directly transmitted or recorded in our brains; they are experienced and constructed in a highly subjective way, which is different in every individual to begin with, and differently reinterpreted or reexperienced whenever they are recollected. (The neuroscientist Gerald M. Edelman often speaks of perceiving as “creating,” and remembering as “recreating” or “recategorizing.”) Frequently, our only truth is narrative truth, the stories we tell each other, and ourselves—the stories we continually recategorize and refine. Such subjectivity is built into the very nature of memory, and follows from its basis and mechanisms in the human brain. The wonder is that aberrations of a gross sort are relatively rare, and that, for the most part, our memories are relatively solid and reliable.

We, as human beings, are landed with memory systems that have fallibilities, frailties, and imperfections—but also great flexibility and creativity. Confusion over sources or indifference to them can be a paradoxical strength: if we could tag the sources of all our knowledge, we would be overwhelmed with often irrelevant information.

Indifference to source allows us to assimilate what we read, what we are told, what others say and think and write and paint, as intensely and richly as if they were primary experiences. It allows us to see and hear with other eyes and ears, to enter into other minds, to assimilate the art and science and religion of the whole culture, to enter into and contribute to the common mind, the general commonwealth of knowledge. This sort of sharing and participation, this communion, would not be possible if all our knowledge, our memories, were tagged and identified, seen as private, exclusively ours. Memory is dialogic and arises not only from direct experience but from the intercourse of many minds.