
Mind Control & the Internet

[Illustration: Edward Gorey Charitable Trust]

Why this matters is captured in a study in the spring issue of Sociological Quarterly, which echoes Pariser’s concern that when ideology drives the dissemination of information, knowledge is compromised. The study, which examined attitudes toward global warming among Republicans and Democrats in the years between 2001 and 2010, found that in those nine years, as the scientific consensus on climate change coalesced and became nearly universal, the percentage of Republicans who said that the planet was beginning to warm dropped precipitously, from 49 percent to 29 percent. For Democrats, the percentage went up, from 60 percent to 70 percent. It was as if the groups were getting different messages about the science, and most likely they were. The consequence, as the study’s authors point out, was to stymie any real debate on public policy. This is Pariser’s point exactly, and his concern: that by having our own ideas bounce back at us, we inadvertently indoctrinate ourselves with our own ideas. “Democracy requires citizens to see things from one another’s point of view, but instead we’re more and more enclosed in our own bubbles,” he writes. “Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes.”

It’s not difficult to see where this could lead—how easily anything with an agenda (a lobbying group, a political party, a corporation, a government) could flood the echo chamber with information central to its cause. (This, in fact, is what has happened, on the right, with climate change.) Who would know? Certainly not Michael Chorost, whose blind allegiance to Google—which he believes is the central part of the “nascent forebrain, hippocampus, and long-term declarative memory store” of the coming World Wide Mind—is matched by his stunning political naiveté. A government “that used the World Wide Mind for overt control would have to be more ominously totalitarian than any government in existence today (except perhaps North Korea),” he writes. “The push-pull dynamic of evolution tends to weed out totalitarian societies because they are, in the long run, inefficient and wasteful.” Contrast this to the words of the man who invented the World Wide Web, Sir Timothy Berners-Lee, writing not long ago in Scientific American:

The Web as we know it is being threatened…. Some of its most successful inhabitants have begun to chip away at its principles…. Governments—totalitarian and democratic alike—are monitoring people’s online habits, endangering important human rights.

One of the most significant changes in the Internet since the release in 1993 of the first graphical browser, Mosaic, which was built on the basis of Berners-Lee’s work, has been the quest to monetize it. In its inaugural days, the Web was a strange, eclectic collection of personal homepages, a kind of digital wall art that bypassed traditional gatekeepers, did not rely on mainstream media companies or corporate cash, and was not driven by commercial interests. The computer scientist and musician Jaron Lanier was there at the creation, and in his fierce, coruscating manifesto, You Are Not a Gadget,3 he remembers it like this:

The rise of the web was a rare instance when we learned new, positive information about human potential. Who would have guessed (at least at first) that millions of people would put so much effort into a project without the presence of advertising, commercial motive, threat of punishment, charismatic figures, identity politics, exploitation of the fear of death, or any of the other classic motivators of mankind. In vast numbers, people did something cooperatively, solely because it was a good idea, and it was beautiful.

But then commerce moved in, almost by accident, when Larry Page and Sergey Brin, the duo who started Google, reluctantly paired small ads with their masterful search engine as a way to fund it. It was not their intent, at first, to create the largest global advertising platform in the history of the world, or to move marketing strategy away from pushing products toward consumers and toward pulling individual consumers toward specific products and brands. But that is what happened. Write the word “blender” in an e-mail, and the next set of ads you’re likely to see will be for Waring and Oster.4 Search for information on bipolar disorder, and drug ads will pop up when you’re reading baseball scores. Use Google Translate to read an abstract of a journal article, and an ad for Spanish translation software will appear when you are using an online English dictionary. (All this activity leads to a question that will not be rhetorical if Chorost’s World Wide Mind comes to fruition: Will our thoughts have corporate sponsors, too?)

Targeted ads (even when they are generated by what appeared to be a private communication) may seem harmless enough—after all, if there is going to be advertising, isn’t it better if it is for products and services that might be useful? But to pull you into a transaction, companies believe they need to know not only your current interests, but what you have liked before, how old you are, your gender, where you live, how much education you have, and on and on. There are something like five hundred companies that are able to track every move you make on the Internet, mining the raw material of the Web and selling it to marketers. (“Stop calling yourself a user,” Lanier warns. “You are being used.”) That you are overweight, have diabetes, have missed a car payment or two, read historical novels, support Republicans, use a cordless power drill, shop at Costco, and spend a lot of time on airplanes is not only known to people other than yourself, it is of great monetary value to them as well. So, too, where you are and where you’ve been, as we recently learned when it was revealed that both Apple and Google have been tracking mobile phone and tablet users and storing that information.

Even reading devices like Amazon’s Kindle pay attention to what users are doing: highlight a passage in a Kindle book and the passage is sent back to Amazon. Clearly, the potential for privacy and other civil liberties abuses here is vast. While the FBI, for instance, needs a warrant to search your computer, Pariser writes that “if you use Yahoo or Gmail or Hotmail for your e-mail, you ‘lose your constitutional protections immediately,’ according to a lawyer for the Electronic Frontier Foundation.” At least one arrest has been made by law enforcement officers using Apple location data. And this past April, the Supreme Court heard arguments in Sorrell v. IMS Health, in which IMS Health, in challenging Vermont’s statutory restriction on the sale of patients’ prescription information to data-mining companies, argued that harvesting and selling medical records data is a First Amendment right. Indeed, data tracking and mining give new meaning to the words “computer monitor.”

In the commercial sphere, marketers are also looking beyond facts and bits of information, in order to determine not just what you have bought, but what kinds of pitches appealed to you when you did. Once they have compiled your “persuasion profile,” they will refine those targeted ads even further. And if marketing companies can do this, why not political candidates, the government, or companies that want to sway public opinion? “There are undoubtedly times and places and styles of argument that make us more susceptible to believe what we’re told,” Pariser observes.

One thing that we—the denizens of the Internet—have come to accept without much thought is that commerce is a really cool aspect of the Web’s shift into social networking. The very popular Foursquare, Loopt, and Groupon sites, for example, make shopping and branding the basis of the social encounter. People on Foursquare vie to become the “mayor” of bakeries and clothing stores by visiting them more than anyone else. They proudly display “badges” that they’ve “earned” by patronizing certain businesses, as if they were trophies celebrating excellence. Facebook users who click on the “like” button for a product may trigger the appearance of an ad for that product on the pages of their “friends.” Companies like Twitalyzer and Klout analyze data from Twitter, Facebook, and LinkedIn to determine who has the most influence online—these can be celebrities or ordinary people with significant followings—and sell that information to businesses that then entice the influencers to pitch their products or “evangelize their brand.” This, according to The Wall Street Journal, has “ignited a race among social-media junkies who, eager for perks and bragging rights, are working hard to game the system and boost their scores.”5 As Lanier points out, “The only hope for social networking sites from a business point of view is for a magic formula to appear in which some method of violating privacy and dignity becomes acceptable.” That magic, it seems, is already in play.

The paradox of personalization and the self-expression promoted by the Internet through Twitter, Facebook, and even Chatroulette is that it simultaneously diminishes the value of personhood and individuality. Read the comments that accompany many blog posts and articles, and it is overwhelmingly evident that violating dignity—someone else’s and, therefore, one’s own—is a cheap and widely circulated currency. This is not only true for subjects that might ordinarily incite partisanship and passion, like sports or politics, but for pretty much anything.6

The point of ad hominem attacks is to take a swipe at someone’s character, to undermine their integrity. Chorost suggests that the reason the Internet as we now know it does not foster the kind of empathy he sees coming in the Web of the future, when we will “feel people’s inner lives electronically,” is that it is not yet an integral part of our bodies, but Lanier’s explanation is more convincing. The “hive mind” created through our electronic connections necessarily obviates the individual—indeed, that’s what makes it a collective consciousness. Anonymity, which flourishes where there is no individual accountability, is one of its key features, and behind it, meanness, antipathy, and cruelty have a tendency to rush right in. As the sociologist Sherry Turkle observes:

Networked, we are together, but so lessened are our expectations of each other that we can feel utterly alone. And there is the risk that we come to see others as objects to be accessed—and only for the parts that we find useful, comforting, or amusing.7

Here is Chorost describing the wonders of a neural-networked friendship:

Having brainlike computers would greatly simplify the process of extracting information from one brain and sending it to another. Suppose you have such a computer, and you’re connected with another person via the World Wide Mind…. You see a cat on the sidewalk in front of you. Your rig…sees activity in a large percentage of the neurons constituting your brain’s invariant representation of a cat. To let your friend know you’re seeing a cat, it sends three letters of information—CAT—to the other person’s implanted rig. That person’s rig activates her brain’s invariant representation of a cat, and she sees it. Or rather, to be more accurate, she sees a memory of a cat that is taken from her own neural circuitry….
  3. See also Zadie Smith’s discussion of Lanier’s book in these pages, “Generation Why?,” November 25, 2010.

  4. This is not a rhetorical example, as the following exchange on the website Garden Web from last February illustrates. A little over an hour after a woman writes about her thirty-five-year-old Oster blender, she posts on the site again. This time, instead of the subject line being “Re: Blenders,” it is “Advertisements.” “And then, like magic,” the woman, who calls herself annie1992, writes, “look what appears at the top of my screen….” It is, not so magically, an ad for a blender.

  5. See Jessica E. Vascellaro, “Wannabe Cool Kids Aim to Game the Web’s New Social Scorekeepers,” The Wall Street Journal, February 8, 2011.

  6. Note the comments on this New York Times piece, in which the author, a doctor, mistakenly gave her dog ibuprofen and wrote about her mistake to warn others: Randi Hutter Epstein, “How the Doctor Almost Killed Her Dog,” New York Times Well blog, January 20, 2011.

  7. See Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (Basic Books, 2011), p. 154.
