Google and Money!

Sergey Brin has often complained about those who fear that Google might start acting as “Big Brother”; he emphasizes that Google allows users to opt out of data tracking, and that the company has done more than any other site to protect its users from government intrusions, especially in countries like China.1 But even small violations of privacy can trouble users, and if too many want to opt out of Google’s data tracking, the company, which charges a huge mark-up for targeted ads, will have every incentive to stop giving users the option not to participate, at least for free.

Google executives, according to the documents disclosed by the Journal, have already half-seriously considered allowing users to pay Google the amount that advertisers would otherwise offer the company to reach them, in exchange for receiving from Google a spotless, ad-free, premium Internet. The next obvious step would be to provide well-off users with the ability to opt out of all data tracking, and to offer a premium Internet with greater privacy, at a price.

It is at this point that the problems of net neutrality and of privacy might appear to come together. After all, wouldn’t such a two-tiered Internet, with privacy available only to those who can afford it, be the opposite of a “neutral Internet”? By defending net neutrality, wouldn’t we also defend privacy? The answer, unfortunately, is no, and the question goes some way to outlining the limits of net neutrality as a way of protecting our values online.

Net neutrality doesn’t have much meaning beyond the world of commerce. It is a way of using regulation to ensure a free and open market for competing sites; and a free-market Internet, like a free-market culture, can easily lead to a system in which people can pay to join “gated communities” that provide extra privacy protections. The advocates of net neutrality should be thanked for fighting so hard for an abstract, important principle, long before the consequences of its dissolution become apparent. Yet their success emphasizes the comparative neglect of privacy, an equally abstract and easily eroded principle.

Google, unlike Facebook, currently allows users to opt out of much of its data tracking by adjusting the options on their user profiles. The easiest way to protect privacy online might be a regulation that required all sites to provide this feature. Such an option, in the form of a “Do Not Track” list—modeled on the highly successful “Do Not Call” list for telemarketers—is currently being considered by the Federal Trade Commission.

Sites like Google, however, track information about their users not just to target advertisements, but to provide better services. An example would be a query about “Yankees”: if Google knows from past searches that a user is a Revolutionary War buff, the search engine can emphasize historical sites rather than baseball sites in the results it provides. This apparently small improvement, when applied across the entire search engine, can amount to the difference between going to a small library where the staff knows one’s interests and going to the main branch where one tends to remain a stranger. We have always traded a bit of our privacy in order to receive better service; if users reject data tracking to protect their privacy, it could be argued, they would have to give up other service improvements as well.
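
To make the mechanism concrete, consider a rough sketch of how such interest-based re-ranking might work. The code below is purely illustrative: the interest profile, the result fields, and the scoring rule are assumptions made for the example, not a description of Google’s actual ranking system.

```python
# Illustrative sketch of interest-based re-ranking; not Google's actual algorithm.
from collections import Counter

def build_profile(past_queries):
    """Build a crude interest profile by counting the words in past queries."""
    profile = Counter()
    for query in past_queries:
        profile.update(query.lower().split())
    return profile

def rerank(results, profile):
    """Nudge results whose titles overlap with the user's interests toward the top."""
    def score(result):
        overlap = sum(profile[word] for word in result["title"].lower().split())
        return result["relevance"] + 0.1 * overlap  # small personalized boost
    return sorted(results, key=score, reverse=True)

# A Revolutionary War buff searches for "Yankees": history outranks baseball.
profile = build_profile(["revolutionary war battles", "continental army", "war of independence"])
results = [
    {"title": "New York Yankees schedule", "relevance": 1.0},
    {"title": "Yankees in the Revolutionary War", "relevance": 0.9},
]
print([r["title"] for r in rerank(results, profile)])
```

The history result wins out only because the user’s past queries are known; withhold that history and the personalization disappears along with the privacy cost.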

Google’s executives habitually speak of privacy in terms of these kinds of trade-offs, but the trade is more complicated than it may at first appear: websites certainly do need private information to provide high-level “functionality”—great search results, excellent e-mail services, useful social networks—but the private information collected for user services does not therefore have to be made available for commercial purposes such as advertising.

Fortunately, Google currently lets users opt out of targeted advertising, while still allowing data tracking for improved service to users. But there’s no guarantee that Google will continue to provide this option, and even if you do opt out of Google’s targeted advertising, the company may still use your private information for other commercial purposes. The most important step that regulators could take would be to permanently decouple the private information that sites need for personalized services—for example, helping you find a restaurant based on your current location—from the private information that sites may use for commercial purposes, such as informing advertisers of that location.

A “Do Not Track” list would likely be too simplistic an approach. What I would propose instead is something like a “Chinese Wall,” akin to the division maintained between the editorial and advertising departments of newspapers, or to the separation once mandated between commercial and investment banking (though hopefully more effective than both). The result would be a firm distinction between the data that companies must gather to provide helpful services and the personal data that companies may exploit for commercial purposes such as advertising. This solution would leave open the question of whether, in their zeal to provide better services, companies may learn a dangerously large amount about individuals, information that would be vulnerable to both theft and subpoena. But with such a clear division in place, Internet users would at least be able to opt out of data tracking for commercial purposes without degrading the level of service that they receive.
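
To suggest, very roughly, what such a wall might look like in software, the sketch below tags each release of user data with a purpose and lets the user’s choice govern only the commercial side. The field names, purposes, and opt-out flag are illustrative assumptions, not a description of any company’s actual data architecture.

```python
# Minimal sketch of a "Chinese Wall" between service data and commercial use.
from dataclasses import dataclass, field
from typing import List, Optional

SERVICE = "service"          # e.g., personalizing results, finding a nearby restaurant
ADVERTISING = "advertising"  # e.g., informing advertisers of a user's location or interests

@dataclass
class UserRecord:
    user_id: str
    location: str
    search_history: List[str] = field(default_factory=list)
    opted_out_of_ads: bool = False

def access(record: UserRecord, purpose: str) -> Optional[UserRecord]:
    """Release user data only when the wall permits it for the stated purpose."""
    if purpose == SERVICE:
        return record  # data gathered to serve the user remains usable for that purpose
    if purpose == ADVERTISING and not record.opted_out_of_ads:
        return record  # commercial use is allowed only while the user consents
    return None        # otherwise the wall blocks the release

alice = UserRecord("alice", "Boston", ["yankees", "revolutionary war"], opted_out_of_ads=True)
assert access(alice, SERVICE) is not None    # her restaurant search still works
assert access(alice, ADVERTISING) is None    # but advertisers never see her data
```

The point of the design is the one made above: opting out flips a single flag on the commercial side of the wall and leaves the quality of service untouched.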

Google officials, in their discussions with the Federal Trade Commission last July, warned that even the limited policies being considered could “lock in a specific privacy model that may quickly become obsolete or insufficient due to the speed with which Internet services evolve.” While Google is certainly right that overly onerous regulations could limit innovation, it is hard to imagine in what circumstances a company could justify not allowing its users to opt out of having their personal information used for commercial purposes, should they feel that the tracking intrudes on their privacy. The speed with which the Internet evolves is precisely what makes simple but durable regulations necessary. And while a regulation such as a “Chinese Wall”—or anything else effective enough to maintain privacy—would make it harder for sites to profit online, it might also go some way toward protecting another value, privacy, which, if left to profit-maximizing corporations, could easily be eroded.

—November 11, 2010

  1.

    Google, which entered the mainland Chinese market with a censored search engine in 2005, shut down its operations in January after discovering that hackers based in China had compromised the Gmail accounts of several Chinese human rights activists. The company subsequently redirected all traffic from its mainland China address, google.cn, to its uncensored Hong Kong address, google.com.hk. Users who typed in “google.cn” were, without taking any action themselves, shown the google.com.hk site instead. This is a simple technique, though rarely used at such a high level. At first, the Chinese government responded by censoring the results from the Hong Kong search engine when it returned unapproved sites. Later the government objected to the way google.cn was redirected to google.com.hk, and threatened to revoke Google’s license to the google.cn address.

    Google, in what was widely reported as a “compromise,” stopped the automatic redirect. Now, when Chinese users type in “google.cn,” they see what appears to be the old Google search page, with a box where text can be entered and the standard Google search buttons. The page, however, is an artful subterfuge: when users click on the box to enter text, they are again, almost before they realize what has happened, sent to Google’s uncensored Hong Kong search engine. The result is exemplary both of the remarkable subtlety that Google has shown so far in its approach to China (the company, for instance, while forced by Chinese law to censor sites, managed to slip through a far higher percentage of “subversive” sites than other search engines like Yahoo) and of the tone-deafness among much of the media to such small but important technical details.
