The Right to Be Forgotten or the Right to Edit?
A ruling by the European Court of Justice sparks an international discussion of whether people have the right to eliminate items about themselves from search engine results.
In May 2014, the European Court of Justice ruled that Google would have to comply with some European citizens' requests that it remove from its search engine links to certain information about them. The immediate reaction to the decision, at least in the U.S., was all about polarity: E.U. vs. U.S.; privacy vs. free speech; full and accurate information vs. censored results. Either-or.
In this narrative, there has been very little discussion of the "eraser button" law recently passed by the state of California, home of Google and one of the top 10 economies of the world. The California law requires social media platforms to delete content posted by teenagers when the teenaged authors request it. This "eraser button" has been criticized by some as ineffective and unnecessary, and praised by others as a more "modest" and appropriate version of the unrealistically named "Right to Be Forgotten." Either way, the California law shows that the American public has also been struggling to find the right balance on this topic, the right balance between various values, and is not simply content with the status quo that was recently upended by the European court.
Much of the coverage of the court's decision has also presented it as a novel restriction on freedom of expression: the editing of history, the censoring of reality. And it has presented this issue as if search engines didn't already make decisions about what to include, and in what order, in their responses to queries from users. As if Google's responses to searches were, until now, simply an objective, orderly, historically accurate presentation of everything that exists on the Web. One article, for example, pointed out that "in the short term, users in different regions of the world may see very different search results in some cases." Of course, the reality is that even users in a single region of the world (or a single room) are already likely to be shown very different search results, unless they've actively taken steps to turn off the "personalized search" that Google implements as its default setting. Moreover, prior to the recent court decision, Google had already been adjusting its search results to comply with some countries' laws (France and Germany, for example).
If we worry about the editing of history and lack of uniformity in search results, we should have been worried about the power of the search engines long before the European court's decision.
This worry was reflected in a different warning about the impact of the decision, expressed by Joris van Hoboken, a Fellow at NYU's Information Law Institute. He argues that the real problem is that one search engine, Google,
is dominant on the European web... [and now] that all the pressure is put on Google to filter stuff, they gain even more power. If Google is always the one that has to decide what is or isn't public information, and accessible through their search engine, then Google becomes even more the editor. I think there are already enough legitimate concerns about Google's cultural influence, but this ruling doesn't take away any of those. It actually increases them.
That statement assumes, however, that the filtering decision will be left entirely up to Google. In fact, representatives of the various European data protection agencies have said that they intend to come up with a common set of standards that would apply to requests for link removals. And users whose requests are denied by Google may appeal those decisions to local regulators.
Even if Google (and other search engines) are now indeed placed in a position to make more decisions about what links should be deleted from search results, the fact remains that these decisions would be prompted, for the first time, by requests from individuals whose information is at stake.
Immediately after the European Court of Justice's ruling was announced, Trevor Hughes, president and CEO of the International Association of Privacy Professionals, was cited as saying, "Individuals now have the ability to essentially go in with a virtual black marker and redact their names." The redacting "black marker" is a powerful image, but search engines and governments have always held, and wielded, that marker. The court's decision forces us to face a different question: should individuals, too, have the right to at least attempt to edit search results about themselves?
The European Court of Justice held that an individual's rights, at least in some circumstances, override "the economic interest of the operator of the search engine," and that, in some cases, an individual's rights also override "the interest of the general public" in access to information about that individual. What happens when the rights of the individual, when it comes to information about oneself, clash with the rights of the public? This is an ethical dilemma, and again a subject of ongoing debates both among different countries and within the U.S.
So the story is more complicated than "E.U. vs. U.S." or "privacy vs. free speech." It is, partly, about who has the power to edit on the Internet. And you can disagree with the particular balance struck by the European court and still hope for, and search for, a better balance than the one presented by the prior status quo. California is trying to find that balance; other states might follow with their own proposals. In the meantime, Google is starting to comply with some of the requests that certain links be removed from European search results.
Irina Raicu is the director of Internet Ethics at the Markkula Center for Applied Ethics at Santa Clara University.
Jun 1, 2014