Ethical Issues in the Online World

Welcome to the blog of the Internet Ethics program at the Markkula Center for Applied Ethics, Santa Clara University. Program Director Irina Raicu will be joined by various guests in discussing the ethical issues that arise continuously on the Internet; we hope to host a robust conversation about them, and we look forward to your comments.

The following postings have been filtered by the tag "right to be forgotten."
  •  Revisiting the "Right to Be Forgotten"

    Tuesday, Sep. 16, 2014

     

    Media coverage of the implementation of the European Court decision on de-indexing certain search results has been less pervasive than the initial reporting on the decision itself, back in May.  At the time, much of the coverage had framed the issue in terms of clashing pairs: E.U. versus U.S.; privacy versus free speech.  In The Guardian, an excellent overview of the decision described the “right to be forgotten” as a “cultural shibboleth.”

    (I wrote about it back then, too, arguing that many of the stories about it were rife with mischaracterizations and false dilemmas.)

    Since then, most of the conversation about online “forgetting” seems to have continued on parallel tracks—although with somewhat different clashing camps.  On one hand, many journalists and other critics of the decision (on both sides of the Atlantic) have made sweeping claims about a resulting “Internet riddled with memory holes” and articles “scrubbed from search results”; one commentator wrote that the court decision raises the question, “can you really have freedom of speech if no one can hear what you are saying?”

    On the other hand, privacy advocates (again on both sides of the Atlantic) have been arguing that the decision is much narrower in scope than has generally been portrayed and that it does not destroy free speech; that Google is not, in fact, the sole and ultimate arbiter of the determinations involved in the implementation of the decision; and that even prior to the court’s decision Google search results were selective, curated, and influenced by various countries’ laws.  Recently, FTC Commissioner Julie Brill urged “thought leaders on both sides of the Atlantic to recognize that, just as we both deeply value freedom of expression, we also have shared values concerning relevance in personal information in the digital age.”

    Amid this debate, in late June, Google developed and started to use its own process for complying with the decision.  But Google has also convened an advisory council that will take several months to consider evidence (including public input from meetings held in seven European capitals--Madrid, Rome, Paris, Warsaw, Berlin, London, and Brussels), before producing a report that would inform the company’s current efforts.  Explaining the creation of the council, the company noted that it is now required to balance “on a case-by-case basis, an individual’s right to be forgotten with the public’s right to information,” and added, “We want to strike this balance right. This obligation is a new and difficult challenge for us, and we’re seeking advice on the principles Google ought to apply…. That’s why we’re convening a council of experts.”

    The advisory council (to whom any and all can submit comments) has been posting videos of the public meetings online. However, critics have taken issue with the group’s members (selected by Google itself), with the other presenters invited to participate at the meetings (again screened and chosen by Google), and, most recently, with its alleged rebuffing of questions from the general public. So far, many of the speakers invited to the meetings have raised questions about the appropriateness of the decision itself.

    In this context, one bit of evidence makes its own public comment:  Since May, according to Google, the company has received more than 120,000 de-indexing requests. Tens of thousands of Europeans have gone through the trouble of submitting a form and the related information necessary to request that a search of their name not include certain results.  

    And, perhaps surprisingly (especially given most of the coverage of the decision in the U.S.), a recent poll of American Internet users, by an IT security research firm, found that a “solid majority” of them—61%--were “in favor of a ‘right to be forgotten’ law for US citizens.”

    But this, too, may speak differently to different audiences. Some will see it as evidence of a vast pent-up need that had had no outlet until now. Others will see it as evidence of the tens of thousands of restrictions and “holes” that will soon open up in the Web.

    So—should we worry about the impending “memory holes”?

    In a talk entitled “The Internet with a Human Face,” American Web developer Maciej Ceglowski argues that “the Internet somehow contrives to remember too much and too little at the same time.” He adds,

    in our elementary schools in America, if we did something particularly heinous, they had a special way of threatening you. They would say: “This is going on your permanent record.”

    It was pretty scary. I had never seen a permanent record, but I knew exactly what it must look like. It was bright red, thick, tied with twine. Full of official stamps.

    The permanent record would follow you through life, and whenever you changed schools, or looked for a job or moved to a new house, people would see the shameful things you had done in fifth grade. 

    How wonderful it felt when I first realized the permanent record didn’t exist. They were bluffing! Nothing I did was going to matter! We were free!

    And then when I grew up, I helped build it for real.

    But while a version of the “permanent record” is now real, it is also true that much content on the Internet is already ephemeral. The phenomenon of “link rot,” for example, affects even important legal documents.  And U.K. law professor Paul Bernal has argued that we should understand the Internet as “organic, growing and changing all the time,” and that it’s a good thing that this is so. “Having ways to delete information [online] isn’t the enemy of the Internet of the people,” Bernal writes, so much as an enemy of the big players of the Internet.

    Will Google, one of those “big players of the Internet,” hear such views, too? It remains to be seen; Google’s “European grand tour,” as another U.K. law professor has dubbed it, will conclude on November 4th.

     

    Photograph by derekb, unmodified, under a Creative Commons license. https://creativecommons.org/licenses/by-nc/2.0/legalcode

  •  The Right to Be Forgotten

    Monday, Mar. 25, 2013

    "Total interconnectedness," very cheap data storage, and powerful search technologies come together to create a new set of ethical questions.  Do we have a right to access and correct the data in our profiles?  Do we have a right to be "forgotten" by the Internet?  In this brief video, Reputation.com co-founder Owen Tripp asks us to consider the impact of the Internet's long memory on those among us who are most vulnerable.  Below, Evan Selinger--Associate Professor in the Department of Philosophy at the Rochester Institute of Technology--responds to Tripp's comments:

    "Owen Tripp is moved by the ideas driving the "right to be forgotten" movements. For the reasons he gives, we all should be, too.  In the age of big data, the permanent record threat we're confronted with as kids takes on a new and more ominous meaning.  Our digital dossiers expand all the time, in both obvious and unclear ways, and through processes that are transparent as well as surreptitious.  Now that unprecedented amounts of information are readily available about what we've done and what makes us tick, lamentable incidents and statements can follow us everywhere with the crushing weight of Jacob Marley's chains. With the past always present, time--as Shakespeare's Hamlet exclaimed--is out of joint.

    When citizens become open books, it becomes awfully tempting to manage heightened publicity with overly cautious and risk-averse behavior.  With enough fear, we'll lose out on more than opportunity. Our character can be diminished, perhaps timorousness shifting from vice to virtue.  As David Hoffman, Director of Security Policy and Global Privacy Officer at Intel Corporation, contends, society thus needs solutions that safeguard a limited "right to fail" without encouraging reckless or anti-social behavior, or the problems that come from historical amnesia or revisionism.  At stake is nothing less than securing adequate space for social experimentation, the "breathing room" (to borrow a phrase from privacy scholar Julie Cohen) that enables people to learn and grow.

    While the right to be forgotten appears to be gaining traction in Europe, there are numerous challenges ahead, not least because the road from privacy interest to privacy right can be long and winding.  In the United States concern has been expressed over how legal enforcement of a robust right for individuals to control personal information could run afoul of First Amendment speech protections and squash innovation by subjecting companies like Google and Facebook to bureaucratic procedures that, practically speaking, are unworkable, and further burdened by the prospect of overly punitive sanctions.  Furthermore, as numerous scholars suggest, the notion of so-called "personal information" is hard to pin down in an age of networked citizens where lots of data involves or affects other people, implicating what law professor Sonja West aptly calls the "story of us." Finally, while the market can indeed provide helpful services, we shouldn't lose sight of the fact that when privacy protection is commodified, greater burden is placed on lower income people.  Freedom and peace of mind become purchasing power privilege."

    Evan Selinger -- Twitter: @evanselinger