
Ethical Issues in the Online World

Welcome to the blog of the Internet Ethics program at the Markkula Center for Applied Ethics, Santa Clara University. Program Director Irina Raicu will be joined by various guests in discussing the ethical issues that arise continuously on the Internet; we hope to host a robust conversation about them, and we look forward to your comments.

  •  Metaphors of Big Data

    Friday, Nov. 6, 2015

    Hot off the (digital) presses: This morning, Re/code ran a short piece by me, titled “Metaphors of Big Data.”

    In the essay, I argue that the metaphors currently used to describe “big data” fail to acknowledge the wide variety of vast datasets that are now being collected and processed, and that we need new metaphors.

    Strangers have long had access to some details about most of us—our names, phone numbers and even addresses have been fairly easy to find, even before the advent of the Internet. And marketers have long created, bought and sold lists that grouped customers based on various differentiating criteria. But marketers didn’t use to have access to, say, our search topics, back when we were searching in libraries, not Googling. The post office didn’t ask us to agree that it was allowed to open our letters and scan them for keywords that would then be sold to marketers that wanted to reach us with more accurately personalized offers. We would have balked. We should balk now.

    The link will take you to the piece on the Re/code site, but I hope you’ll come back and respond to it in the blog comments!


    Photo by Marc_Smith, used without modification under a Creative Commons license.


  •  A Personal Privacy Policy

    Wednesday, Sep. 2, 2015

    This essay first appeared in Slate's Future Tense blog in July 2015.

    Dear Corporation,

    You have expressed an interest in collecting personal information about me. (This interest may have been expressed by implication, if, for instance, you were attempting to collect such data without notifying me first.) Since you have told me repeatedly that personalization is a great benefit, and that advertising, search results, news, and other services should be tailored to my individual needs and desires, I’ve decided that I should also have my own personalized, targeted privacy policy. Here it is.

    While I am glad that (as you stated) my privacy is very important to you, it’s even more important to me. The intent of this policy is to inform you how you may collect, use, and dispose of personal information about me.

    By collecting any such information about me, you are agreeing to the terms below. These terms may change from time to time, especially as I find out more about ways in which personal information about me is actually used and I think more about the implications of those uses.

    Note: You will be asked to provide some information about yourself. Providing false information will constitute a violation of this agreement.

    Scope: This policy covers only me. It does not apply to related entities that I do not own or control, such as my friends, my children, or my husband.

    Age restriction and parental participation: Please specify if you are a startup; if so, note how long you’ve been in business. Please include the ages of the founders/innovators who came up with your product and your business model. Please also include the ages of any investors who have asserted, through their investment in your company, that they thought this product or service was a good idea.

    Information about you: For each piece of personal information about me that you wish to collect, analyze, and store, you must first disclose the following:

    a) Do you need this particular piece of information in order for your product/service to work for me? If not, you are not authorized to collect it. If yes, please explain how this piece of information is necessary for your product to work for me.
    b) What types of analytics do you intend to perform with this information?
    c) Will you share this piece of information with anyone outside your company? If so, list each entity with which you intend to share it, and for what purpose; you must update this disclosure every time you add a new third party with which you’d like to share.
    d) Will you make efforts to anonymize the personal information that you’re collecting?
    e) Are you aware of the research showing that anonymization doesn’t really work, because it’s easy to put together information from several categories and/or several databases and so figure out the identity of an “anonymous” source of data? (A minimal sketch of such a “linkage attack” appears after this letter.)
    f) How long will you retain this particular piece of information about me?
    g) If I ask you to delete it, will you, and if so, how quickly? Note: by “delete” I don’t mean “make it invisible to others”—I mean “get it out of your system entirely.”

    Please be advised that, like these terms, the information I’ve provided to you may change, too: I may switch electronic devices; change my legal name; have more children; move to a different town; experiment with various political or religious affiliations; buy products that I may or may not like, just to try something new or to give to someone else; etc. These terms (as amended as needed) will apply to any new data that you may collect about me in the future: your continued use of personal information about me constitutes your acceptance of this.

    And, of course, I reserve all rights not expressly granted to you.
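    P.S. For readers curious about the re-identification research mentioned in point (e): below is a minimal sketch, in Python, of the kind of “linkage attack” that research describes, in which two separately “anonymized” datasets are joined on shared quasi-identifiers such as ZIP code, birth date, and sex. All of the names, records, and field names here are made up for illustration.

        # Hypothetical illustration of a linkage attack: neither dataset pairs
        # names with sensitive data on its own, but joining the two on shared
        # quasi-identifiers re-identifies the "anonymous" records.

        # "Anonymized" records released by a service: names removed,
        # quasi-identifiers retained.
        anonymized_records = [
            {"zip": "95053", "birth_date": "1970-07-15", "sex": "F",
             "search_topic": "divorce lawyers"},
            {"zip": "95054", "birth_date": "1982-01-02", "sex": "M",
             "search_topic": "debt relief"},
        ]

        # A second, public dataset (e.g., a voter roll) that does include names.
        public_roll = [
            {"name": "Jane Doe", "zip": "95053", "birth_date": "1970-07-15", "sex": "F"},
            {"name": "John Roe", "zip": "95054", "birth_date": "1982-01-02", "sex": "M"},
        ]

        QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

        def quasi_id(record):
            """Project a record onto the shared quasi-identifier columns."""
            return tuple(record[field] for field in QUASI_IDENTIFIERS)

        # Index the public dataset by quasi-identifiers, then join.
        names_by_quasi_id = {quasi_id(r): r["name"] for r in public_roll}
        for record in anonymized_records:
            name = names_by_quasi_id.get(quasi_id(record))
            if name is not None:
                print(f"Re-identified {name}: searched for {record['search_topic']!r}")

    In real datasets the joins are fuzzier and the databases far larger, but the principle is the same.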

    Photo by Perspecsys Photos, used without modification under a Creative Commons license.

  •  “Practically as an accident”: on “social facts” and the common good

    Thursday, Oct. 30, 2014


    In the Los Angeles Review of Books, philosopher Evan Selinger takes issue with many of the conclusions (and built-in assumptions) compiled in Dataclysm—a new book by Christian Rudder, who co-founded the dating site OKCupid and now heads the site’s data analytics team. While Selinger’s whole essay is really interesting, I was particularly struck by his comments on big data and privacy. 

    “My biggest issue with Dataclysm,” Selinger writes,
    lies with Rudder’s treatment of surveillance. Early on in the book he writes: ‘If Big Data’s two running stories have been surveillance and money, for the last three years I’ve been working on a third: the human story.’ This claim about pursuing a third path isn’t true. Dataclysm itself is a work of social surveillance.
    It’s tempting to think that different types of surveillance can be distinguished from one another in neat and clear ways. If this were the case, we could say that government surveillance only occurs when organizations like the National Security Agency do their job; corporate surveillance is only conducted by companies like Facebook who want to know what we’re doing so that they effectively monetize our data and devise strategies to make us more deeply engaged with their platform; and social surveillance only takes place in peer-to-peer situations, like parents monitoring their children’s phones, romantic partners scrutinizing each other’s social media feeds….
    But in reality, surveillance is defined by fluid categories.
    While each category of surveillance might include both ethical and unethical practices, the point is that the boundaries separating the categories are porous, and the harms associated with surveillance might seep across all of them.
    Increasingly, when corporations like OKCupid or Facebook analyze their users’ data and communications in order to uncover “social facts,” they claim to be acting in the interest of the common good, rather than pursuing self-serving goals. They claim to give us clear windows into our society. The subtitle of Rudder’s book, for example, is “Who We Are (When We Think No One’s Looking).” As Selinger notes,
    Rudder portrays the volume of information… as a gift that can reveal the truth of who we really are. … [W]hen people don’t realize they’re lab rats in Rudder’s social experiments, they reveal habits—‘universals,’ he even alleges… ‘Practically as an accident,’ Rudder claims, ‘digital data can now show us how we fight, how we love, how we age, who we are, and how we’re changing.’
    Of course, Rudder should confine his claims to the “we” who use OKCupid (a 2013 study by the Pew Research Center found that 10% of Americans report having used an online dating service). Facebook has a stronger claim to having a user base that reflects all of “us.” But there are other entities that sit on even vaster data troves than Facebook’s, even more representative of U.S. society overall. What if a governmental organization were to decide to pursue the same selfless goals, after ensuring that the data involved would be carefully anonymized and presented only in the aggregate (akin to what Rudder claims to have done)?
    In the interest of better “social facts,” of greater insight into our collective mindsets and behaviors, should we encourage (or indeed demand) that the NSA publish “Who Americans Are (When They Think No One’s Watching)”? To be followed, perhaps, by a series of “Who [Insert Various Other Nationalities] Are (When They Think No One’s Watching)”? Think of all the social insights and common good that would come from that!
    In all seriousness, as Selinger rightly points out, the surveillance behind such no-notice-no-consent research comes at great cost to society:
    Rudder’s violation of the initial contextual integrity [underpinning the collection of OKCupid user data] puts personal data to questionable secondary, social use. The use is questionable because privacy isn’t only about protecting personal information. People also have privacy interests in being able to communicate with others without feeling anxious about being excessively monitored. … [T]he resulting apprehension inhibits speech, stunts personal growth, and possibly even disinclines people from experimenting with politically relevant ideas.
    With every book subtitled “Who We Are (When We Think No One’s Looking),” we, the real we, become more wary, more likely to assume that someone’s always looking. And as many members of societies that have lived with excessive surveillance have attested, that’s not a path to achieving the good life.
    Photo by Henning Muhlinghaus, used without modification under a Creative Commons license.


  •  Mobile Technology and Social Media: Ethical Implications

    Sunday, May 12, 2013

    The adoption of mobile devices and the use of social media are both growing quickly around the world. In emerging markets in particular, mobile devices have become “life tools”—used for telemedicine, banking, education, communication, and more. These developments give rise to new ethical challenges. How should mobile devices be used for data collection among vulnerable populations? Can apps that bring great benefits also cause unintended harm? And who should address these concerns? In this brief video, tech entrepreneur and professor Radha Basu argues that the debate should include not only the manufacturers of mobile devices and the app developers, but also the young people who will be most affected by these new developments.