
Ethical Uses of Collected Data

Margaret Steen
A panel of experts discusses the ethical use of collected data

L-R: Scott Shipman, MeMe Jacobs Rasmussen, Shannon Vallor, and Irina Raicu

Now that so-called big data provides access to information that previously would have gone undiscovered, what are the ethical boundaries around companies’ use of this data?

A panel of experts discussed “Ethical Use of Collected Data” as part of the Business and Organizational Ethics Partnership at Santa Clara University’s Markkula Center for Applied Ethics.

The panelists included Shannon Vallor, associate professor of philosophy at Santa Clara University; Scott Shipman, general counsel and chief privacy officer at Sensity Systems; and MeMe Jacobs Rasmussen, chief privacy officer at Adobe Systems.

The panel was moderated by Irina Raicu, director of Internet ethics at the Ethics Center.

Vallor opened the discussion with an explanation of how big data is used and some of the ethical issues it raises.

“The applications are virtually boundless, given that consumers are generating and we are collecting and storing unprecedented volumes of data in all sectors: private, public, health care, education, commerce and entertainment,” Vallor said. “We are being overwhelmed by the promise of big data to solve persistent challenges in public health, criminal justice” and other areas.

But there are risks as well as benefits to using big data. Privacy is one of the biggest issues, given the volume of data that is being collected. Consumers are justifiably concerned not only about what the company that collects the data will do with it, but also about how it will be protected from third parties.

The ethical issues go well beyond privacy, Vallor said. “It’s easy to lose sight of that because privacy is so significant and challenging.”

For example, does big data offer a fair distribution of both risks and benefits? Will it exacerbate the digital divide, with consumer needs and desires being determined based on what those who own digital devices want?

Accuracy and reliability are also issues, especially when institutions decide to make important decisions based on the data. “The principle of garbage in, garbage out applies to big databases as well as small databases,” Vallor said.

Finally, Vallor said, big data could be used to discriminate, with “analytics as a quick and dirty form of redlining.” Certain types of people could be classified as poor risks for employment, health care interventions, or educational services, for example.

Rasmussen, who works closely with Adobe’s digital marketing business, pointed to both risks and benefits of big data when describing her job. “We do collect a lot of data, but we retain very limited rights to use that data,” she said. “My team works with the business units to understand what they are doing with the data and to advise them on how to develop the products in a way that is privacy-focused.”

However, Adobe requests permission to aggregate data from its customers, providing information that would not be otherwise available, such as how many PCs, tablets and mobile devices are accessing websites.

Adobe tries to follow the dictum, “Say what you do, do what you say, and don’t surprise the user,” Rasmussen said. “‘Don’t surprise the user’ sounded really good to me when I first started in this job. But I’ve learned over the years that transparency is hard when you’ve got complicated products. Figuring out whether what you’re doing will surprise the user is often difficult.”

“We might draft a privacy policy that says we collect XX and do YY,” Rasmussen said. “Right after you publish it, the product team may come to you and say, ‘Can we do ZZ as well?’ You don’t want to stop innovation, but you can’t just keep revising your privacy policy.”

Shipman, too, has a job that illustrates both the promise and the risks of using big data. Sensity is an Internet of Things company that is capitalizing on the transition to LED lighting to create Light Sensory Networks. These networks allow cities and other entities to deliver energy-efficient lighting along with a real-time, global database of information that customers can use to manage and understand their physical environment for greater productivity, efficiency and security. Shipman was hired as chief privacy officer to build and govern a global privacy program, in which he will establish data protection standards and lead industry-wide privacy initiatives.

One of the issues, Shipman said, is defining the audience. For companies like his, the customers — those who purchase the product — are not the only ones affected by it. “To improve operations and efficiency of a city, for example, it’s important to understand how city assets are used – roads, parking spots, utilities, etc. This means collecting data on behalf of our customer – the city – from the end user who is not our customer. Often it’s the non-customer that is misinformed,” he said. “You have to keep those perceptions in mind as well.”

One challenge is that it’s difficult for everyone involved to envision either the benefits or the harms inherent in a use of big data, Shipman said. “Consumers expect companies to innovate. Seeking permission for every benefit would restrict innovation and limit the benefits.”

One question from the audience centered on whether consumers are actually as concerned about privacy as they say they are, given that they often don’t bother to use encryption. Raicu said studies have shown people give up less data once they know what’s being collected, which suggests they do value privacy. And Vallor said her students are increasingly using apps like Snapchat, which they view as protecting their privacy.

“It’s not true that people either protect themselves from all harm or don’t care about harm,” Vallor said. “The fact that there are people out there who are knowingly allowing their data to be collected doesn’t mean that those people expect to be harmed and aren’t going to be outraged when they are.”

On the other hand, Rasmussen said, some people are less concerned the more they learn about how data is collected and used. They may prefer to see ads that are relevant to them, for example, even though it means their online behavior is being tracked.

Vallor noted that predicting people’s responses will never be fully accurate. As a strategy, not surprising the user “has got its limitations, and then there has to be another strategy: How do we manage the situation when the user is surprised – or when we’re surprised?”

And Raicu noted that there could be a “chilling effect if people are discouraged from using technology because of privacy concerns.”

Margaret Steen is a freelance author.

Nov 30, 2015