Santa Clara University


Business Ethics in the News


FACEBOOK: The Psychology Experiment You Consented to in FB's Terms of Service

Thursday, Jul. 17, 2014

Source: This is Public Health (Flickr)

For one week in 2012, half a million Facebook users took part in a massive psychological experiment aimed at discovering whether emotions could be spread through social media. The problem? Users had no idea it was happening. It turns out Facebook routinely runs experiments on users; in fact, every Facebook user has been a subject at some point, whether through slight modifications in formatting or major feature changes.

Just about every Internet service runs experiments, but this one altered users' news feeds to highlight items with either positive or negative emotional content, and then measured whether doing so affected the emotional content of each user's subsequent posts.

While it is agreed the experiment was legal, critics argue this type of testing crosses a line, particularly when consent is buried in the terms of service. Facebook researchers have taken to social media to apologize for the study, but the company's official position is that Facebook users agree to these types of experiments as part of the terms of service. Does Facebook need more explicit consent for this type of experiment? For all experiments?

  Kirk: The beauty in this unfortunate case is that it rests at the intersection of research ethics and business ethics. While every study involves influencing the subject's emotional state -- e.g., which color do people respond better to? -- this experiment went one step further by making emotional manipulation its sole purpose. The problem here is the blanket consent that Facebook is hiding behind. The practice may be legally permissible, but companies should act in the spirit of the law and ensure users know what they are getting into, especially with experiments this controversial. What right does Facebook have to know what I am feeling as I use their service?

  Patrick: Let's not forget that Facebook is a for-profit company offering a free service. We should all anticipate that Facebook will go to great lengths to monetize its product. A user's emotional state while using Facebook has direct implications for the amount of time they spend on the site and how interactive they are, both of which are critical to getting companies to pay for advertising on Facebook. Yet there is still a concern that this experiment was beyond the pale: if emotions can spread through Facebook, can ideologies and political views as well? It's clear that the law is not just behind on regulating these emerging industries; it's also behind on regulating the experiments that shape their future.

Facebook Tinkers With Users' Emotions in News Feed Experiment, Stirring Outcry (NY Times)

Facebook Researcher's Apology

A Framework for Thinking Ethically (Markkula Center)



Comments

Rushi said on Jul 18, 2014
On the face of it, Facebook exploited the intrinsic faith its users place in "honesty," something that seems to come easily with the non-personal nature of social media. But there is something to be said about the people who presented this case for discussion. The picture accompanying the case is itself misleading, designed to create negativity around a "psychological experiment." Thus, those who wrote and presented this case are themselves guilty of "manipulating" a response by pairing it with an unconnected picture to link it with what FB did. As for manipulating emotions, be it through social media or any other form of mass communication, is that not what such communication is basically intended to do? This particular experiment was perhaps simply validating some theories at FB and improving their strategies, a sort of market survey. Everyone in any sort of public communication does it. Had Facebook clearly specified to its users what was being done, how different would it be from what media, educators, governments, and NGOs do? The reason FB needs to be more transparent and explicit is the added responsibility that comes with the sheer reach of its user base in terms of numbers, diversity, and geographical spread, with most users more willing to share than they would be in normal eye-to-eye communication.
Patrick said on Jul 21, 2014
Rushi, that's an excellent point. I agree that the framing of the "study" was a misstep for Facebook, if only because the lack of a consistent and upfront narrative created a void that was filled by blog headlines. Nonetheless, I'd argue the study goes beyond normal market research, in part because the findings were published in an academic setting claiming a development in social psychology.
w+J said on Dec 9, 2014
why?
shhhh!!!!!! said on Dec 10, 2014
This is a very long answer to a very easy question: it's not illegal.
Kevin Greenberg said on Nov 6, 2014
I believe this poses a very broad ethical question about how obvious the terms are under which you agree to be used in various consumer experiments without being informed that you are being observed. I think that if they gave you the information and you chose to accept it, then it is up to you to agree or disagree.
Sam Wallace said on Dec 9, 2014
The Facebook terms of service is an agreement stating what can happen during use of the Facebook site. If people have chosen not to read these terms, they should not complain about their lack of being informed.
Jace Carver and Walker Louthan said on Dec 9, 2014
This asks a separate question: although it is legal, should it be? They do ask if you agree, but if you disagree they won't allow use of the app. They need to be up front with their intentions instead of hiding them. Users deserve to have knowledge about surveys pertaining to their online profiles and information. On the other hand, users should expect this type of situation when they download and use social media websites. All in all, the right thing to do would be to make sure that users are aware of the surveys they are taking part in.
Lindsay Allen said on Dec 14, 2014
I agree; they won't allow you to use the app if you don't take the survey, and that is not right. I feel it is another way to get more information about you than they are already getting. They do not explain it in depth; it is buried in the fine print.
Ashwin. J said on Apr 26, 2015
From a legal standpoint Facebook did not violate any laws, but they were clearly aware that users are not going to read tens of pages' worth of service agreements before making a profile. Facebook took advantage of that fact and slipped in this clause that made us into test subjects. Ethically, each of us has rights, and our emotional intelligence is our own, ours to keep and use as we deem fit. Facebook should have asked its users more explicitly to partake in such an experiment.