Markkula Center for Applied Ethics

Update Your Terms and Conditions

Maria Lutgarda Glorioso

As I scoured the internet for stories about Facebook, Zuckerberg, the congressional hearings, and civil discourse, I kept thinking about a visit to Burger King back in December 2017. Why Burger King? I hardly eat fast food. But in this one instance in December 2017, I craved a vegetarian burger from Burger King. Phone in hand, I briefly mentioned to my partner that I wanted to drive to a Burger King. As we waited in line at the drive-through, I scrolled through my Facebook feed. Lo and behold, a Burger King advertisement popped up. Was I surprised? Not really. I had never searched for Burger King on a search engine or clicked on any related content. Simply put, as we now know, Facebook listened to everything I said via my cell phone microphone, picked up on me saying “Burger King,” and promptly followed with an advertisement. Aside from the infringement on my privacy, this experience exemplifies the issue of civil discourse. Facebook responded to my real-life conversation and presented my online presence with something it thought might appeal to me. Facebook’s clever (and invasive) marketing tool is what has now become the center of public debate. It creates echo chambers based on the information it collects from our online activity outside the website, such as microphone-captured conversations, search engine queries, and visits to other websites. Recently, many of the applications on my phone have sent me notifications about updates to their terms and conditions. I believe that as consumers, we ought to update our own personal terms and conditions by demanding that social media platforms like Facebook act as arbiters of truth and civil discourse.

Facebook as an Accelerant

In the US, this dynamic has already proven to be a problem for civil discourse, most visibly during the presidential election, when certain stories and posts were given far more attention than others, providing biased exposure to certain topics. More recently, a New York Times report on conflict in Sri Lanka highlighted the issue of social media creating “alternate realities.” According to the article, fake memes and posts circulated among Buddhist Sri Lankans, turning them against Muslims. The examples in the article demonstrate the integral role Facebook plays in igniting conflict by failing to flag inflammatory content. Many people flagged the fake posts, but Facebook did not respond in a timely manner, and when it finally did, it claimed that the posts did not violate its community standards. The violence that sprang from these conflicting identity politics shows the power Facebook has had over civil discourse and reality as a whole. Similarly, this blog discusses the implications of Facebook posts about the Rohingya genocide in Myanmar. Many attribute the lack of exposure given to the genocide to the propaganda of the Myanmar government, which denies participation and instead deflects blame onto the Rohingya for wide-scale violence, displacement, and death. In addition, an open letter to Facebook from Myanmar civil society organizations outlines the primary issues with Facebook as:

  • “an over-reliance on third parties (to flag content, if they came across it in time, rather than monitoring for such content itself);
  • a lack of proper emergency escalation mechanism (it had taken days for Facebook to step in after the organizations had tried to raise concerns about the messages, and they went viral in the meantime);
  • lack of engagement with local stakeholders (requests to talk to Facebook’s engineering and data teams about systemic solutions had gone unanswered); and
  • a lack of transparency (seven months after an incident Zuckerberg cited as a success story because Facebook had blocked a series of messages inciting specific violence, the organizations still did not know the details of what had happened).”

 

Update to Our Terms and Conditions

The cases of Sri Lanka and Myanmar are stark examples of bias and of the threat to civil discourse. How can we, as informed citizens, make educated decisions when Facebook, a primary source of up-to-date news and events, reveals only one side of the story? We must be wary of the information that is shared and disseminated by Facebook and our Facebook friends. Facebook’s algorithms put certain friends on our news feeds more often than others for a reason. We may be trapped in echo chambers that we do not want to be a part of, yet find no way out. To be well informed and educated about current events, we must also be critical of the medium and the messenger. It is essential to understand the politics around Facebook and social media and how they affect the way we communicate with others in real life. I propose that we, as a community, update our terms and conditions and demand more transparency and action from social media to prevent fake news, “alternative facts,” and unsound posts from circulating.

May 8, 2018
