
Facebook’s Ethically Incoherent Response to Manipulated Content

Anita Varma

Image: AP Photo/Eric Risberg

Anita Varma is the program manager for Journalism & Media Ethics as well as Business, Leadership, and Social Sector Ethics at the Markkula Center for Applied Ethics. Views are her own.

Manipulated videos of Speaker of the House Nancy Pelosi have brought Facebook back under fire for not removing politically incendiary misinformation from its platform. Facebook opted instead to provide users with context from fact-checking networks and said that it would not promote the video in the News Feed, but maintained that the content would remain on the platform.

When pressed to justify the decision in an interview with CNN’s Anderson Cooper, Facebook Vice President for Product Policy and Counterterrorism Monika Bickert stuck to a consistent and ethically troubling talking point: it’s up to people to decide for themselves.

“Why keep it up?” Cooper asked.

“We think it’s important for people to make their own informed choice about what to believe,” Bickert replied.

Curiously, Bickert was insistent that Facebook was providing users with abundant notification that the video was false, yet also said that Facebook should not decide what is true or false. Doing so, she said with incredulity, would amount to asking “a technology company to be the decider of what is true and what is false.”

Instead, Bickert asserted, Facebook’s role is to act as a conduit for independent fact-checkers “and then put that information in front of people so that they can make an informed choice.”

Bickert’s refrain that individuals should decide for themselves mirrors the questionable logic of responsibilization. A neoliberal concept, responsibilization means “subjects are rendered individually responsible for a task which previously would have been the duty of another – usually a state agency.” Echoing invocations of media literacy as a remedy for misinformation, responsibilization shifts duty from “higher authorities to…individuals who are then called on to take an active role in resolving their own problems.”

From the perspective of a company like Facebook, the rhetoric of responsibilization provides an excellent escape hatch from the more gnarled question of whether Facebook’s reach and political content are harming society. Instead, its argument seems to be that it’s up to people to decide, in which case Facebook is not falling short of its responsibility but actually excelling by providing people with additional context to make decisions for themselves.

From the perspective of people living in a Facebook-fractured world, however, Facebook’s response sounds disconcertingly like a chemical company dumping hazardous waste in a residential neighborhood and then declaring it residents’ responsibility to move if they are bothered by getting cancer.

Abdicating responsibility for a problem is often an effective public relations tactic, but it is a far cry from public accountability. Throughout the interview and elsewhere, Facebook’s representatives have persistently sidestepped the ethical question of whether political content that is verifiably false should be disseminated on a platform with billions of users. Facebook’s polished refusal to reckon with its role in misinformation is predictable, if frustrating, and its strategic use of responsibilization suggests that it is time to abandon any last remnants of hope that Facebook could be an instrument for social good.

Since Facebook is unwilling to acknowledge and act upon its central role in the misinformation landscape, a role far greater than that of a common carrier or innocent conduit, we need to recognize Facebook for what it is and seems intent on remaining: a platform that fosters misinformation by design, lets hate fester, and placidly insists that this is not only fine but a desirable state of affairs.

Moving from bewildering to bizarre, Bickert’s characterization of what kind of business Facebook is in drove a final nail into the coffin of the notion that Facebook is fighting misinformation rather than proliferating it. Asked about Facebook’s responsibility as a news business, Bickert replied, “We aren’t in the news business, we’re in the social media business.”

If Facebook is truly not in the news business, then its community standards should be updated to prohibit the sharing of news content on the platform. Furthermore, if Facebook refuses to provide a coherent justification for its actions, one that does not depend on a laughable attempt to minimize its role, then it seems long past due that people, whose responsibility it apparently is to decide for themselves, get out of Facebook’s social media business before its societal, political, and cultural harms take us past the point of no return.

May 31, 2019