A Call for Input on Best Practices in Content Moderation
Irina Raicu is the director of the Internet Ethics Program at the Markkula Center for Applied Ethics. Views are her own.
Recently, both Facebook and Twitter responded to posts shared by President Trump and his campaign: Twitter hid a post that included a video clip in which the president said that children are “almost immune” to Covid-19, and barred his campaign’s account from tweeting until the campaign took down the post (the account is now tweeting again); Facebook removed the post with the video from the president’s official page. Twitter also announced new policies around labeling state-sponsored media accounts. The content moderation environment is changing every day, amid a politically charged context in which election- and pandemic-related misinformation continues to spread, and active disinformation campaigns target people around the world.
Back in 2018, in conjunction with the first Content Moderation at Scale conference held at Santa Clara University, a group of academics and advocacy organizations developed the Santa Clara Principles on Transparency and Accountability in Content Moderation. (I was one of the original signatories.) The principles constitute a demand for transparency (calling for companies to disclose the number of posts taken down as part of content moderation efforts, broken down into multiple categories for clarity); notice (calling for companies to provide users with timely and detailed explanations for the removal of any of their posts); and appeal (proposing standards for meaningful and timely appeals of take-down decisions).
Some companies responded: as the Electronic Frontier Foundation noted in their “Who’s Got Your Back?” 2019 report on content moderation practices, 12 of the 16 companies featured in their analysis (including Facebook, Twitter, YouTube, Reddit, and LinkedIn) endorsed the Santa Clara Principles. However, the report added that “[s]ome content moderation best practices are seeing wider adoption than others.” For example, “[a]lthough providers increasingly offer users the ability to appeal content moderation decisions, they do not as consistently provide users with clear notice and transparency regarding their appeals processes.”
While much remains to be done in terms of meeting the initial demands of the Santa Clara Principles, their proponents are also calling for feedback on whether the principles should be updated to reflect “the ever-changing content moderation landscape.” They have issued a request for input, hoping, in particular, to hear “from groups and individuals from the Global South, and those who represent marginalized communities that are heavily impacted by commercial content moderation practices.” Any interested parties, however, are encouraged to submit their responses to a series of questions—by September 1st. Among those questions: “Are there regional, national, or cultural considerations that are not currently reflected in the Santa Clara Principles, but should be?” “Are there considerations for small and medium enterprises that are not currently reflected in the Santa Clara Principles, but should be?”
We recognize that many constituents and stakeholders of the Markkula Center for Applied Ethics are deeply interested and involved in issues related to the ethics of content moderation, and we strongly encourage them to send in their answers to those questions, and any additional input, in order to help shape the reassessment of the Santa Clara Principles.
Photo by Irina Raicu