Banning Trump on Social Media

Do Internet services have the legal discretion to terminate or remove users' accounts?

Internet services routinely terminate users’ accounts and remove their content for violating those companies’ content rules and policies. But the stakes got even higher when many of them banned former President Donald Trump and some of his allies after the Jan. 6 insurrection at the U.S. Capitol.

Trump’s unfounded allegations of voter fraud and his denial of his 2020 election loss helped incite the deadly assault, and concerns that he would instigate further violence prompted the social media companies to “deplatform” him.

A lawsuit Trump filed in July against Facebook, Twitter, and YouTube, alleging that he and others have been wrongfully censored, lands squarely in the wheelhouse of Santa Clara University Law Professor Eric Goldman, who addresses the issue in an Aug. 27 article in the Journal of Free Speech Law.

The nationally renowned tech law scholar and Jess Miers J.D. ’21 explore two questions:

1) Do Internet services have the legal discretion to terminate users' accounts and remove their content?
2) What would happen if the government restricted or removed that discretion?


In their review of 62 U.S. judicial opinions involving Internet services’ user account terminations and content removals, the pair found that the Internet services consistently prevailed in court, confirming the companies’ legal freedom to enforce their private editorial policies, or so-called “house rules.”

But Trump is among many, including numerous regulators, who want to change the legal status quo and restrict that editorial freedom. Yet, as Goldman and Miers conclude, instead of promoting free speech, such a legal revision would counterproductively reduce the number of voices that get to speak online. Thus, laws imposing “must-carry” requirements on Internet services to carry user content they would otherwise choose not to carry would exacerbate the very problem they purport to solve.

In a recent conversation with Professor Goldman, we asked him about the new research, which, as non-lawyers, we found easy to read and digest.

On the whole, many people believe deplatforming Trump was good for democracy. But critics ask, “Today it's Trump; who will it be tomorrow?” Should that be a legitimate concern for Americans of all stripes?

I don't think it's a legitimate concern. "Deplatforming" Trump didn't take away his power to speak online. Indeed, he has found other outlets to share his message. That's true for Americans of all stripes. Trump valued access to Twitter's audience, but he refused to play by its rules. So he had a choice: play by Twitter's rules or lose access to the audience. He chose the latter. The more serious situation is if someone gets kicked off a platform like, say, Zoom, but their university or employer requires them to be on Zoom. That's a totally different dynamic from Trump, who is incapable of adhering to other people's rules and then complains about how unfair it is that he suffers the predictable consequences.

Yet, days after Trump was banned from Twitter in January, a Slate article asked: Do we need new laws to govern the platforms—laws ensuring that the decisions they make regarding speech are more transparent, less ad hoc, and reflect values beyond the companies’ motivation to make money? 

I don't favor these rules. Internet services are private publishers making editorial decisions. Just like we wouldn't tell Fox News or the New York Times how to run their editorial operations, we shouldn't tell private Internet services either. Indeed, I am far more afraid of the government controlling publishers than I am of any editorial decisions that publishers independently make. 

Your article notes that must-carry rules would also do something many Internet users hadn't considered: strip Internet services of the ability to create advertiser-friendly environments. “As the quality of user content drops, advertiser dollars will migrate to safer advertising venues. Thus, must-carry rules guarantee that Internet services’ advertising revenues will drop or disappear.” Whoops!

If legislators were concerned about making good policy, they would pay close attention to the ways that their laws affect business models and the countermeasures that businesses will take in response to regulation. Because many legislators have abdicated these fundamental responsibilities to their constituents, we see far too many regulatory proposals that will have the obvious effect of undermining the structure of the Internet we value so highly. Yet, voters will continue to support these legislators because they are pretending to be tough on "Big Tech."

You write that for every high-profile decision to terminate users’ accounts and remove their content, many thousands of other termination and content removal decisions generate minimal attention. Beyond lawsuits filed over these terminations, do social media companies regularly publish data about the terminations and the reasons for them? If not, should they?

Some services publish "transparency" reports that provide details about their content moderation operations, including (in some cases) removals or terminations. However, these transparency reports often don't really help us understand what's going on "under the hood," and even enhanced transparency reports may not help much. Each content moderation decision is an individualized judgment that depends on context we as outsiders may not see. Getting more stats doesn't really tell us whether the Internet services made the "right" decisions.

According to your research, virtually every media enterprise has adopted "house rules," but the specifics of those rules can vary widely among Internet services. Which social media company's "house rules" do you believe are the best or most complete? How would you improve the version you think is best?  

This isn't answerable in the abstract because house rules don't exist in the abstract. Each service has a different community with different needs, so the real question is: which service has the best house rules for its community? If I'm not part of the community or target audience, I can't judge it fairly as an outsider. In general, the most active and thriving communities have the "best" house rules because they've found a way to get the most engagement from their users.
