
Will Facebook’s News Tab be for ‘the Public’ and the Newsfeed for the ‘Users’?

Facebook Sign (AP Photo/Ben Margot)

Subramaniam Vincent

Subramaniam Vincent is the director of Journalism & Media Ethics at the Markkula Center for Applied Ethics at Santa Clara University. Views are his own.  

Facebook's News Tab feature, to be rolled out soon, was in the news in August. Emily Bell of Columbia University wrote for the Columbia Journalism Review (CJR) about what the News Tab rollout means for the news industry, especially given the history of Facebook’s prior moves. She remarked that “the slow, forced integration of news into large tech companies continues.” Bell also compared it with Apple News, among other offerings.

The comparison to Apple News focuses on two fronts: journalists involved as editorial specialists curating top stories, and platforms paying publishers. While all of this analysis, some of it critical of Facebook’s past, is necessary from the supply side (the news producers and platform distributors), it does not complete the picture.

For a more complete picture, we need to ask what the so-called News Tab means for the public, i.e., the two billion or so people who will eventually see and use it. For its part, Facebook’s head of news partnerships, Campbell Brown, said on Twitter that the goal was to provide people with a “better news experience.”

The question is whether the News Tab will help the public more seamlessly access credible local and national journalism without stumbling on junk along the way. If so, what are the likely pitfalls ahead? This is the demand-side perspective.

To me, Facebook’s decision to offer the News Tab is an acknowledgment that journalism was always going to be in conflict with the sorting logic of AI-led personalization and content recommendation systems. There are two facets to this conflict.

One, news is a public good. Treating news as equal, from a user-interest perspective, to all other posts (friends, family, hyperpartisan rhetoric and what not) and subjecting it to topic-based and social personalization was, and is, deeply and ethically problematic. Why? Because it conflated the public interest with ‘user’ interest, and those are two entirely different things. Two, platform media products’ AI systems have tended to treat all types of news-like content, of journalistic origin or otherwise, as just “fresh content.” Until recent efforts started to bear some fruit, this gave disinformation and misinformation equal-opportunity access to massive resources in the Internet information supply chain.

To be fair, this conflict was not Facebook’s or Twitter’s invention. All content-based products dove right into it, taking off from the 2000s-era utopian faith in the Internet’s deep promise of democratizing voice. But the contest between the hard work of journalism and our grandmother’s birthday, much as I’d love to hear about the latter, is most extreme on social platforms, as opposed to search engines or even news-only aggregators.

This is not to say that the news industry is saintly and hence always merited affirmative action from the big private platforms. We bear our fair share of the blame. For years, the news industry has remained caught up in its own ultra-competitive and sensation-seeking race to the bottom for eyeballs and dollars. Social media design simply amplified those bad tendencies in news production by mixing the output of journalism with everything else.

Social media leaves it to us as individual users to somehow decide what news we, as individuals and in our groups, must know about. The engagement machinery oriented toward fast thinking (clicks, likes, shares) became a purposeless feedback loop between us as users and the news suppliers (journalists and editors overseen by business models). This socialized feedback loop privileges the “news we want” over the “news we need.” The latter is connected to the common good. But amid an overabundance of content and the frictionless design of media, we are more likely to click and share very quickly, and the evidence across much research is that this privileges the “news we want.” Our belief and confirmation biases tend to drive our selections.

There’s more. To me, the most worrisome move by social media companies in the 2000s was to usurp the word “news” and join it with “feed,” inventing the word “newsfeed.” The currency of the word “news” was used to elevate this grand thing called the “feed.” This “newsfeed” carries a mix of everything from news (from journalism outlets with varying standards of ethics) to misinformation to toxic disinformation.

So now, by separating “news” from its “newsfeed,” the world’s leading social media platform is taking a different step. There is some recognition that “news” is different and must be available separately. This is a new opportunity to separate journalistic actors from the rest on social media. 

We do not know how this will pan out. It’s an important step toward dissolving an unholy union that was never going to bring good to humanity. The fuller implications of a News Tab will only become clear as the feature is rolled out and executed worldwide across geographies, each with its own cultures of journalism, news industry associations, and public perceptions of the work reporters really do. For instance, whether this will actually diminish the power that false and malicious content has with users on Facebook depends on the rest of the design.

Every industry has a code of ethics that it often uses to explain major decisions, ones of public significance. Facebook, in rolling out the News Tab, did not explain in plain language its ethical reasoning for the decision. Campbell Brown tweeted an announcement of sorts, using a New York Times article to confirm ongoing talks with publishers. There is no press release on newsroom.fb.com as I write this. By Facebook’s announcement standards, this was meek.

It would appear that the social media giant is keeping people guessing on many fronts. Election 2020 is nearing, and campaign-related disinformation cycles will start expanding soon. Facebook would do well to answer several questions. For instance: 

1. Will Facebook allow people to add their preferred hyperpartisan sources to the News Tab, or to complain about the absence of some of their favorite sites? (Apple, Google, and SmartNews all disallow custom source additions, so it seems the answer will be no.)

2. What happens when users share a news-like disinformation article with each other in their normal newsfeed, which is not the News Tab, and that article goes viral? Would Facebook recommend that users turn to the News Tab to access higher-credibility coverage of the same story, i.e., not draw their learnings from low-grade, no-veracity posts?

3. What happens to the long tail of legitimate local and regional news outlets? Would the News Tab's algorithmic back end use personalization anyway to sort the long tail for users ("I live in place X, so here's local news from place X")?

You and I, and others, will have more such questions. Stay tuned.

Oct 1, 2019