How might news platforms and products ensure that ethical journalism on chronic issues is not drowned out by the noise of runaway political news cycles? In doing so, might they also produce a positive externality: that news amplifying disinformation does not rise in the feeds? These questions are the focus of the Keeping Issues Alive initiative at the Journalism and Media Ethics program at the Markkula Center for Applied Ethics.
During the second half of 2021, the program initiated a set of meetings with stakeholders from the news platforms (social media and search companies), AI Ethics, and Journalism fields. Our goal was to draw up a set of principles, definitions, and recommendations for ethical news distribution in this context. News products cover the broad gamut of social media feeds, news aggregators, and search. We held two Chatham House Rule meetings with representation from YouTube/Google, TikTok, Twitter, Partnership on AI, AI For the People, Berkeley Center for Human-Compatible AI (CHAI), the Harvard Tech & Public Purpose Program, The Factual, and a couple of independent experts.
The discussion in 2021
Our discussions helped draft a frame for news distribution ethics at platforms and news aggregators. A frame helps set the scope for decision-making that could benefit from ethics analysis and input. We wanted to identify the different types of input – principles, recommendations, guidelines – that stakeholders themselves could identify as giving them the best leverage to make change.
We looked at decision making in news distribution (at tech platforms and aggregators) including the following:
- Decisions about the supply side (news publishers and user-generated content (UGC))
- Decisions about the demand side (human behavior and news)
- Decisions independent of supply and demand (platform agency and policy about news)
Separating news distribution decision making this way helped us discover and focus on specific areas shared across stakeholders. We also discussed the following:
- The distinctions in meaning between elevation and amplification.
- Where platforms feel definitions and global standards are needed.
- A set of low-hanging-fruit decisions for publishers and platforms today, where consensus on externally settled definitions and standards is perhaps not needed.
- Whether Citizens' Assemblies, as an alternative to academic/industry consortia, can resolve the "who decides" problem that lurks in these areas.
Input from journalism reformers
The program also spoke to people at three journalism reform organizations: News CoLab (Arizona State University, Cronkite School of Journalism and Mass Communication), the Center for Community Media (Craig Newmark Graduate School of Journalism, CUNY), and Resolve Philly. They gave us the following input:
- "We can already identify best practices in (ethical) coverage through human review. The hallmarks are: Human-centered language, thematic rather than episodic framing, depth and context over sensationalism and clickbait. Some newsrooms are often convening and surveying audiences, implementing changes, and sharing their work as ‘meta-coverage’ with readers."
- Transferring this knowledge to platforms to use for elevation in algorithms is an opportunity in its own right and may help interconnect the cognitive surplus that exists in human tracking of ethical journalism with the news-AI world.
NDE roundtable participants
Aviv Ovadya, Harvard-Belfer Fellow, Harvard Technology and Public Purpose Project
Claire Leibowicz, Head, AI and Media Integrity, Partnership on AI
Arjun Moorthy, CEO, The Factual
Mutale Nkonde, CEO, AI For The People
Geoff Samek, Product Manager, YouTube
Bill Skeet, Chief Product Officer, NOBL Media
Jonathan Stray, Visiting Scholar, Berkeley Center for Human-Compatible AI (CHAI)
Tara Wadhwa, Policy Director, US Trust & Safety, TikTok
Jennifer Wilson, Curation Standards Lead, Twitter
New members, Spring 2022
Nick Diakopoulos, Associate Professor of Communication and Computer Science, Northwestern University
Cheryl Thompson-Morton, Black Media Initiative Director, Center for Community Media, City University of New York
Aubrey Nagle, Project Editor, Reframe, Resolve Philly
Kristy Roschke, Managing Director, News Co/Lab, Walter Cronkite School of Journalism and Mass Communication
Connie Moon Sehat, Researcher at Large, Hacks/Hackers and Director, News Quality Initiative
Jen Granito, Product Manager, Google News
[Irina Raicu, director, Internet Ethics at the Markkula Center, Anita Varma, formerly assistant director, Journalism and Media Ethics, and Adriana Stephan, formerly Program Lead, AI & Media Integrity, Partnership on AI, also contributed.]
Outcomes as we enter 2022
- "News Distribution Ethics" is a legitimate and distinctive goal. Just as "Content Moderation Policy" is an accepted frame around which advocacy, rules, declarations, stakeholdership, and transparency are being pushed and negotiated with the platforms on UGC, "News Distribution Ethics" is a legitimate and distinctive umbrella frame for building targeted guidance to bolster support for ethical and pro-democracy journalism in platform feeds.
- We plan to extend the meetings and expand participation this Winter and Spring. At their conclusion, we will publish a draft document with guidance for product, AI, and news-curation people, and for further stakeholder efforts down the line that can fan out from our work.
For instance, we plan to decide priorities and address the following:
- Who is or isn’t a News Publisher?
- What merits elevation?
- What is an ethics-based definition of “Newsworthy”?
Stay tuned for more this year.
Related reading
- Can News Distribution Improve? Journalism Panels Consider the Question, News Quality Initiative
- “Our Opinion: Recommendations for Publishing Opinion Journalism on Digital Platforms”, NewsQ technical recommendations paper co-authored with Patricia Lopez, Opinions Editor, Minneapolis Star Tribune. (Inputs from David Agraz, Leona Allen Ford, Jon Allsop, Rochelle Riley, and Rebecca Traister.)