
Reducing the Spread of Misinformation Online


Five relatively easy things to do

Irina Raicu

Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics. Views are her own.

Back in 2013, we asked a number of Silicon Valley pioneers to discuss what they saw as key issues in internet ethics. Charles Geschke (co-founder and former chairman of the board of Adobe, and a member of our center’s advisory board) focused on the need for accuracy and trustworthy information online. He argued, in part, that

it’s really up to the individual to make decisions about where he or she gathers her information and her news; and unless we become much more sophisticated in how we do that, we’re apt to find ourselves reacting to information that is totally inaccurate, and making decisions that could impact both our personal lives, our families’ lives, and, in fact, the life of the country—in terms of political campaigning.

His warning anticipated many of today’s conversations (or raging arguments) about online disinformation and the toll that it has taken—and continues to take—both on individuals and on our political system.

The scope of the issue requires action on a variety of fronts. As researcher Kate Starbird writes in a recent article titled “Disinformation’s Spread: Bots, Trolls, and All of Us,” “disinformation is not as cut-and-dried as most people assume: those behind disinformation campaigns purposely entangle orchestrated action with organic activity. Audiences become willing but unwitting collaborators, helping to achieve campaigners’ goals. This complicates efforts to defend online spaces.” She adds that “[s]olving this will take a level of collaboration across platform designers, policymakers, researchers, technologists and business developers that is, frankly, hard to imagine.”

There is a lot of heavy lifting that needs to be done by all of those folks—but solving the problem will also require action from all of us who simply consume and share information via the internet. Luckily, there are some things we can do that are not particularly difficult. I listed five of them in an article that initially appeared in Santa Clara Magazine; I thought I’d revisit that list here:

  • Don’t share news stories based on the headline alone (without actually reading the linked article).
  • Don’t share in anger. The few seconds you take to vet a story will also serve as a cooling-off period.
  • Before sharing a link, especially if it comes from a source you don’t recognize, go to Snopes.com, Factcheck.org, or Politifact.com, and use those services to check on the accuracy of the linked story.
  • If those sites don’t address the story, search for the headline on Google (or whatever search engine you prefer). If the story is fake, it’s likely that articles debunking or questioning it will appear in the results, too.
  • If, after those steps (which shouldn’t take very long), you’re still not sure whether a story is true or not, don’t share it. Your family and friends aren’t likely to be permanently deprived of key information by your choice, but the ecosystem may well be improved. This is especially true in light of the phenomenon of “availability cascade,” which, as Wikipedia notes, is a “self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse.” Even false stories start to gain an aura of credibility if repeated or shared often enough (“I think I heard that before, didn’t I? There must be something to it …”). But, on the Internet, the presence of smoke doesn’t always signal the presence of fire. Sometimes, it’s just smoke and mirrors.

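For readers inclined to script away small chores, the fact-check lookup described in the third and fourth steps can even be partly automated. The sketch below, in Python, is purely illustrative—the helper name and the exact query format are my own choices, not anything the fact-checking services prescribe; it simply builds a search URL restricted to the three sites mentioned above:

    from urllib.parse import quote_plus

    # Fact-checking sites mentioned in the steps above.
    FACT_CHECK_SITES = ["snopes.com", "factcheck.org", "politifact.com"]

    def fact_check_search_url(headline: str) -> str:
        """Return a Google search URL for the headline, restricted to
        the fact-checking sites listed above."""
        site_filter = " OR ".join("site:" + s for s in FACT_CHECK_SITES)
        return "https://www.google.com/search?q=" + quote_plus(headline + " " + site_filter)

    # Example: print a vetting link for a headline before sharing it.
    if __name__ == "__main__":
        print(fact_check_search_url("Headline you are about to share"))

Pasting the resulting link into a browser offers a quick look at whether any of those services have already weighed in on the story.
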
Some of those steps require a bit of time and effort; more importantly, they require some self-restraint. And that restraint is especially important when a breaking story seems to confirm or support precisely what we already believe. As Starbird points out, “we may have trouble seeing the problem when content aligns with our political identities.” She adds that “[p]erhaps the most dangerous misconception is that disinformation targets only the unsavvy or uneducated, that it works only on ‘others.’” (Note, for example, the range of false claims addressed in the August 5th Politifact article.)

We don’t have to allow ourselves to be “unwitting collaborators” in the disinformation war. Self-control is a virtue, and virtues are habits. Today’s internet offers frequent opportunities for us to work on developing this particular one.

Aug 7, 2019