
Autocompleted

[Screenshot: Google autocomplete suggestions for "Is Google…?"]

Irina Raicu

Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics at Santa Clara University.  Views are her own.

Ever so long ago, back in December of 2016, The Observer published an article titled “Google, Democracy, and The Truth about Internet Search.” In it, journalist Carole Cadwalladr detailed her experience with Google’s “autocomplete” function, which tries to anticipate what you are about to type in the search engine’s query box, based (Google says) on what users search for most.
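
Google has not published the details of its ranking, but the basic mechanic described above, matching a typed prefix against a log of past queries and surfacing the most popular completions, can be sketched in a few lines. Here is a minimal illustration in Python; the query log and its counts are invented for the example, and the real system layers filtering, freshness, and personalization on top of anything this simple:

```python
from collections import Counter

# A toy "query log": in reality this would be aggregated from
# billions of searches. These entries and counts are invented.
query_log = Counter({
    "are zoroastrians muslims": 940,
    "are zoroastrians monotheists": 720,
    "are zoroastrians vegetarian": 510,
    "are zoroastrians persecuted in iran": 430,
    "are zebras black or white": 300,
})

def autocomplete(prefix: str, k: int = 4) -> list[str]:
    """Return the k most frequent logged queries starting with prefix."""
    prefix = prefix.lower()
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in matches[:k]]

print(autocomplete("are zoroastrians"))
# ['are zoroastrians muslims', 'are zoroastrians monotheists',
#  'are zoroastrians vegetarian', 'are zoroastrians persecuted in iran']
```

The ethically interesting decisions begin where this sketch ends: in the layer that decides which popular completions are suppressed, and for which topics.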

“Here’s what you don’t want to do late on a Sunday night,” Cadwalladr wrote.

You do not want to type seven letters into Google. That’s all I did. I typed: “a-r-e”. And then “j-e-w-s”. Since 2008, Google has attempted to predict what question you might be asking and offers you a choice. And this is what it did. It offered me a choice of potential questions it thought I might want to ask: “are jews a race?”, “are jews white?”, “are jews christians?”, and finally, “are jews evil?”

Cadwalladr went on to discover that “are women evil?” was the first autocomplete option for “are women…” For Muslims, the first was “are muslims bad?”

The article cited Danny Sullivan, the editor of SearchEngineLand.com, who commented that he had thought Google had “stopped offering autocomplete suggestions for religions in 2011.” Others had thought so as well. It seemed like a good idea.

The day after Cadwalladr’s article was published, The Observer published another article, titled “Google alters search autocomplete to remove ‘are Jews evil’ suggestion.” It noted, however, that only some of the problematic autocomplete results discussed by Cadwalladr had been addressed, and that Google “did not comment on its decision to alter some but not all those raised in the article.”

Last week, I was preparing to guest-teach a Data Ethics class for which the students had been asked to read Cadwalladr’s piece. In anticipation, I decided to take a look at the current state of Google’s autocomplete. I started by googling “are jews,” fully expecting to get no autocomplete options. I was wrong. Autocomplete kicked in, offering me four options: “are jews a race,” “are jews white,” “are jews christians,” and “are jews circumcised.” Intrigued, I tried a different religion: Zoroastrians. Here’s what I got: “are zoroastrians muslims,” “are zoroastrians monotheists,” “are zoroastrians vegetarian,” and “are zoroastrians persecuted in Iran.”

I then tried another: Christians. For that, however, Google offered no autocomplete options.

Was nobody on the Internet googling for information about Christians?! How does Google decide which topics get no autocomplete suggestions at all? And is it ethical to offer autocomplete suggestions for some religions but not others?

As Cadwalladr wrote, “Google is knowledge. It’s where you go to find things out.” She also noted, though, that “are Jews evil” was “not a question [she’d] ever thought of asking.” So, via autocomplete, Google is a provider of questions, as well as answers. And Google makes decisions about both.

In anticipation of class discussion, I tried a few more queries: “Are students…”; “Are Californians…” Typing “Are women” yielded two autocomplete options: “Are women required to register for the draft” and “Are women equal to men.” I clicked on the second one, and the first result was an article titled “7 Things to Know If You Think Women Are Equal to Men,” about inequalities that persist in the U.S. The fourth result on my list was titled “Why We Need to Stop Telling Women They’re Equal to Men”; it was from the Indian edition of the Huffington Post.

Sitting in my office in California, I wondered about young girls around the world who might be clicking on the autocompleted question “Are women equal to men,” searching for answers. The autocomplete suggestion was not offensive in itself, but the list of results that it led to was problematic. It suggested that the search engine interprets the question in a particular way (“Are women measurably treated the same way as men in certain societies?”)—but the question could be interpreted in a number of ways, including “Are women inherently equal to men?”

In gray-on-white, tiny, italicized font, below and to the right side of the search box, Google now offers an option: “Report inappropriate predictions.” I was tempted to. But that feature seems to be aimed at addressing offensive autocomplete predictions like “are muslims bad,” not other kinds of predictive inappropriateness.

Today, I tried to google “Are christians…” again. This time, autocomplete did offer one option: “Are christians allowed to eat pork.”

And so, of course, I had to try “Is Google…” Autocomplete offered me four options: “Is google down,” “is google making us stupid,” “is google a number,” and “is google drive down.” Questions reflective of both reliance and concern.

It is a heavy burden to be perceived as the provider of knowledge and the oracle of truth. Google has put itself in that position. Is Google equal to the task?

May 9, 2017