A different kind of Facebook privacy violation
Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics at Santa Clara University. Views are her own.
Last Friday, The Verge (and several other publications) reported on a leaked internal Facebook memo and the internal conversations it sparked among Facebook employees. The memo was a 2016 post by Andrew “Boz” Bosworth, a Facebook VP. BuzzFeed reports that Bosworth had “published the post to Facebook for employees’ eyes only a day after the shooting death of a Chicago man was captured on Facebook Live…”
The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned. …
We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.
There is much to unpack there. “[P]erhaps the only area where the metrics do tell the true story…” “All the questionable contact importing practices”… the “subtle language…” the work done “to bring more communication in”…
But my focus here is not on those things. It’s on the internal reaction among some Facebook employees to the leaking of the memo, and to Bosworth’s subsequent deletion of that memo.
In The Verge, Casey Newton reports that
[s]everal employees suggested Facebook attempt to screen employees for a high degree of ‘integrity’ during the hiring process. ‘Although we all subconsciously look for signal on integrity in interviews, should we consider whether this needs to be formalized in the interview process?’ one wrote.
Wrote another: ‘This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.’
That’s right: “loyalty.” Because, according to Newton, “Dozens of employees criticized the unknown leakers at the company.” The moral compass whose absence was bemoaned was that of the leaker.
And why? Because a posting that had been intended to be shared with a particular audience, in a particular context, as part of a particular conversation, had suddenly been made available to a far different/broader/unintended audience, devoid of context. And the person who had drafted the post had had no input into, and no control over, that unauthorized “sharing.” In other words, this was a privacy violation. And, in its aftermath (reports the New York Times), “[m]any [other Facebook employees] are also concerned over what might leak next and are deleting old comments or messages that might come across as controversial or newsworthy…”
Welcome, Facebook employees! Welcome to the ongoing conversation about context collapse, about semi-private posts being made public, about “likes” that seemed like private acts being publicized to an unintended audience, about, say, students being outed by design choices that turn the joining of a group into a public announcement.
Welcome, too, to the debate about the right of erasure (unfortunately and inaccurately better known as “the right to be forgotten”). And, sadly, welcome to self-censorship as one of the few means of control left to us: in one 2016 study, researchers found that the young people they interviewed “had adopted different approaches to protecting privacy, [but] the only widely agreed-upon technique was self-censoring, or leaving information off the Internet entirely.” The researchers added that “as users understand their lack of control over their information [online], they retreat in certain ways when it comes to sharing.”
After the report on his 2016 post, Bosworth took it down and posted a new memo, bemoaning, among other things, the loss of the earlier conversation. “This is the very real cost of leaks,” he wrote. “If we have to live in fear that even our bad ideas will be exposed then we won’t explore them or understand them as such… Conversations go underground or don’t happen at all.” What he didn’t acknowledge is that this is also the very real cost of various choices made by Facebook itself.
If the goal is to connect people, no matter what, then the question of what those “connections” mean is irrelevant. The platform is not there to help you communicate (with friends, or fellow employees); it’s there to “connect” you. The only metric that matters is the one that renders you nothing but a dot in a network, connected to other dots who also don’t matter.
To be fair, some of the internal comments on Bosworth’s post show that some Facebook employees do get the irony of this crisis. Here is one that was quoted in The Verge:
It’s interesting to note that this discussion is about leaks pushing us to be more cognizant of our sharing decisions. … The non-employee Facebook user base is also experiencing a similar shift … realizing that social media posts that were shared broadly and are searchable forever can become a huge liability today. A key difference between the outside discussion and the internal discussion is that the outside blames the Facebook product for nudging people to make those broad sharing decisions years ago, whereas internally the focus is entirely on employees.
Image by Master OSM 2011, cropped, used under a Creative Commons license.