
Don’t Just “Do Something.” Don’t Make Things Worse.


Irina Raicu

"Drone and Moon" by Don McCullough is licensed under CC BY 2.0.

Irina Raicu (@iEthics) is director of Internet Ethics at the Markkula Center for Applied Ethics. Views are her own.


In the wake of more mass shootings, especially the now awfully familiar shootings in schools, we all feel the need to respond: to do whatever is possible to prevent such shootings, or at least to limit their harm. School administrators, city governments, and lawmakers are all pressured to do something. And often, especially in places where conversations about gun control measures have been deemed a non-starter, they are pitched technical “solutions.”

The most recent example, in the aftermath of the tragedy in Uvalde, came from the CEO of Axon, the company best known as a maker of Tasers and body cameras. Rick Smith explained in interviews that, as a parent shocked by yet another school shooting, he felt an obligation to offer what he saw as a possible means of defense. His proposal involved small drones armed with Tasers, flying among classrooms, operated by professional pilots, and activated via an app by people within the schools that deployed the system. As one article explained, “[s]ince drones can’t go through walls, and since it would be terrifically expensive to install a Taser drone in every lockable room, Smith imagines school buildings that might include ‘small portals, effectively a slot in the wall or on top of the door.’” Aside from the fact that this would require the retrofitting of school buildings, the journalist interviewing Smith also wondered about “the potential psychological impact of being reminded on a regular basis that your classroom contains, at all times, a flying robotic Taser that can shock you into submission.”

More commonly proposed, and actually widely implemented, are tools that scour social media for posts that might precede violent actions. As Tom Simonite reported in Wired in 2018, schools were already deploying software that used very imperfect AI to flag public posts on various platforms “that may suggest conflict or violence, and tag or mention district schools or communities.” The CEO of one such company, Social Sentinel, described a “booming” business following the shooting at Marjory Stoneman Douglas High School. According to the Dallas Morning News, the Uvalde school district had purchased Social Sentinel’s services, though it’s not clear whether those services were in use at the time of the shooting there.

As it turns out, numerous school districts had purchased such services and deployed them for some time before canceling their contracts. Dallas Morning News reporter Ari Sen noted that, in his reporting on the use of such tools in Texas, one comment he had gotten repeatedly, “not only from school districts but from colleges, is that 90 to 99 percent of the stuff that they were getting from the Social Sentinel service was false alerts.” False alerts, of course, have consequences: for the people whose posts are flagged, and for the recipients of the alerts, who might become inured to them and come to ignore real threats.

In an effort to respond to anguished citizens, and to assuage their own anguish, people who hold the power to make a difference seem to be turning to “solutions” that make things worse. Surveilling students, monitoring their social media posts, and dreaming of drones in the classroom are all indicators of desperation mixed with disinformation about the effectiveness of various tech tools. A shooter might not post anything on social media. A shooter might shoot through a window, without bothering to enter a school building. A shooter might act near, not within, a “hardened” school. We will never be able to harden all our public places, and monitoring of social media posts, even if it involved fewer false alerts (and even greater violations of privacy), would still miss much.

The reporter who interviewed Rick Smith for Slate, Faine Greenwood, made a startling but accurate observation: “In 2022, a future where we stick Taser drones in every classroom feels much more attainable than one where the U.S. passes effective gun control laws.” That is not a future we should be willing even to experiment with. While some tech tools might indeed be helpful, they should be considered in conjunction with, not as a replacement for, regulations that address the role of guns. We need to push for laws that restrict access to guns, especially to the kinds of weapons that can’t be justified by any civilian need and that have caused the greatest carnage in mass shootings; at the same time, we need to push against the deployment of ineffectual technology tools that sap much-needed school funds while making things worse for the very students they claim to protect.


Jun 14, 2022