
On Internet-Connected Toys and Human Flourishing

[Photo: face of Hello Barbie doll]

Hello, Privacy

Irina Raicu

Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics, Santa Clara University.  Views are her own.

Last week, as part of Loyola University’s seventh annual Symposium on Digital Ethics, I gave a brief talk about the “Internet of Toys.” The title came from one of my earlier blog posts—“Et tu, Barbie?”—but in the talk I tried to focus more sharply on the ethical implications of internet-connected toys.

In the process, I described an in-class activity that I came up with a few months ago, when I was invited to guest-teach a Software Engineering Ethics class here at Santa Clara University.

The course is directed at graduate students. This particular class session was to focus on privacy. So, after a brief warm-up exercise intended to clarify my view of informational privacy (that it’s not about secrecy but about having some measure of control over which information we disclose to whom, when, how, in what context, etc.), I asked the students to break up into small groups.

And then I asked them to come up with the most privacy-invasive device or tool they could think of.

Here are the questions I wrote on the board to help them in that effort:

  1. Where does the device “live” or reside?
  2. What kind of information does it collect?
  3. From/about whom?
  4. What does it do with the collected data?
  5. Who has access to the data collected?
  6. Where is the data stored? Is it encrypted?
  7. How long is the data stored for?
  8. What purpose does the device aim (claim) to serve? Why?

(The questions are a work in progress. That last one is designed to get at the benefits v. harms analysis, but I’m not sure it quite does the job.)
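For readers who teach with these questions, or who build such products, the checklist also lends itself to a lightweight review artifact. The sketch below, in Python, is one possible way a design team might record answers to the eight questions for a given device; the PrivacyReview structure, its field names, and the sample answers are illustrative assumptions of mine, not part of the exercise as taught.

# A rough sketch, for illustration only: one way a design team might
# capture answers to the eight privacy questions for a given device.
from dataclasses import dataclass, asdict

@dataclass
class PrivacyReview:                      # hypothetical structure, not from the original exercise
    device: str
    location: str                         # 1. Where does the device "live" or reside?
    data_collected: str                   # 2. What kind of information does it collect?
    data_subjects: str                    # 3. From/about whom?
    data_uses: str                        # 4. What does it do with the collected data?
    data_access: str                      # 5. Who has access to the data collected?
    storage_and_encryption: str           # 6. Where is the data stored? Is it encrypted?
    retention: str                        # 7. How long is the data stored for?
    stated_purpose: str                   # 8. What purpose does the device aim (claim) to serve?

# Example answers are invented placeholders for an internet-connected doll.
review = PrivacyReview(
    device="internet-connected doll",
    location="a child's bedroom",
    data_collected="audio recordings of the child's speech",
    data_subjects="young children and anyone nearby",
    data_uses="speech recognition and scripted responses",
    data_access="the toy maker, its cloud vendor, and the child's parents",
    storage_and_encryption="third-party cloud servers; encryption unspecified",
    retention="unspecified",
    stated_purpose="conversational play",
)

# Print the filled-in checklist, one question per line.
for field, answer in asdict(review).items():
    print(f"{field}: {answer}")

Even in this toy form, writing the answers down makes gaps visible: "unspecified" retention or encryption is itself an answer worth discussing.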

After a while, I had a few of the groups present their ideas. The first group described a product that would be placed in a bathroom. And would collect photographs. Of… financial data! (At that point, I was, um, relieved to hear that.)

Other groups described products that collected information about particularly vulnerable populations. Of course, the data collected would be publicly accessible, or sold to whoever was willing to pay for it. And as for how long it would be stored? Given the project’s stated goal, the almost unanimous answer was “forever.”

Then we talked about “Hello, Barbie,” the actual internet-connected toy that is currently on the shelves—and in some children’s bedrooms. We went through the questions one by one. It was an interesting conversation (especially since more than half of the students in the class were women, and some had children of their own).

Needless to say, the point of the exercise is to make future software engineers stop and think if (or when) they find themselves answering such questions in privacy-violating ways as they design actual products and services. And internet-connected toys, targeted, as they are, at young children, should be designed with even more care than most products. Such toys can undermine children’s privacy, and in the process undermine their developing autonomy and sense of trust. If a child speaks to his/her doll or teddy bear or dinosaur and then finds out that the “conversation” was recorded and transmitted to the child’s parents (not to mention uploaded to the “cloud” to be used by various entities for various purposes, which might be hard to explain to a young child), the child might well feel betrayed. (See reactions cited in a study by researchers from the University of Washington: “Toys that Listen: A Study of Parents, Children, and Internet-Connected Toys.”)

During the talk at Loyola, I pointed out, as well, that the ethical analysis of such products doesn’t end at the design, manufacturing, or even the marketing stage. As my colleague Shannon Vallor puts it, software is “a moving ethical target; features change, users change, platforms change, contexts change. A benign [product] can become ethically problematic with a single bug fix or update.” Vallor, who frequently teaches ethics in classes directed at engineers, also points out that “ethical issues cannot be fully anticipated; ethics is about responsiveness, not just foresight & prediction.”

Foresight and responsiveness: Internet-connected toys require a significant ethical commitment on the part of their creators, if they are to contribute to, rather than undermine, human flourishing.

Additional resources for teaching about—or designing, building, and marketing—internet-connected toys:

The Markkula Center for Applied Ethics’ “Framework for Ethical Decision Making”

“The Goodbye Fears Monster”: An Ethics Case Study

Photo by portal gda, cropped, used under a Creative Commons license.

Oct 20, 2017