The Bewitching Miss Julia

A review of Life on the Screen: Identity in the Age of the Internet, by Sherry Turkle

By Tim Healy
Life on the Screen, Sherry Turkle's new book about the Internet, describes an online encounter between two players in a virtual environment: a young man named Barry meets Julia, whom one critic has called "a hockey-loving ex-librarian with an attitude." Barry decides to woo Julia, attempting to talk her into entering his virtual apartment, presumably with nothing good in mind. One snatch of typed conversation begins like this:
Barry says, "Hello, how are you?"
Barry and Julia are conversing in a multiple-user domain, or MUD. To create a MUD, a programmer writes a basic script describing some environment, perhaps a spaceship, an enchanted island, or an old mansion. MUD players then sign on through the Internet, adopting any identity and characteristics they choose, building virtual rooms or spaces described in words on the screen.
The screen identity of a player may have little or no relation to his or her real self. Players often change gender, age, appearance, level of wealth, etc. In fact, as Barry eventually learns, a MUD character may have no real self at all. The bewitching Miss Julia continues to resist his advances for several days until, at last, Barry learns that he has not been wooing a real woman at all.
Julia is a bot, a computer program written to respond reasonably in conversations. She began her existence in the mind of some programmer, but today she independently roams the halls of her MUD, virtually captivating young men.
Encounters like Barry and Julia's are at the heart of Turkle's examination of "identity in the age of the Internet," her book's subtitle. She has been exploring such matters for 20 years, most recently as a professor of sociology at the Massachusetts Institute of Technology, where she teaches about the relations between humans and computers. Her research prompts her to ask, "Is the real self always the naturally occurring one? Is the real self always the one in the physical world? As more and more real business gets done in cyberspace, could the real self be the one who functions best in that world?"
These questions have important moral ramifications. What are the ethics of swapping identities? Does someone who signs onto the Internet under a different persona violate the virtues of honesty, trustworthiness, and integrity, or do the rules of the game allow such tinkering?
You might conclude that all is fair among consenting players. In fact, such games can be seen as a way of sparking moral imagination. As Turkle puts it, "We can use it as space for growth." By trying out new personae, we can, she argues, learn more about our true selves and the nature of artifice. Then we can return from cyberspace enlarged by our experience. "Virtuality need not be a prison," she writes. "It can be the raft, the ladder, the transitional space, the moratorium, that is discarded after reaching greater freedom."
But suppose, for example, that you sign on as an aggressive male in search of what is known on MUDs as TinySex; that is, simulated sex carried on through conversations. If it is true that 90 percent of sex is in the mind, then such an encounter might be almost as emotional as the real thing.
Now suppose that the player behind the identity you meet is a 10-year-old child. Should you enter into a highly emotional encounter with such a person?
Playing in a MUD may be entertaining, stimulating, and growth-enhancing for a healthy adult. But for an immature individual, it might also be frightening or traumatic. If there is a chance our actions might cause harm to another, are we morally obligated to refrain from playing the game?
And what if the player behind the identity you meet is, like Julia, a bot? When one player learned that Julia was just a computer program, she reported feeling "shallow, void, hollow, superficial, fake, out of control of the situation." What is the programmer's obligation to reveal the bot's identity - or lack of it?
This question becomes more critical when you realize that "talking programs" are not just games. There are, for example, programs designed to present themselves as psychoanalysts. One early program of this type, ELIZA, gave its name to the ELIZA effect: the widely observed tendency for people to treat programs that respond to them as if the programs had more intelligence than they really do. People tend to project their own complexity onto the computer.
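The mechanics behind such programs are surprisingly simple. The following is a minimal sketch, in Python, of the kind of pattern-matching that drove ELIZA-style conversation; the rules here are invented for illustration and are far cruder than Weizenbaum's actual keyword-ranking script:

```python
import re

# Illustrative ELIZA-style rules: each regular expression maps part of the
# user's utterance into a response template. These rules are hypothetical,
# made up for this sketch.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

def respond(utterance: str) -> str:
    """Reflect the user's own words back, or fall back to a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT

print(respond("I feel hollow"))   # Why do you feel hollow?
print(respond("Nice weather"))    # Please go on.
```

The program understands nothing; it merely echoes fragments of the user's input inside canned templates. That so thin a trick can feel like attentive listening is precisely the ELIZA effect.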
Some therapists have experimented with having computers do preliminary work with patients, and some patients have found the programs helpful. While the phenomenon of cyberpsychology is not in itself troubling, given the current state of the technology, it is important for therapists to be upfront about the fact that patients are consulting with a computer program.
But as programs become more sophisticated, Turkle raises the startling possibility that they may have some measure of intelligence or personlike life. In 1950, the brilliant computer scientist Alan Turing proposed a test to determine whether a computer was intelligent. He suggested that a person of average intelligence be allowed to communicate through a keyboard with an unknown party in another room, asking any questions he or she wished. If the person could not tell that the unknown was a computer, the computer was to be called intelligent. One observer has suggested that it's not clear whether Julia passed the Turing test or Barry failed it.
In any event, modern computer programs have come close enough to genuine intelligence for some thinkers to propose we take them seriously as fellow creatures. Christopher Langton, who organized the first Conference on Artificial Life in 1987, believes that it is not too soon to begin considering the rights of machines that almost think.
Most people, of course, are more concerned with human rights and how they may be affected by technology. The Internet, for example, is evolving with incredible speed and without central control, raising the specter of all sorts of abuses.
Many have responded to this pace with calls to regulate the Net for pornography, libel, racist language, and so forth. Yet it is important to note that lack of central authority is hardly unique to the Internet. We are enveloped in human systems (language, medicine, politics) that have evolved with little or no overall control. As with all parts of our fragmented lives, in the end, we have to rely on individual responsibility and integrity to guide our "life on the screen."
Tim Healy is the Thomas J. Bannan Professor of Electrical Engineering and a member of the Center Steering Committee.
Issues in Ethics - V. 7, N. 2 Spring 1996