The Problem: "Cybersmut"
The Internet, including e-mail, Usenet, and the World Wide Web, gives anyone connected to it access to an unprecedented amount of information from a single location: a home, office, library, or school computer. The information available on the Internet is seemingly limitless in quantity and variety, and this lack of boundaries is possible because anyone, from a schoolchild to a corporate CEO, can post anything in cyberpublic view.
As they have everywhere else, however, those interested in the "shadier" side of life, such as pornography, gambling, and hate speech, have set up shop on the Internet. This worries parents who would otherwise love to grant their children access to this information resource. Many would like to put an end to "cybersmut," or at least to build a strong wall between it and children's eyes. And parents are not the only ones concerned about the Internet's boundlessness. Employers worry about their employees' use of the Internet during business hours, and lawmakers, here and abroad, worry about Internet crimes, including gambling, child pornography, and the solicitation of minors for IRL ("in real life") encounters with pedophiles and child molesters.
The problems generated by the existence and use of the Internet are in many ways similar to those that developed with the introduction of new technologies in the past. A system of values that had moral force pre-Internet is clashing with a new value system, established for and by a "cyberculture." The same thing happened with the invention of the printing press, the radio, the telephone, the television. And, as has happened in the past, we can expect to see ethical systems that ignore this new technology becoming incredibly particularized, or vanishing altogether. Yet the Internet is not just one more technological wonder in the history of technological wonders. More than any other device in the history of humanity, the Internet makes available information and global communication. Furthermore, through the use of hypertext linking, the user has a great deal of control over the information he or she gathers. Considering the ease with which one can "publish" something of one's own, in addition to the abundance of sites and hypertext links, the Internet is perhaps the most diverse user-controlled information and communication resource ever. According to Jerry Berman and Daniel J. Weitzner, such a resource offers "...a real possibility for a new, less-restrictive approach to achieving long-standing First Amendment diversity goals" (1635). For Berman and Weitzner, as well as many Internet activists, the Internet is the virtual embodiment of the democratic ideal of free speech. This previously unattained freedom, however, is in direct conflict with people who, for whatever reasons, desire limits on access to the medium-for children, employees, citizens, etc. While free speech activists and "Netizens" are the most vocal online, the tables are turned IRL.
Internet Blocking: The First Response
The initial response to this clash of paradigms was technological. Concerned individuals demanded a means of controlling what the Internet brought into their homes, and, predictably, various businesses provided one: Internet blocking/filtering software. With names like "NetNanny," "CyberSitter," and "KinderGuard," these products marketed themselves as convenient ways to keep pornography, pedophiles, and other objectionable content away from children. Such products also depict themselves as anti-censorship, and, in general, they are viewed as such, leaving most free speech activists with little to criticize. Common descriptions of these products include:
Yet Internet blocking software is not all it claims to be, though its makers are not entirely at fault: what these products are expected to do is nearly impossible. Although various surveys and reviews of the different blocking systems have been conducted, the results contradict one another, and very little conclusive evidence for or against particular systems has been found. What is evident from every review is that all Internet blocking mechanisms, even the very best, fail to block sites accurately at least some of the time.
The earlier blocking and filtering software fell (and some continues to fall) loosely into two groups: (a) services that blocked any site containing a word or words considered obscene or evidencing sexually explicit or otherwise objectionable content, and (b) services that employed people to review and block sites individually. Although the former allows users to access far more sites than the latter (since it does not maintain a list of sites but searches all sites for "improper" words and word-strings), it rarely works well. One popular system blocked all sites containing the word "breast," including those dealing with breast cancer (this has since been corrected). In a more recent experiment with SurfWatch, often reputed to be among the best blocking packages, reviewers were able to view graphically explicit sexual fetish sites, while a New York Times article on Internet gambling was blocked. Both blocking errors are presumably the result of word or word-string searches. The latter type of service, which employs actual people to sort and review sites, can reach only a fraction of the sites on the Web. Some services, such as the aforementioned SurfWatch and KinderGuard, combine the approaches: people review and block web sites individually (b), while word and word-string filters screen the sites they have not reviewed (a). This combination appears to be superior to either method alone, though it does not resolve the underlying problems of the two methods; it merely reduces the likelihood of access to objectionable sites while maintaining something of the boundlessness of the Internet. Many blocking software producers are willing to acknowledge this much. Jay Friedland, co-founder of SurfWatch, admits, "It's part of a solution. It's not the complete solution" (Nelson).
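The overblocking and underblocking described above follow directly from how a type (a) filter works: a bare word-list cannot see context. A minimal sketch of such a filter, with an invented blocklist and invented sample pages (not taken from any actual product), shows why a breast-cancer page and a pornographic page look identical to it:

```python
# Minimal sketch of a type (a) keyword filter: block any page whose
# text contains a word from a fixed blocklist. The blocklist and the
# sample pages below are invented for illustration only.

BLOCKLIST = {"breast", "casino", "xxx"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocklisted word appears in the page text."""
    words = page_text.lower().split()
    return any(word.strip(".,!?\"'") in BLOCKLIST for word in words)

# A medical page is blocked for exactly the same reason a smut page is:
print(is_blocked("Early detection of breast cancer saves lives."))  # True
# ...while a page with no listed word sails through, however explicit:
print(is_blocked("Support groups for cancer survivors."))           # False
```

The filter has no notion of what a page is about, only of which strings it contains, which is why combining it with human review (method b) reduces, but cannot eliminate, the errors.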
Legislation: The Second Response
A quick public response to the incompleteness of early blocking techniques was legislative. In the United States, municipal and state governments regulated local use of the Internet, including attempts at cyberspace decency laws in New York, Oklahoma, and Virginia (so far, only the New York law has been challenged). Yet the greatest legislative breakthrough was the passage of the Communications Decency Act (CDA), part of the Telecommunications Act of 1996. The CDA mandates that anyone who, using a telecommunications device,
The law was passed to keep cybersmut and its nasty electronic ilk from inadvertently making contact with children. Internet and free speech activists were immediately up in arms, and it did not take long for a three-judge federal court in Philadelphia to rule that portions of the CDA, such as the passage cited above, "...trampled on the First Amendment and amounted to illegal government censorship" (Brekke). This ruling prevented the implementation of the CDA until the Supreme Court heard the case (Reno v. ACLU) in March 1997. On June 26, 1997, a date few free speech activists will forget, the Supreme Court upheld the Philadelphia court's ruling, declaring important sections of the law unconstitutional. The Court wrote that "...the many ambiguities concerning the scope of its coverage render it problematic for First Amendment purposes" ("Supreme Court CDA Ruling"). The Court further suggested that parents who wish to regulate their children's Internet access use Internet blocking software, implicitly affirming the belief in the functionality of a technical solution. Supporters of the fallen CDA have promised to draft a new version in the hope of clearing up some of the ambiguities the Supreme Court found problematic. It does not seem, however, that even a revised CDA will be able to accomplish its goals without becoming some form of government censorship. The prospects for legislated regulation of the Internet in the United States look dim, but lawmakers continue to try. The White House Internet Decency Summit of July 16, 1997 made plans for a Families Online Summit in October 1997; both summits were established to explore the role of government in solving the problem of cybersmut.
If the U.S. represents one extreme in the Internet regulation debate, countries such as China and Singapore represent the other. Both nations have established incredibly strict Internet content policies, and both attempt to restrict access to more than cybersmut, confirming the fears of many opponents of Internet censorship in the United States. China, in addition to pornography, blocks access to certain news, human rights, and politics sites for all of its users. Singapore's regulatory agency, the Singapore Broadcasting Authority, blocks access to "areas which may undermine public morals, political stability or religious harmony," registering and regulating all sites dealing with politics or religion ("Countries face..."). Commercial sites in France and Quebec are forbidden by law to advertise, promote, and sell products in languages other than French unless a French version is also available. And Germany recently passed the world's first cyberlaw, holding Internet service providers (ISPs) partially responsible for providing cyberpublic access to sites containing illegal content, such as pornography and hate speech ("Germany passes..."). In fact, the popular ISP Compuserve has been accused of knowingly providing German subscribers with access to child pornography (Schwadron). Examples such as these seem to indicate the dangers of Internet censorship, creating a "slippery slope" down which even U.S. legislation could fall. Other freedoms traditionally protected in the U.S., such as the freedom to express one's political and religious views publicly, could then be in danger on the Internet.
It is not yet clear how governments that elect to legislate Internet regulations will go about enforcing them. With the number of sites on the Internet growing every day, type (b) blocking systems are unable to keep up, and many of them incorporate at least some type (a) filtering strategies. Yet no existing type (a) technology is able to filter the Internet accurately and efficiently, forcing those still interested in regulating Internet content in their homes, offices or schools to turn to a different way of blocking the Internet.
The PICS Standard: The Third Response
In 1995, in response to the increasing drive for Internet legislation and the formation of the CDA, the World Wide Web Consortium (W3C) put into motion the development of a ratings system for the Internet. The goal was, as in earlier Internet blocking software, a non-governmental means of regulating Internet content, and the result was the Platform for Internet Content Selection (PICS). PICS is a computer language that enables Internet browsers and search engines to understand Internet rating systems. Sites are self-rated in this language, which is becoming the new industry standard, granting any system configured to read PICS access to the content ratings information. Furthermore, the PICS standard is not limited to particular existing browsers or blocking software, because the ratings information is not located within the product or service (as in other forms of Internet blocking) but encoded within the site itself. This leaves room for private organizations of all sorts to create their own products to block sites objectionable to those sharing their particular values. As one MSNBC article puts it, "The Simon Wiesenthal Center could set up a system to filter out anti-Semitic sites; Islamic groups could rate sites according to their appropriateness for Muslims; the Christian Coalition or People for the American Way could rank Web sites by their own visions of family values" (Boyle, "Internet watchdogs..."). To many people trying to walk the narrow line between censorship and child safety on the Internet, PICS seems like a miracle.
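Concretely, a PICS label usually travels inside the page itself, most often as an HTML meta tag in the document's head. The fragment below is a sketch of what a PICS 1.1 self-rating might look like using the RSACi rating vocabulary; the site URL and the category values are invented for illustration and are not taken from any actual rated page:

```
<!-- Hypothetical PICS 1.1 self-rating embedded in a page's <head>.
     "n s v l" are the RSACi categories (nudity, sex, violence,
     language), each rated on a 0-4 scale; the site and the values
     shown here are examples, not a real rating. -->
<head>
  <meta http-equiv="PICS-Label" content='
    (PICS-1.1 "http://www.rsac.org/ratingsv01.html"
     l for "http://www.example.com/"
     r (n 0 s 0 v 0 l 2))'>
</head>
```

A PICS-aware browser or filter reads this label before rendering the page and compares each category value against the thresholds its user has configured; how to treat pages that carry no label at all is a separate setting, which is why the question of unrated sites looms so large.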
Almost everyone sees PICS as the system with the greatest potential for actually keeping objectionable material out of one's Internet adventures. Yet relatively few sites are presently rated through PICS, and relatively few products presently recognize it. At the time of this writing, only a handful of products and services (Microsoft's Internet Explorer, Compuserve, and the blocking packages CyberPatrol, SafeSearch, SurfWatch, and SafeSurf) incorporate the W3C PICS standard. Netscape promises to incorporate PICS in a later version but does not use it now.
It is still unclear exactly how many sites presently rate themselves with PICS, and there is no way of telling how many intend to do so. Fewer than nine of every thousand Internet documents (0.87%) are believed to be rated ("PICS Scoreboard"). It is questionable whether certain sites, particularly those that people interested in Internet blocking most wish to see blocked, will encode ratings at all. PICS ratings are not mandated by law, and this causes some parents and child advocates concern about the implementation of PICS as the industry standard. Proponents of PICS argue, however, that Internet ratings will follow the same path as film ratings. While the film rating system is voluntary (a film is not required to have a rating), most theaters will not show unrated films, and most filmmakers want their films shown in most theaters; thus they rate. Internet site designers and webmasters, the argument concludes, will follow suit in order to be accessible to most browsers and blocking software. Although a few Internet search engines (Yahoo!, Excite, and Lycos) have pledged to seek content ratings from all of the sites they register, most Internet filters do not presently block access to unrated sites, and probably cannot be expected to do so until a fair number of sites incorporate PICS ratings. It is a circle with no foreseeable way out.
The Solution: Neither Netiquette Nor Nethics
Internet blocking software and cyberlegislation are attempts to stem the flow of cybersmut by making it impossible for such material to be accessed. One method tries to make it literally impossible to access objectionable information; the other tries to make its distribution a punishable crime. Neither has demonstrated its effectiveness to date. Recognizing this, some Netizens have taken it upon themselves to try a different route. Rather than forcibly removing material from Internet access, they wish to develop and adopt codes of Internet etiquette ("Netiquette") or ethics ("Nethics"). Such codes would, presumably, encourage more responsible behavior with regard to the availability of cybersmut to children, the use of the Internet for personal reasons on corporate time, and so on. Cyberspace, under this view, is understood as a community separate from the communities in which users actually live. Just as actual communities have different standards and ethical codes, so should the virtual community.
And, in fact, some sort of code of Netiquette does exist, at least within various cybersubcultures. Some of these systems of behavior have been codified, and some interesting sociological work has been done on the development of communities in cyberspace (Jones). When examined, however, few Netiquette codes deal explicitly with ethical issues, and those that do deal primarily with issues of privacy and plagiarism. Yet were they to deal more explicitly with issues of children's access to objectionable material, and were they to establish standards of behavior that forbade the intentional display of such materials in children's view, the problem would not stop. Just as there are those who choose to deviate from actual codes of etiquette and ethics, there are those who choose to deviate from virtual codes. In short, given that technical and legislative methods do not work, and Netiquette and Nethics, like etiquette and ethics, will continue to be transgressed, the problem will not go away. For the foreseeable future, cybersmut, and children's access to it, is here to stay.
In response to this fact, the National Center for Missing and Exploited Children (NCMEC) offers parents this reassurance: "The fact that crimes are being committed online...is not a reason to avoid using these services" (Child Safety...). Rather, the NCMEC maintains that, fundamentally, parenting a child in cyberspace requires much of the same technique, time, and involvement that parenting a child IRL does. They recommend that parents meet their children's virtual friends, just as they would meet their actual friends. They encourage open dialogue between parents and children about objectionable material accidentally encountered on the Internet. They also recommend use of Internet blocking software, but only as a technical safeguard, not as the solution to the problem.
Although it seems simple enough, what the NCMEC has done is somewhat out of place in the current Internet debate. Rather than treating the Internet as an entirely different, nearly mythical realm of existence, in which completely new and different codes of behavior are necessary, the NCMEC (and the few others who share their outlook) identifies cyberspace as another, technologically sophisticated area of the actual world. If an ethical system functions IRL, then it should also function in virtual life. The Internet does not necessitate the creation of new ethical codes, but new applications of ethical codes. Legislation and Internet blocking software, insofar as they are functional and constitutional, could provide strong additional safeguards, but the best way to prevent the harmful effects of objectionable material is to demythologize cyberspace.
Online Resources for Concerned Parents
Netparents homepage. http://www.netparents.com
Safekids homepage. http://www.safekids.com
Parent Time homepage. http://www.pathfinder.com/ParentTime/Welcome/
Yahooligans! homepage. http://www.yahooligans.com
Jones, Steven G., ed. CyberSociety. Thousand Oaks, CA: Sage Publications, 1995.
Brandt, Aviva L. "Watchdog Objects to Government Workers' Personal Internet Postings." Los Angeles Times. 6 October 1996.
Dunn, Ashley. "Even on the Net, Liberty is Not a Free-for-All." New York Times CyberTimes, 9 July 1997.
Gorniak-Kocikowska, Krystyna. "The Computer Revolution and the Problem of Global Ethics." Science and Engineering Ethics, 2: 2, 177-190. 1996.
"Is Internet gambling on the horizon?" CNN Interactive, 25 June 1997.
Internet Blocking Software
CyberSitter homepage. http://www.cybersitter.com
Herhold, Scott. "Commerce secretary stresses computer industry solutions." San Jose Mercury News - Mercury Center, 27 June 1997.
Muhammad, Tariq K. "Blocking the information superhighway: Employers are finding ways to." Black Enterprise, 31 January 1997.
Nelson, Brian. "Gaps found in Internet screening software." CNN Interactive, 25 April 1997.
NetNanny homepage. http://www.netnanny.com
Parental Control of the Internet.
SurfWatch homepage. http://www.surfwatch.com
"'Virtual toolbox' ready to block cyberporn." San Jose Mercury News - Mercury Center, 16 July 1997.
Brekke, Dan and Rebecca Vesely. "CDA Struck Down." Wired, 26 June 1997.
Chen, Kathy. "China Bans Internet Access To as Many as 100 Web Sites."
Electronic Frontier Foundation homepage.
Clinton, William J. "A Framework for Global Electronic Commerce." White
"Countries face cyber-control in their own ways." CNN Interactive, 1
"EU urged to take soft approach to Internet laws." CNN Interactive, 7
"German acquitted of subversive linking." MSNBC.
"Germany passes world's first cyberspace law." CNN Interactive, 4 July
Kaplan, Carl S. "Finding Government's Role in a Global Network." New York Times CyberTimes, 10 July 1997.
Kynge, James. "Singapore cracks down on Internet." Financial Times, 12 July 1996.
"Many think First Amendment goes too far for free speech." San Diego Union-Tribune, 5 July 1997.
Meyer, Michael. "The Nation; The Internet; How to Put Borders on a Borderless Technology." Los Angeles Times, 14 January 1996.
O'Connor, Rory J. "Free speech online." San Jose Mercury News - Mercury Center, 27 June 1997.
Schwadron, Terry. "Seeing red-and not the light-on the Web." CNN Interactive, 15 July 1997.
"Supreme Court CDA Ruling." CNN Interactive.
"Supreme Court strikes down Internet smut law." CNN Interactive, 26
"SurfWatch Plays Crucial Role in Overturning Communications Decency Act." SurfWatch homepage.
Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (1996).
"Vietnam set to edge open the Internet door." CNN Interactive, 24 June
Weber, Jonathan. "Leave children out of the decency debate." CNN Interactive, 4 July 1997.
"White House readies new policy for Internet decency." CNN Interactive, 16 June 1997.
The PICS Standard
Boyle, Alan. "Censorship debate focuses on filters." MSNBC.
---. "Internet watchdogs split over PICS." MSNBC.
---. "PICS adds new dimension to Web." MSNBC.
"Compuserve to Rate Internet Content by July 1." Recreational Software Advisory Council (RSAC) homepage.
Lewis, Peter H. "Microsoft Backs Ratings System for Internet." New York Times, 1 March 1996.
"The PICS Scoreboard." SafeSearch homepage.
Eickmann, Lori. "Children at Play: What You Can Do to Make the Net Suitable for Kids." San Jose Mercury News, 1 June 1997.
"Internet Etiquette." The Washington Post, 27 February 1996.
Plotnikoff, David. "Matilda's Family Internet Guide." San Jose Mercury News - Mercury Center.