Markkula Center for Applied Ethics


From blocking software to legislation, strategies address children's access to objectionable material on the Internet.

By Joseph Westfall

How hard is it to find sexually explicit material on the Internet? Simply type the word sex into Lycos, one of the popular search engines, and you will have access to 180,102 sites that deal with this subject.

[Boy on Internet]
Photo by Charles Barry

Of course, these listings represent a range of sites. The first 10 sex citations found by the search engine Excite include Cafe Flesh, where you can view pictures of "kinky coeds"; sex therapist Dr. Ruth Westheimer's site, which contains information about preventing teen pregnancy; SexStreet; and a site on sexually transmitted diseases.

In that list lies the dilemma confronting parents, educators, librarians, lawmakers; in short, everyone who is concerned about children's access to sexually explicit material on the Net. Should young Internet browsers be protected from cybersmut? If so, who should determine what material is inappropriate? If children's access to obscene material is limited, will that also prevent them from obtaining useful information about human sexuality? And will it interfere with adult rights to free speech?

What's Different About the Internet?

In many ways, these questions are not new. Whatever the medium, pornography has been around for a long time, as have concerns about children's exposure to it.

Yet the Internet is not just one more technological wonder. More than any other device in history, the Internet makes global communication possible. Anyone with a computer and a modem can be a publisher or have access to vast quantities of information from around the world.

Writing in the Yale Law Journal, Jerry Berman and Daniel J. Weitzner conclude that this resource offers "a real possibility for a new, less restrictive approach to achieving long-standing First Amendment diversity goals." For them, as well as many Internet activists, the Net is the virtual embodiment of the democratic ideal of free speech.

This previously unattained freedom is, however, in direct conflict with the desire to restrict children's access to inappropriate material.

Internet Blocking

One response to this clashing of values has been technological. Concerned individuals have demanded a means of controlling what the Internet brings into their homes, schools, and libraries; and, predictably, various businesses have provided one: Internet blocking/filtering software.

With names like NetNanny, CyberSitter, and KinderGuard, these products market themselves as convenient ways to keep children away from pornography.

Such products also depict themselves as anti-censorship. They make no attempt to limit what appears on the Net; instead, they allow users to define what kind of Internet sites they do not want appearing on their own screens.

Here is a typical product description:

SurfWatch Internet filtering products help you with the flood of inappropriate material on the Internet. An alternative to government censorship, SurfWatch provides a technical solution to the difficult issues created by the explosion of technology.

Yet all such software packages, even the best, fail to block sites accurately at least some of the time. The problem lies in the filtering mechanism.

The earliest blocking packages fell (and some continue to fall) loosely into two groups:

  • services that blocked sites containing a word or string of words considered obscene or otherwise objectionable
  • services that had persons exploring and blocking sites individually

Although the former allows users access to far more sites than the latter (as it does not maintain a list of sites, but searches all sites for the "improper" words and word strings), it rarely works very well. In one popular system, all sites containing the word breast were blocked, including those dealing with breast cancer. (This has since been corrected.)
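The breast-cancer mishap follows directly from how word-based filters work: they look for a blocked word anywhere on a page, with no sense of context. The following sketch illustrates the mechanism; the block list and sample pages are hypothetical examples, not any vendor's actual data.

```python
# A naive keyword filter of the kind described above: block any page
# containing a word on the block list, regardless of context.

BLOCKED_WORDS = {"breast", "sex"}

def is_blocked(page_text: str) -> bool:
    """Return True if any word on the block list appears in the page."""
    words = page_text.lower().split()
    return any(w.strip(".,!?;:") in BLOCKED_WORDS for w in words)

# A legitimate medical page is caught by exactly the same rule as smut:
print(is_blocked("Early detection of breast cancer saves lives."))  # True
print(is_blocked("Gardening tips for spring."))                     # False
```

Because the rule cannot tell a medical article from pornography, every refinement amounts to adding exceptions by hand, which is why such filters "rarely work very well."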

A more recent experiment with SurfWatch, a blocking package reputed to be among the best, allowed an Internet browser to view graphically explicit sexual fetish sites, while a New York Times article on Internet gambling was blocked. Both blocking errors are presumably the result of word or word-string searches.

The second type of service, in which actual persons review sites, also has limitations. First and foremost, human reviewers can reach only a fraction of the Web's sites.

Some services, such as SurfWatch and KinderGuard, use both types of blocking, employing persons to sort and block sites individually and using words and word strings to filter sites employees have not reviewed. This combination appears to be superior to either method alone. It does not, however, resolve the problems with the two methods; it merely reduces the likelihood of access to objectionable sites while respecting something of the boundlessness of the Internet.

Many blocking-software producers are willing to acknowledge this much. Jay Friedland, co-founder of SurfWatch, admitted on CNN's interactive Web site that his product is "part of a solution. It's not the complete solution."


Legislative Responses

A quick public response to the insufficiency of blocking techniques was legislative. In the United States, there were municipal and state regulations on the local use of the Internet, including attempts at cyberspace decency laws in New York, Oklahoma, and Virginia. (The New York law has since been challenged.)

But the greatest legislative effort was the passage of the Communications Decency Act of 1996 (CDA), part of the Telecommunications Act of 1996. The CDA mandates that anyone who, using a telecommunications device,

makes, creates, or solicits, and initiates the transmission of ... any ... communication which is obscene or indecent, knowing that the recipient of the communication is under 18 years of age ... shall be fined ... or imprisoned not more than two years, or both.

The law was passed to keep children from inadvertently encountering cybersmut, but its provisions were so broad that it immediately drew opposition from Internet and free-speech activists. It did not take long for a three-judge federal court in Philadelphia to rule that portions of the CDA, such as the passage cited above, "trampled on the First Amendment and amounted to illegal government censorship."

This ruling prevented the immediate implementation of the CDA. In June 1997, the Supreme Court upheld the lower court's decision, declaring important sections of the law unconstitutional (Reno v. ACLU). The Court wrote that "the many ambiguities concerning the scope of its coverage render it problematic for First Amendment purposes."

The Court further suggested that parents who wish to regulate their children's Internet access use blocking software, implicitly affirming that a technical solution can work.

The PICS Standard

In response to the increasing drive for government control, the World Wide Web Consortium began to develop a ratings system for the Internet in 1995. The goal was an effective, nongovernmental means of regulating access to objectionable Internet content; the result was the Platform for Internet Content Selection (PICS).

PICS, in effect, is a computer language that enables Internet browsers and search engines to read ratings that sites have assigned themselves. Any system configured to read PICS has access to the content ratings information.

Furthermore, because the ratings information is not located within the filtering product but encoded within the site itself, the PICS standard is not limited to existing browsers or blocking software. This leaves room for private organizations to create their own products to block sites objectionable to those sharing their particular values.
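The mechanics above can be made concrete. Under PICS-1.1, a site embeds a label naming a rating service and listing category/level pairs; a PICS-aware browser or filter reads those pairs and compares them against thresholds a parent has set. The sketch below uses RSACi-style category letters (n = nudity, s = sex, v = violence, l = language, levels 0-4); the specific label, thresholds, and simplified parsing are illustrative assumptions, not the W3C reference implementation.

```python
import re

# A self-assigned PICS-1.1 label of the kind a site might embed.
PICS_LABEL = ('(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
              'l r (n 0 s 0 v 2 l 3))')

def parse_ratings(label: str) -> dict:
    """Extract the category/level pairs from the trailing r (...) group."""
    body = re.search(r"r \(([^)]*)\)", label).group(1)
    tokens = body.split()
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

def allowed(label: str, limits: dict) -> bool:
    """A site passes only if every rated category is at or below the limit."""
    ratings = parse_ratings(label)
    return all(level <= limits.get(cat, 0) for cat, level in ratings.items())

# A parent's chosen thresholds: no nudity or sex, mild violence and language.
limits = {"n": 0, "s": 0, "v": 1, "l": 1}
print(allowed(PICS_LABEL, limits))  # False: violence and language exceed limits
```

The point of the design is visible here: the label travels with the site, while the thresholds live with the family, so different organizations can ship different `limits` without touching the sites themselves.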

Almost everyone sees PICS as the system with the greatest potential for actually keeping objectionable material out of children's Internet adventures. Yet relatively few sites are presently rated through PICS, and relatively few products presently recognize the language.

As of this writing, only a handful of products, among them Microsoft's Internet Explorer, CompuServe, and the blocking programs CyberPatrol, SafeSearch, SurfWatch, and SafeSurf, incorporate the PICS standard. Netscape promises to support PICS in a later version but does not use it now.

It is still unclear exactly how many sites presently rate themselves with PICS, and there is no way to tell how many sites intend to do so. Currently, fewer than nine of every thousand Internet documents (0.87 percent) are believed to be rated, according to SafeSearch's PICS Scoreboard. Certain sites, particularly those that advocates of Internet filtering most want blocked, may never encode ratings at all.

Although a few Internet search engines, including Yahoo!, Excite, and Lycos, have pledged to seek content ratings from all the sites they register, most Internet filters do not presently block access to unrated sites and probably cannot be expected to do so until a fair number of sites incorporate PICS ratings.

The ratings are not mandated by law, which causes some parents and child advocates concern about the implementation of PICS as the industry standard. Proponents of PICS argue, however, that Internet ratings will follow the same path as film ratings. Although the film rating system is voluntary, most theaters will not show unrated films. Since most filmmakers want their films shown in most theaters, they rate them. Internet site designers and Webmasters, the argument concludes, will follow suit to be accessible to most browsers and blocking software.

Netiquette and Nethics

PICS, Internet blocking software, and legislation each attempt to regulate access to cybersmut. Since no method has been entirely effective, some "Netizens" have proposed a different route. Rather than forcibly removing material from Internet access, some wish to develop and adopt codes of Internet etiquette ("Netiquette") or ethics ("Nethics").

Such codes would encourage more responsible behavior regarding the availability of cybersmut to children, as well as other moral dilemmas presented by the Internet. Cyberspace, under this view, is understood as a community separate from the communities in which users actually live. Just as actual communities have standards and codes, so should the virtual community.

In fact, some codes do exist, at least within various cybersubcultures. When examined, however, few explicitly address ethical issues, and those that do deal primarily with issues of privacy and plagiarism.

If these codes dealt more explicitly with issues of children's access to objectionable material, would that solve the problem? Just as some people choose to deviate from etiquette and ethics in the real world, there will be those who ignore virtual codes.

In the Meantime

Each response to cybersmut presents possibilities and limitations. Although a complete solution is not readily apparent, the National Center for Missing and Exploited Children (NCMEC) offers this advice to parents: "The fact that crimes are being committed online ... is not a reason to avoid using these services." Rather, NCMEC maintains that, fundamentally, parenting a child in cyberspace requires the same techniques, time, and involvement as parenting a child "in real life."

The organization recommends that parents meet their children's virtual friends, just as they would meet their actual friends. It encourages open dialogue between parents and children about objectionable material accidentally encountered on the Internet. It also recommends use of Internet blocking software, but only as a technical safeguard-not as the answer to the problem.

Joseph Westfall, former research assistant at the Markkula Center for Applied Ethics, is a graduate student in philosophy at Boston College. He wrote this article for a joint project between the Ethics Center and the Tech Museum of Innovation in San Jose. With assistance from the Center, the museum is incorporating an examination of ethical issues into exhibits at its new facility, scheduled to open in 1998.

Online Resources for Concerned Parents

Netparents homepage:

Safekids homepage:

Parent Time homepage:

Yahooligans! homepage:

Further Reading

Balkam, Stephen. "Content Ratings for the Internet and Recreational Software." Recreational Software Advisory Council (RSAC) homepage:

Eickmann, Lori. "Children at Play: What You Can Do to Make the Net Suitable for Kids." San Jose Mercury News, June 1, 1997.

Kaplan, Carl S. "Finding Government's Role in a Global Network." New York Times CyberTimes, July 10, 1997.

National Center for Missing and Exploited Children. Child Safety on the Information Highway, 1994.

Nelson, Brian. "Gaps Found in Internet Screening Software." CNN Interactive, April 25, 1997.