Markkula Center for Applied Ethics

After Dolly

Humility is the best stance when facing the unanticipated consequences of new technologies

By Tim Healy

The successful cloning of the lamb Dolly in February set off a spate of anxious questions. Many of them concerned the ethics of cloning, but another set asked about the unanticipated consequences of this technology.

  • What good might come from the development of cloning?

  • Is it possible to control cloning effectively?

  • Would government inaction on cloning be worse than taking some specific regulatory action?

  • Should we halt research on cloning animals because it might lead to human cloning?

  • Would human cloning change what it means to be human?

Each of these questions speaks to the uncertainty inherent in the development of new technologies. And the honest answer to each is, "We don't know." Unanticipated consequences are a feature of change, both technological and natural. Acknowledging our limited ability to foresee repercussions is a first step in dealing with them.

The Revenge Effect
Why are consequences so hard to anticipate? Edward Tenner has a radical perspective on the phenomenon. In Why Things Bite Back, he describes a "revenge effect," in which our technologies turn against us with perverse consequences that exceed the good we planned. As an example, he offers power door locks, which increase drivers' sense of security.

But they have helped triple or quadruple the number of drivers locked out over the last two decades — costing $400 million a year and exposing drivers to the very criminals the locks were supposed to defeat.

Another analysis is offered by Dietrich Dörner, who identifies four features of systems that make full understanding impossible: complexity, dynamics, intransparence, and ignorance and mistaken hypotheses.

Complexity refers to the many different components in a real system and the interconnections between them. When we model a system, trying to predict what will occur, we necessarily neglect many components — and even more so, their interrelations. But it is from such interrelations that the unanticipated may arise.

Many systems exhibit dynamics; that is, they change their state spontaneously, independent of control by a central agent. One of the most fascinating examples is the Internet, an extraordinarily dynamic system with no one in charge. There is no way to model the Internet to predict its impact.

Intransparence means that some elements of the system cannot be seen but can, nevertheless, affect its operation. Looking again at the Internet, contributors to intransparence might include all of the users at a particular time, equipment failures at user sites, and local phenomena such as weather. These intransparent factors make it hard to foresee how the system will operate.

Finally, ignorance and mistaken hypotheses can keep us from predicting consequences accurately.

Reducing Uncertainty
Given all the factors contributing to unanticipated consequences, it may seem as though there is no point trying to formulate policies to deal with the potential effects of change. But a pioneering work by economist Frank Knight offers four ways to decrease the uncertainties we face.

  1. We can increase knowledge by carrying out additional studies, analyses, or experiments.

  2. We can combine uncertainties through large-scale organizations. For example, a group of people can unite to protect one another against serious loss in a catastrophe.

  3. We can often reduce uncertainty through control, as the Federal Reserve does in its control of interest rates.

  4. We can slow the march of progress, taking time to research the question, write an environmental report, or take the matter to the planning commission. That was the tack taken by President Clinton, who, two days after news of the successful cloning of a sheep, announced a moratorium on federal funding for any further cloning until the matter could be studied.

Moral Repercussions
These tactics reduce uncertainty, but it can never be entirely eliminated. In the end, we are left with a dilemma: We act without knowing the consequences of what we do. Yet we have to act, for even to do nothing has consequences.

Although there are no simple answers to this problem, some general ethical principles can guide us as we make decisions about the use of new technologies.

We should take advantage of opportunities to reduce uncertainty. Although we are not obliged to exhaust our resources (this could easily outweigh the good to be gained), to the extent that uncertainty can be diminished at reasonable cost, it seems morally prudent to do so.

People should share equally in the benefits of an action or a project, and they should also share equally in the risks due to unanticipated consequences. This is, of course, an ideal, since we cannot usually ensure such equal distribution of benefits and risks.

People who do not share in the benefits of an action should not, as a rule, be subject to costs and risks. Justice suggests that burdens should not be borne by those who cannot benefit; but this is also an ideal, and it, too, has limitations. For example, it would forbid the building of a coal-burning power plant on the grounds that emissions from the plant could affect the environment of the entire globe, including that of some individuals who could not expect to benefit from the electric power.

People who gain some benefit from an action should be able to choose their level of cost and risk. This idea follows from the fundamental ethical principle that everyone must be treated as a free and rational person capable of making his or her own decisions. Of course, joint projects do not always allow this principle to predominate. For example, a community may decide to initiate a flood-control project with consequences that interfere with the free choice of individual community members.

Projects affecting more than one individual should provide the greatest balance of benefits over harms for all involved. This utilitarian principle gives us a way to approach public projects, such as flood control or seismic retrofitting. On this basis, we might decide to go ahead with the project, though our responsibility to reduce uncertainty would be greater because of the potential threat to individual rights.

We ought to recognize that a resource has greater value to a poor person than to a rich one. If we give $10 to a poor man, we improve his life much more than if we give the same amount to a rich man. Since the principle of justice suggests our first thoughts should be for those who have the least in our society, we must consider the disparate impacts of technological advances on rich and poor.

We ought to recognize that the consequences of an action may be long-term. Our actions and their consequences are not necessarily limited to here and now. In fact, their effects may cover great distances, perhaps the entire earth, and may extend for years, decades, or even centuries. We are obliged to take these factors into consideration.

Decisions about technology should acknowledge the complexity of life. Implicit in this principle is the requirement to speak with humility about the consequences of our actions, to refine and improve our positions, and to act with a clear understanding that we do not "own the truth."

In a brief but beautiful autobiographical essay, economist Kenneth Arrow writes that "most individuals underestimate the uncertainty of the world." As a result, we believe too easily in the clarity of our own interpretations. Arrow calls for greater humility in the face of uncertainty and finds in the matter a moral obligation as well:

The sense of uncertainty is active; it actively recognizes the possibility of alternative views and seeks them out. I consider it essential to honesty to look for the best arguments against a position that one is holding. Commitments should always have a tentative quality.

Tim Healy, the Thomas J. Bannan Professor of Electrical Engineering at SCU, is coordinator of the Ethics and Technology Program at the Markkula Center for Applied Ethics.

Further Reading

Dörner, Dietrich. The Logic of Failure: Why Things Go Wrong and What We Can Do to Make Them Right. New York: Metropolitan Books, 1989.

Knight, Frank. Risk, Uncertainty, and Profit. Boston: Houghton Mifflin Co., 1921.

Merton, Robert. "The Unanticipated Consequences of Purposive Social Action." American Sociological Review 1 (December 1936): 894–904.

Tenner, Edward. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Knopf, 1996.