
Privacy Norms

Michael McFarland, S.J.

Through the experience of the last half century, and reflection on it, a set of norms for the collection, dissemination and use of personal data has developed. In the U.S. an early version of these was included in a 1973 government report entitled "Records, Computers and the Rights of Citizens." The recommendations in this report influenced subsequent legislation such as the Privacy Act and the Fair Credit Reporting Act. The Privacy Protection Study Commission, established by the Privacy Act, produced a more thorough and wide-ranging analysis and set of recommendations in 1977, in its report "Personal Privacy in an Information Society." In Europe in 1980, the Organization for Economic Cooperation and Development (OECD) issued its "Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data," which ultimately led to the Directive on Protection of Personal Data discussed in the last section.

Some private organizations that handle sensitive information have developed their own guidelines. For example, Equifax has a set of "Information Policies," promising fair treatment and privacy to every consumer who applies for credit.1 Several professional organizations in the data processing industry have codes of ethics, and some mention privacy, although most are too general to give much helpful guidance. The Association for Computing Machinery (ACM) Code of Ethics, however, states some specific requirements for the handling of personal data:

It is the responsibility of professionals to maintain the privacy and integrity of data describing individuals. This includes taking precautions to ensure the accuracy of data, as well as protecting it from unauthorized access or accidental disclosure to inappropriate individuals. Furthermore, procedures must be established to allow individuals to review their records and correct inaccuracies.

This imperative implies that only the necessary amount of personal information be collected in a system, that retention and disposal periods for that information be clearly defined and enforced, and that personal information gathered for a specific purpose not be used for other purposes without consent of the individual(s).2

Based on these sources, as well as the analysis of the issues presented here, this essay concludes with a set of norms for the collection, storage, use and transmission of personal data. The norms have three purposes. The first is to respect the autonomy of the subjects whose personal data is being used by giving them as much control as possible over what data is taken and by whom, and what it is used for. The second is to protect the subjects from harm due to the misuse or abuse of their data by controlling data quality and limiting its use and transmission. The third is to find a fair balance between the interests of the subjects and the welfare of society.

The norms are as follows:

  1. Consent. Personal data not already publicly available should not be collected and held without the consent of the person to whom it refers, unless the information is necessary to protect the safety or security of the public. Consent can be given explicitly, or it can be granted implicitly by entering into a contract or participating in an activity that clearly requires granting access to the data. For example, when someone uses a credit card, it is understood that the merchant is given access to the person's credit standing. But the concept of implied consent should not be stretched too far. There should be some activity on the subject's part that can reasonably be interpreted as giving informed consent. It is also more reasonable to assume consent when disclosure of the information is part of a transaction that benefits the subject.

    There are a number of factors that affect whether the public interest justifies the disclosure of personal data without the subject's consent. One is the severity of the need. A situation that poses a serious threat to human life, such as a medical emergency or very dangerous criminal activity, is more likely to call for disclosure than one with lesser consequences. A second factor is the sensitivity of the data and the harm likely to be done if it is revealed. Especially sensitive information, the revelation of which could leave a person vulnerable to humiliation or prejudice, such as a history of psychological problems or past behavior considered deviant, should be given more careful protection than someone's street address or blood type. Another important consideration is the context in which the information was originally given. Information revealed as part of a privileged relationship, such as therapist-patient or lawyer-client, has a stronger claim to confidentiality because of the promise of protection given in the relationship.

    Even when it is necessary to reveal personal data to protect the public, disclosure should be limited to the minimum amount needed. For example, data needed for critical research should have identifying information stripped off or coded so that it cannot be associated with the subject.

  2. Limited use. Personal data should be used only for the purposes for which it was originally given. The same qualifications as in norm 1 apply here as well. The subject can give consent, explicitly or implicitly, for further use. And in some extreme cases, considerations of public welfare can override the need for consent.

  3. Matching. The use of matching programs should be limited to the verification of information already given. Matching is properly used to check for fraud in information given by the subject in order to receive some benefit or pursue some other interest. Matching should not make use of, search for, or reconstruct information not already provided by the subject. This excludes "fishing expeditions," that is, wide-ranging searches looking for irregularities where there is no prior suspicion of wrongdoing.
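    The distinction this norm draws can be made concrete in a small sketch. The code below is purely illustrative, not drawn from any actual matching system; the function and field names (`verify_claim`, `income`) are hypothetical. The point is that a legitimate match starts from a claim the subject already made and checks it against one corresponding record, while a fishing expedition would sweep through records looking for anomalies with no prior claim to verify.

    ```python
    # Sketch of verification-only matching (hypothetical data model).
    # A claim the subject already made is checked against the one
    # authoritative record for that same subject; no other records
    # are searched.

    def verify_claim(claimed: dict, source: dict) -> bool:
        """Return True if every field the subject claimed matches
        the authoritative source record for that subject."""
        return all(source.get(field) == value
                   for field, value in claimed.items())

    # The subject declared an income of 40000 on a benefit application;
    # the tax record for that subject is consulted, and nothing else.
    claimed = {"income": 40000}
    tax_record = {"subject_id": "A-123", "income": 40000}

    print(verify_claim(claimed, tax_record))  # True: the claim checks out

    # A "fishing expedition," by contrast, would iterate over *all*
    # records looking for irregularities without any claim to verify.
    ```

    The design choice worth noting is that the function takes a single source record as input, so the caller cannot use it to trawl a whole database; the restriction the norm demands is built into the interface.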

  4. Subject access. Subjects should know about and be able to inspect the data held about them, and they should be able to correct any inaccuracies or gaps in it. This means that the existence and purpose of any databases holding personal data should be publicized, especially to the subjects they refer to. Subjects should be able to inspect the data in their records, except what must be withheld to protect the privacy and confidentiality of others or for national security. There should be procedures that allow subjects to challenge the information held on them and provide corrections and additions.

    When a decision is made based on personal information in a database, such as the decision to refuse credit to someone, the subject should be informed of the source of the information and how it was used in the decision.

  5. Due process. Benefits should not be denied or other action taken against a subject based on a matching procedure or other analysis of personal data until the subject has had a chance to understand and challenge the procedure and the data it used. Automated procedures and the data on which they are based are too unreliable, acontextual and insensitive to human realities to be the sole basis of punitive action.

  6. Locality. Personal data should not be disseminated any further than is necessary for its legitimate and intended use. This is necessary to support some of the other norms. Where data is available it tends to be used, whether for unauthorized or prejudiced judgments, for demeaning publicity, or for blind searches. Moreover, the further information travels from its source, the more likely it is to be distorted, misunderstood, or taken out of context.

    Most important, out of respect for the subject, personal information ought to be accessible only to those who have authorization and a need to know.

    One corollary of this principle is that the Social Security number should not be used as a universal identifier. As it is, an enormous amount of data, of many different types and from many different sources, is accessible through the Social Security number. It is used as an identifier for medical records, bank accounts, credit records, tax returns, voter registration, school records, employee records and many other purposes. On the one hand, the SSN is often treated as a secret password, in that it is sufficient to give someone's Social Security number in order to obtain that person's records. On the other hand, it is very public, in spite of recent efforts to make it less so. It is often printed on forms and letters, posted on lists and easily obtainable as part of a credit check. The Social Security number is widely used and widely disseminated, and most of the rest of a person's data follows right after it.3 Furthermore, because the Social Security number is used as an identifier in so many disparate databases, computer matching of data from totally unrelated sources, often a questionable practice, is very easy. Yet the Social Security number is easy to forge or corrupt and impossible to verify, so the reliability of those matches is often dismal.

    Locality demands that each type of personal data have its own system of identification and that the identifiers be safeguarded with the same care as the data itself.
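    One way such per-domain identifiers could be produced, sketched below under stated assumptions, is to derive each domain's identifier from a keyed hash (here HMAC-SHA-256 from Python's standard library). The key, domain names, and sample number are all hypothetical; the point is that records keyed this way in unrelated databases share no common identifier and so cannot be joined the way SSN-keyed records can.

    ```python
    # Sketch: deriving a separate identifier per data domain with a
    # keyed hash (HMAC-SHA-256), so no single number links a person's
    # records across unrelated databases. Key and domain names are
    # illustrative; each domain would guard its own secret key with
    # the same care as the data itself.

    import hmac
    import hashlib

    def domain_id(person_id: str, domain: str, key: bytes) -> str:
        """Derive a pseudonymous identifier specific to one domain."""
        msg = f"{domain}:{person_id}".encode()
        return hmac.new(key, msg, hashlib.sha256).hexdigest()[:16]

    key = b"example-secret-key"  # hypothetical secret
    medical = domain_id("123-45-6789", "medical", key)
    credit = domain_id("123-45-6789", "credit", key)

    # The two databases hold unlinkable identifiers for the same person.
    print(medical != credit)  # True
    ```

    Because the derivation is keyed rather than a plain hash, an outsider who knows someone's SSN still cannot compute or recognize either identifier without the domain's secret key.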

  7. Accountability. Any organization that collects, stores, analyzes, uses or distributes personal data should be held responsible for the accuracy, completeness, timeliness and security of the information entrusted to it. Those who handle personal data have been given a public trust and ought to be held accountable. They must have adequate procedures for verifying and maintaining the accuracy and completeness of the data they collect. They must make sure the data is kept current by updating it as the subject's status changes. They must guarantee that the data is available only to those who are authorized to have it and that the authorization extends no further than necessary.

    There should be an authority to oversee the activities of the data handlers, which can hold them accountable. Moreover, there should be remedies for those subjects who are injured by malicious or irresponsible maintenance or use of their data. The burden created by error should be borne by the data handler, not by the subject.

Michael McFarland, S.J., a computer scientist with extensive liberal arts teaching experience and a special interest in the intersection of technology and ethics, served as the 31st president of the College of the Holy Cross.
1. The Equifax Report on Consumers in the Information Age (Atlanta: Equifax, Inc., 1990), p. II.
2. "ACM Code of Ethics and Professional Conduct," Association for Computing Machinery (1992), Sec. 1.7.
3. Simson L. Garfinkel, "Risks of Social Security Numbers," Communications of the ACM 38(10) (October 1995): 146.
June 1, 2012
