Traces of Ourselves: The Ethics & Politics of Databases
The following is a transcript of a panel discussion held at Santa
Clara University May 16, 2006, and sponsored by the Markkula Center for
Applied Ethics and the Center for Science, Technology and Society.
- Mitch Kapor, founder of Lotus Development Corp. and president and
chair of Open Source Applications Foundation
- John Arquilla, professor of Defense Analysis at the Naval Postgraduate School
- S. Leigh Star, senior scholar at SCU's Center for Science, Technology and Society
- Kirk Hanson, executive director of the Markkula Center for Applied
Ethics, panel moderator
KIRK HANSON: Welcome to this evening's panel discussion: Traces
of Ourselves: The Ethics and Politics of Databases. Let me just make one
or two comments to set up the panel discussion tonight. The news last
week that the NSA was compiling a database of all the phone calls made
inside the U.S. and then using data mining techniques to search these
records makes it clear why this discussion is important. Many profound
ethical and political issues have been raised by this development and
others regarding databases.
How do we balance privacy and national security? Will these new tools
of electronic record keeping and data mining be used to preserve American
freedom or diminish it?
One of the scholars working in this field cited eight trends in databases
that set the backdrop for this:
- Number one, the increasing size: they're getting bigger, as we see particularly with the NSA database.
- Number two, they're compiling and putting together increasing amounts
of personal information about each of us.
- Number three, they're increasingly invisible, collecting their data
by absorption and somehow in hidden ways.
- Fourth, there is the increasing circulation and sharing of those databases.
- Five, there is increasing coordination, matching and applying databases
to one another.
- Six, there's the increasing commercialization, i.e. selling of information
from these databases.
- Seven, increasing sophistication of access mechanisms, which move away from merely querying the data and instead find ways of identifying patterns in the data.
- And the last trend is, there's a focus on improving security for the
database owners, but not necessarily for consumers and those whose data
is in the databases.
That's just a sense of the trends that set the backdrop for this discussion.
What do we make of the new phenomenon of databases that will play such
a major role in our lives in the years ahead?
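The fifth trend, matching databases against one another, is worth making concrete. The following toy sketch (with entirely invented records and field names) shows how two individually innocuous tables, once joined on a shared key, yield a profile neither contained alone:

```python
# Two hypothetical tables; all records and field names are invented.
phone_accounts = {
    "555-0101": {"name": "A. Doe", "zip": "95053"},  # subscriber registry
}
call_log = [
    {"number": "555-0101", "called": "clinic", "minutes": 22},
    {"number": "555-0101", "called": "lawyer", "minutes": 41},
]

# Matching on the shared key (the phone number) fuses the two tables
# into a per-person profile that neither table revealed by itself.
profiles = {}
for rec in call_log:
    owner = phone_accounts.get(rec["number"])
    if owner:
        profiles.setdefault(owner["name"], []).append(rec["called"])

print(profiles)  # {'A. Doe': ['clinic', 'lawyer']}
```

The same join, run across millions of rows and many more tables, is what turns the circulation and sharing of databases into the coordination described above.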
We have an extraordinary panel to talk about this and to raise for us the questions that have to be matters for our public agendas in the years ahead.
Mitch Kapor is the president and chair of the Open Source Applications
Foundation, a nonprofit organization he founded in 2001 to promote the
development and acceptance of high quality application software, developed
and distributed using open source methods and licenses.
He's widely known as the founder of the Lotus Development Corporation
and the designer of Lotus 1-2-3, perhaps the first killer app, which made the personal computer ubiquitous in its usefulness for business during the 1980s. He's been at the forefront of the IT revolution for a generation
as an entrepreneur, an investor, a social activist and a philanthropist.
In 1990 he co-founded the Electronic Frontier Foundation and served as its chairman until 1994. EFF is a nonprofit civil liberties organization
working in the public interest to protect privacy, free expression and
access to public information online as well as to promote responsibility
in the new media. EFF is presently involved in a lawsuit against the federal
government over the data gathering activities of the NSA.
John Arquilla is professor of Defense Analysis at the Graduate School
of Operational and Information Sciences at the Naval Postgraduate School
in Monterey in California and is one of the leading experts on the use
of IT by the U.S. military. He is appearing here today not as a representative of the Navy; his views are his alone and not those of the U.S. Navy.
He's been a consultant to the Pentagon on the Total Information Awareness
data gathering project. He's a senior consultant for the Rand Corporation
and frequently appears as a commentator on TV news and the pages of such
papers as the San Francisco Chronicle. Professor Arquilla is the
author of many books and works, including Networks and Netwars: The Future of Terror, Crime and Militancy and In Athena's Camp: Preparing for Conflict in the Information Age.
Our third panelist, S. Leigh Star, is senior scholar at the Center for Science, Technology and Society at Santa Clara, where she's also a visiting professor
of computer engineering. She's been professor in the Department of Communication,
U.C. San Diego and a professor of information science at the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign.
She is president of the Society for Social Studies of Science, the 4S organization, an international group that coordinates research
on science, technology and society. For many years she's worked with computer
and information scientists and has also studied work practice organizations, scientific communities and their decisions, and the way that they use information.
She's written extensively on these topics, including such works as Boundary
Objects and the Poetics of Infrastructure, and with Geoff Bowker,
Sorting Things Out: Classification and Its Consequences, which
particularly applies to databases.
Our first speaker will be Mitch Kapor.
MITCH KAPOR: Thank you so much for inviting me to participate
in the panel. The Markkula Center, I assume that's Mike Markkula?
MALE VOICE: It is.
KAPOR: I want to frame my remarks by talking for one moment about
the origins of the Electronic Frontier Foundation and why we started it
because they bear on this issue of warrantless wiretapping that we're
facing right now. EFF was founded by me and by John Perry Barlow to protect
the rights of citizens in the digital realm, to protect civil liberties,
freedom of speech and privacy, freedom from unreasonable search and seizure
on the Internet and on computer networks.
Barlow, who was the more poetic of the two of us, used to say we
brought the Bill of Rights into cyberspace, suggesting maybe that that
was some special physical realm distinct from the world that we actually
lived in. I would always come on after John and explain to people, "Well
what John really meant to say," and I would try to calm people down
and talk about the imbalance that exists when the government overreaches
in the face of issues and problems. In 1990, that overreaching was in
the prosecutions of computer hackers who were breaking into computer networks
and causing problems.
The big issue was that there was no understanding and no proportionality
about what the appropriate response was to that because virtually all
of that was the equivalent of vandalism-not 100 percent of it, but most
of it. Yet it was being treated like it was some sort of crime of violence,
and they would go after the perpetrators-this is the Secret Service and other government agencies-in an attempt to lock them up and throw away the key.
We felt a moral urgency about intervening, in fact creating an organization,
raising this discourse and saying, look, civil liberties are at stake
here; we can't just pretend they don't exist. Ignorance of the new technology
is not an excuse.
We did a lot of good because it changed the dialogue; it legitimized the
kinds of questions that we're asking ourselves today. On the one hand
we could put the current controversy about warrantless wiretapping and the data mining of telephone records into very much the same frame as the high rhetoric about hackers in the early 1990s.
It is actually pretty outrageous that unbeknownst to any of us, the phone
companies-perhaps not all of them, but presumably AT&T, Verizon and
others-were, in secret, without telling us, violating our agreements with
them as customers, turning over, without any legal process whatsoever
the record of all of the calls each and every one of us has made for
some unspecified time.
There was an AT&T employee who came forward-this is all alleged; it may or may not actually all come out, but this is the way the story is coming out. EFF got involved, there are multiple lawsuits, and we could
easily feel really violated to think that a record, not the calls themselves,
but who we called and when and for how long, and the entire pattern of
that is sitting in some set of computers.
You don't have to be paranoid to imagine that various very bad things
could happen besides the stated purpose of "catching Al Qaeda",
because if you understand the history of what has happened with that kind
of surveillance and analysis without accountability, there's a long pattern
of abuse in all the areas in which this has happened. I won't go into
it, but it reinforces the notion that we can't and shouldn't trust government
and there's a sort of flagrant flouting and abuse of the laws that were
put in place, and we have to stop this. It's important for everybody to
get on top of it because this is so utterly disproportionate and beyond
the bounds of what is legal and what is appropriate.
You might stop and say, that's all there is to it. After several years
of EFF, I came to the conclusion that all this was certainly the case,
and I'm a supporter of the lawsuit that EFF is bringing. I'm not an officer or on the board of it right now. I also think there are aspects to this which
are extremely important to bring out, and that's what I want to spend
the last four or five minutes on.
I don't think that the kind of framing that I gave you is all there is
to say about this, and the problem is, in the current, highly partisan
atmosphere, it is impossible to get to a set of questions that I think
are very, very important. You have spin now on all sides. Government makes
their case: "We're going after the bad guys. You have to trust us.
Believe me, we're not doing anything we shouldn't be doing here."
Then the civil liberties people say: "This is the law, you broke
the law, you can't do this, this is black and white, open and shut."
What doesn't get asked is a set of questions like the following: Are the
current laws suitable for an era in which technology has changed very,
very rapidly? If you understand and I think we all do, that the technology
has changed dramatically, maybe the best way to put it is: It is now possible
to do various types of wholesale activities including data mining and
monitoring of lots and lots of things at once. The laws covering wiretapping
were written in the era when there was the FBI guy up in the attic with
a set of headphones on. It is an interesting question to ask whether we
need to be looking at the laws and are they still suitable for the purposes
for which they were intended or not.
Another question which never gets asked is what our legitimate security
interests are, and what does security reside in? I'm very unhappy to have
national security defined by George Bush and Dick Cheney. It forces me
to support organizations that take them to court for breaking the law
and maybe impeaching them. It wouldn't be a bad thing, but it does not
leave any room for a discussion about where is security? What is the meaning
of it in this day and age? How do we go about getting that? What have
we done? What do we need to do differently?
It's not even on the table and it's not in Congress either. Something
is really lost when the atmosphere is partisan, which it is, when the
laws are broken, when the technology changes rapidly. What we've really
lost is the democratic discourse about this, the ability to talk with
as much openness as possible. I would agree, not everything in these matters
can be completely open, but I think it could be much, much more open than
it is to talk about what the laws should be, what our interests are, and
in a way that people on both sides who have interests are not attempting
to club the other side into submission or appeal to popular emotions,
but are really trying to get at a deeper understanding of how we're going
to live together as a society and how we secure ends which are very important
to us, which is a measure of security and a measure of privacy.
My quest in this is not to be a partisan for one side or the other (although,
because I have an opinion and I've told you what it is, it would be incredibly
insincere to say that I didn't have a view on this) but to ask how can
we as a society move beyond this very stuck state we're in right now and
get a little bit deeper in terms of our understanding of what the issues are.
JOHN ARQUILLA: Thank you all for being here tonight. My personal philosophy, aside from staying hydrated, is that a well-informed public is the basis of sound policy, and I want to second Mitch Kapor's comments on
that point. This is an issue that lies well beyond red and blue. It is political, but I think it must be approached in a nonpartisan way if we're
going to get any purchase on arriving at good solutions.
I'm also grateful to be speaking in the setting of an ethics center. Why
is it good to be speaking in an ethics center? Because ethics for me is
about living in the gray zone. It's not easily defined as black or white.
Ethics is the light that pokes a little hole in the darkness, as Robert Louis Stevenson once said about the lamplighters in foggy London.
It gives us a handhold for thinking about where to find that equilibrium
between security and privacy. That's not a firm, clear line, particularly
not in the time that we live in. We're now in the fifth year of the first
great war between nations and networks-nations on one side, a variety
of networks on the other.
Networks don't have a particular homeland; they're distributed in many
different countries, including this one. The nations are following the
old traditions of many centuries and obviously continuing to do so in
the years since this war began. We've basically kept trying to solve the
problem of terror networks by attacking other nations, and I'm here to
suggest to you that that's simply a case of taking a hammer to a ball
The problem with databases is very much like that as well in terms of
the approach that has been taken, which is to gather up every last bit
of information you possibly can and then begin to try to find something
useful in it. While I'm very sympathetic to Mitch's point about the civil
liberties issues, I would add to it the practical issue, which is that
submerging yourself in an ocean of data is hardly the way to do good pattern
analysis or traffic analysis of what the bad guys are up to. In fact by
getting the calling records of all of you and me and others, what are
we doing? We're going to create a situation where we will generate more
of what are called false positives than Carter has liver pills. I think
this is going to be a setback for the intelligence gathering process.
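The false-positive worry here is the classic base-rate problem, and a little arithmetic shows why. All numbers below are illustrative assumptions, not figures from the panel: even a screen that is 99 percent accurate, run over an entire population, buries the real targets under millions of false alarms.

```python
# Illustrative base-rate arithmetic for mass screening.
# Every number here is a hypothetical assumption, not a real figure.
population = 300_000_000    # callers screened
true_targets = 3_000        # actual bad actors hidden among them
sensitivity = 0.99          # P(flagged | target)
false_positive_rate = 0.01  # P(flagged | innocent)

true_hits = true_targets * sensitivity
false_alarms = (population - true_targets) * false_positive_rate
precision = true_hits / (true_hits + false_alarms)

print(f"true hits:    {true_hits:,.0f}")     # 2,970
print(f"false alarms: {false_alarms:,.0f}")  # 2,999,970
print(f"precision:    {precision:.2%}")      # about 0.10%
```

Under these assumptions, fewer than one flagged caller in a thousand is a genuine target, which is the sense in which indiscriminate collection can set the intelligence-gathering process back.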
Let me urge you all and all of us to listen in the coming days for some
discussion of people in officialdom saying, "Well we're not just
randomly going through all of this data. We wait until we have some kind
of clue and then try to focus on that clue and then access the data."
In previous decades this was commonly done on the basis of getting a warrant
first to do it. Here, as Mitch rightly points out, we have an ambiguous
legal situation and a fairly rapacious attitude on the part of many in
the intelligence community about grabbing whatever they can.
They have approached the problem and cyberspace in a very traditional
way, but the electronic frontier is a little bit different from what came
before and submerging yourself in this data is, in my view, going to be
highly counter productive.
How could we do things a little better? I think the first step indeed
is to try to open a dialogue that goes beyond red and blue. I go all over
this country talking to audiences and I'm struck by how many people already
know what they think, and I wonder why I go talk to audiences if they
already know what they think?
I believe my job is to raise questions. You already raised the important
question about this equilibrium between security and privacy. I think
another important question is what is the most efficient way to go after
our adversaries? In this regard I'll suggest to you that I believe data
mining has enormous potential. Our opponents are distributed over 60 countries.
They move their money all over the world. They move information about
cover, safe houses, training, recruitment etc. They live on that electronic
frontier. In some respects I think of it as an electronic wilderness.
When I walk on the beach in Monterey, I always have this sense that I'm
on the edge of the largest wilderness in the world, the ocean, right?
There is a virtual wilderness too and it's out there in this infosphere,
lots of which is comprised of cyberspace. How does one navigate this?
What are the most effective means? Data mining can be very, very good
if it is coupled with excellent field investigation and other tools that
are used in the intelligence business. As I suggested before, used by
itself it will be very counterproductive, but used in conjunction with
careful intelligence and police work, the potential is enormous.
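One way to picture "data mining coupled with careful intelligence and police work" is seed-and-expand link analysis: field investigation supplies a few known suspects, and the records are explored only a hop or two outward from them rather than wholesale. A minimal sketch, with invented call records and names:

```python
from collections import defaultdict

# Hypothetical call-detail records: (caller, callee) pairs, all invented.
calls = [
    ("suspect_a", "contact_1"), ("suspect_a", "contact_2"),
    ("suspect_b", "contact_2"), ("contact_2", "contact_3"),
    ("neighbor", "pizza_shop"),  # unrelated traffic stays unexamined
]

# Undirected adjacency list of who talked to whom.
graph = defaultdict(set)
for a, b in calls:
    graph[a].add(b)
    graph[b].add(a)

def expand(seeds, hops):
    """Return everyone within `hops` calls of the seed suspects."""
    frontier, seen = set(seeds), set(seeds)
    for _ in range(hops):
        frontier = {n for node in frontier for n in graph[node]} - seen
        seen |= frontier
    return seen

# Seeds come from fieldwork; the analysis never touches the rest.
leads = expand({"suspect_a", "suspect_b"}, hops=2)
print(sorted(leads))
```

The design point is the contrast being drawn: the expansion is bounded by field evidence, so the unrelated "neighbor" traffic never enters the analysis at all.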
I'm sorry to say to you that I can't give you many examples of things that have worked. I can give you a little bit: Khalid Shaikh Mohammad is one
of those people we were watching and doing the traffic analysis on. He
was very senior in the Al Qaeda organization and after a point there's
always an intelligence decision to be made, to keep watching or to capture
somebody. After a certain amount of time he was captured and even more
information was gathered. There are a handful of those. I can't really
give you more of those that are not in the public record, lest you all
be detained, and I wouldn't want anyone else to be guests at Guantanamo.
There is a place for data mining. The bête noire of all of this is Admiral John Poindexter, who graduated first in his class almost a half century ago from the Naval Academy, was a computer scientist when
computers were as big as this room and thought about these issues and
continues to think about them. He has a built-in ethical meter (which, if you remember his past from Iran-Contra, I guess sometimes gets switched off), but that built-in meter is his wife, who is an Episcopal priest, and
he's a lively, intelligent man. I've done a little bit of work for this
TIA program, which started out as Total Information Awareness and became, later on, Terrorism Information Awareness.
TIA is sort of the dark side of the coin from EFF. In any event I don't
think I successfully encouraged TIA in the right directions and was sacked
from the project shortly before it was totally shut down. You may recall
the final nail in the coffin was creating an electronic marketplace where
people could bet on nasty events, like the assassination of particular
leaders, and this was a mayhem future, as I called it. It was a ridiculous, tone-deaf idea.
At the heart of TIA is the notion of being able to move in that electronic
wilderness in wise ways, because that's really the only way we're going
to get at our opponents. We spend nearly $1 billion every single week
on intelligence in this country, most of it for satellites, big
ears that can listen to things and cameras that can take pictures of ships
and planes and tanks.
In other words, most of it is spent for things that are of absolutely
no use to us anymore. Part of this nonpartisan debate I'd like to see
emerge, would be some discussion of how to reformulate American intelligence.
Now unfortunately our big organizational change was to create yet another huge hierarchy, the DNI, which simply reinforces the old ways of thinking,
and frankly just a few pennies on each dollar of the intelligence budget
goes to help us explore that electronic wilderness. That is criminal negligence
in my view.
You mentioned hackers earlier, Mitch, and I think it is a shame in this
country that a hacker can do more time than an armed robber, in many cases.
In fact the relations between civil authority and hackers are terrible,
and they're almost as bad as those between labor and management in hockey.
It's a terrible, terrible situation. Why is this a tragedy? Who are the
rangers of that virtual wilderness? Who knows where the information is
better than anyone else in the whole world? The hacker does.
Thankfully, my masters have given me permission to get to know some of
the people in that community; the number of true master hackers who move
through virtual space the way you and I walk through this room is actually
very small. They're remarkable people who are drawn to the beauty and
complexity of cyberspace, and they're people who'd like to help, for the
most part. However we have this terrible relationship, and I've suggested,
including on the record, both written and spoken, that we need to recruit hackers and treat them today like we did the German rocket scientists after WW2.
I don't want to betray my age, but when I was a young lad, about 50 years ago, my great hero was Wernher von Braun, the great German rocket scientist
who gave us the space program. What was going on? Until the end of WW2
he was bombing the allies with ballistic missiles and cruise missiles,
the first ones, and yet he became a great American hero. I remember reading his biography, I Aim at the Stars, and the subtitle should have been: But Sometimes I Hit London. That didn't stop von Braun from
becoming a great American hero. The hackers don't have to become great
American heroes, but they should be cultivated the way the rocket scientists
were. We weren't the only ones hiring rocket scientists after WW2; so
were the Soviets and that was of course fuelling a great arms race. Right
now there are others who are racing, who are recruiting master hackers
and we're not. We're not in that race at all yet, and it seems to me one
of those terrible missed opportunities especially because we could both
end the pariah status of people who, again, are drawn out of intellectual
curiosity to this realm that is so very, very important to us.
I'm going to try to follow Mitch's example and not go too much longer,
so let me just share a couple of thoughts here and I'll give way to Leigh.
The war we're in now has a dynamic different from any other war. Warfare
used to be about the clash of massed forces who would grapple with each
other and try to find some advantage: turning a flank, attacking by surprise, and so on.
The dynamic of this war is not about the clash of masses, of "ignorant armies on a darkling plain," as Matthew Arnold said. This is a war
where the fundamental dynamic is hiders/finders. If you have any sense
of a lack of urgency about this, let me just ask you this: We're in the
fifth year of this war. We know that all kinds of Hoover vacuum data collection
has been done on the bad guys. So, how are we doing, as Mayor Koch used to ask?
If this indiscriminate data vacuuming approach actually worked, we'd already
have won the war. Let me go back to the WW2 example: From very early on
both the Japanese and German codes were broken. There was a huge information
advantage. It wasn't just the triumph of mass against the Axis Powers.
It was a triumph of information and knowledge, driven by early forms of
high performance computing.
Our performance today is shameful in that realm. We know a thimbleful.
Yes, we have a few triumphs here and there, but in the main we could have
had a V8 these past several years; we could have done much, much better.
There is, I think, some time to do better, but don't be lulled by the
notion that "this is a long war," as Secretary Rumsfeld called
it. I don't think we have the luxury of decades here. Why is that? How
long will it take before a terror network has true WMDs? How many nuclear
weapons would Al Qaeda have to possess in order to coerce us into whatever
they wanted? One? Or zero if we believed they had one or two? It doesn't
matter that the Russians have 7,000 strategic warheads today because if
they hit us, we hit them. What's that insurance company, Mafia Mutual?
You hit us, we hit you.
That doesn't obtain when your opponent is a network. You've got nothing
to retaliate against. Indeed if there's ever going to be a nuclear Napoleon,
he's going to be in a network. I also want to suggest to you the urgency
to figure out this equilibrium between security and privacy and to get
serious about doing the data mining that will help us track financial
flows, the movement of operatives, and give us a chance to win this hiders/finders
war that we're in the middle of, that we're so bad at so far.
Maybe it will encourage us to stop trying to beat a network by attacking
other nations; that doesn't seem to have worked very well either. It's
not so much a war on terror as terror's war on us.
One closing thought: We're going to hear a lot about Big Brother in the
days ahead and it is Orwellian and it's inefficient and it's behavior
that has to stop. I think there is a model already out there and that
is Little Brother. Little Brother in the commercial sector already data
mines in innovative, often brilliant ways.
That's a model worth exploring. It's one of those areas where we can find
that equilibrium point, learn from the best practices that are out there
and by all means and above else, clean up our own act first. Thank you
for your patience with me.
S. LEIGH STAR: I'm going to deviate a little from what has gone
before, but I think in the same spirit. Both Mitch and John have looked
to the past and to whether the present technology that we have is up to
what we're doing. I've spent some time in the past asking those questions
about big old forms of IT that often are not very high tech, or by working
with people who are making this stuff or using this stuff and trying to
find out if they're talking to each other, which they aren't. It's also
an old question in terms of social sciences. I'm a sociologist, and there
are lots of aspects of the debate that's happening now that relate to
the oldest question of all: What makes us free and what makes us caged?
Max Weber, a sociologist, talked about the iron cage of bureaucracy at
the end of the 19th century, beginning of the 20th century, and by that
he meant, can we ever be free now that we've built all these incredible
bureaucracies and reporting systems and lists and ways that constrain
us from our natural selves? That became a phrase that is, to this day,
very common in sociology.
We also have in the 20th Century a lot of hype, and hype, as I've often
said, is one of the conditions that we work in. But there's also a lot
of truth to new worlds of free information and knowledge through connectivity.
I Google, you Google.... It makes a difference to daily practice. We're
also at a moment, as both the previous panelists have talked about, where
there's a loss of privacy, a new kind of iron cage of data mining and
surveillance, new infrastructures, not just new systems coming together
that are newly convergent. [Speaker plays an online video from the ACLU about ordering a pizza and privacy erosion.]
One of the things I've been doing over the past few years is asking a
question: What is infrastructure anyway? What does it mean when these
different kinds of small and large infrastructures begin to converge?
I have a lot to say about it, but the most important thing for this talk
is that infrastructure is a very complicated thing. It's not just like
the plumbing to the water that you turn on; it's not just the roads that
you drive the car on or the railroad that you ride on.
It's learned as part of membership, and you learn it as a member of a
community of practice, whether that be as a child, as we usually do, or
later on if you move to a new country, you know that you have to learn
every little piece of new infrastructure. I remember completely losing
it after moving to England when a key to the Fiat that I just bought didn't
work like any key I'd ever seen in America. I just went hysterical and
walked two miles home up a hill because I couldn't take one more thing
that wasn't the same as the infrastructure I was used to.
Nobody is really in charge of all infrastructure. There's no centralized
processing bureau to deal with infrastructure and partly for that reason
and partly because it wasn't built that way, it can't be fixed globally;
it's fixed in modules. All the things that we're talking about here have
these qualities, which are important when we're thinking about new laws,
new ways of doing infrastructure; making it something that's alive or
at least something that we can be negotiating about.
I've learned most of that originally from the Americans With Disabilities
Act and from the disabled community: One person's infrastructure is another's
barrier. We know, therefore (and this is speaking metaphorically but not
that metaphorically), people in other infrastructures create new workarounds,
new infrastructures. This is a wonderful thing. I just actually found
today on Google a 3D imaging program that detects the body heat of someone sitting in a wheelchair waiting for an elevator, so they don't have to press the buttons, which is often problematic for people in wheelchairs.
This [referring to a slide] is somewhere in West Africa just going along
or against the traffic, or as the case may be, using a wheelchair in a
way we don't often see here, or a standing wheelchair. People who live
in different relationships to different infrastructures create different workarounds.
I want to step back and look even further at some even more basic processes
related to these kinds of questions, not specifically wheelchairs and
not specifically pizza, but about infrastructures, and in this case especially
about information infrastructures. Geoff Bowker and I have spent many
years studying large scale infrastructures, as I mentioned, old and slow
infrastructures. One of the things that we studied was the formation of
a global system of data collection for diseases: the International Classification of Diseases.
I'm going to talk about four F's [in this regard]. The first one is forming.
This is a moment in time where there are a lot of alternatives about how
to speak about [illness, for example] the process of getting old and a
lot of different adjectives, a lot of different modes and feelings that
subsequently have gone away. This is from the 1930s: a tabular list of
conditions related to old age and senility. I really feel a loss because
you no longer can die of being worn out, and I think we should bring that
back, especially those of us that work in universities. You can't even
die of old age; you have to have some kind of specific cause: a germ,
a gene, a condition, an accident.
The second kind of F is for processes of fossilizing. Now this doesn't always happen, but it happens frequently enough, and it is at the moment this begins to happen that surveillance coupled with fossilizing becomes urgent to us. By fossilizing I mean that different information systems begin to delete alternatives, like you can't die of old age anymore.
This is a picture of a passbook carried by a black African man under Apartheid,
and under that regime there were four and only four racial categories:
white, black African, Asian or colored, which was some mixture, and there
were privileges given to people according to race. I could go on about
that, but data was filed by those four rigid areas, collected by those
areas and in addition, speaking of fieldwork and getting out there, people
were slotted into living areas, types of jobs, access to education. It
was a life and death condition for people that lived under Apartheid-sometimes
for white people, mostly not.
The third F is fermenting. The consolidation of different points of view, of different worlds, sometimes leads to two fossilized worldviews coming together, or to something like the loss of a category such as senility. Sometimes social movements can overturn aspects of fossilized systems. In 1973 the American Psychiatric Association demedicalized homosexuality. It was no longer automatically an illness to be gay or lesbian.
AIDS is another example of the kind of things that ferment around
different fossilized forms, particularly of medicine, which is what I
know best. There are always local workarounds for data entry and retrieval,
most of which never make it into the formal record and which you can only
know if you work in a place or if you're a sociologist that has an eye
to such things.
This is such a perfect example of a workaround I couldn't resist it: These
are high heels. Often women are forced to wear them, or at least in the
past were; I don't wear them anymore. But what's interesting is yesterday
in my mail came Nordstrom's latest catalogue which has in it six different
devices that are workarounds for helping you wear these things, which
are not made to be worn by any human feet. These little pads and things
you can put between your toes and things that stop the pain of the strap,
you get the idea. I mean shoes are funny, but in this case it's also a metaphor for large-scale information systems, where standards that don't work for most people's feet, or most people's whatever box you want to tick off, are met with concrete workarounds. I'll finish this very quickly with an anecdote.
A friend of mine who's an academic in Massachusetts undertook to be psychoanalyzed,
which is very expensive. When she first started it, Massachusetts would pay for about 10 sessions a year, still a considerable amount of money. Every year she and her therapist would get together and decide on the category that they could put down [in her medical record] so that if, in the future (and they were thinking of this kind of subpoenaing of records or data mining), she were targeted for any reason, what they put down wouldn't be too bad to find on her record. You don't want schizophrenia, etc. They finally found the perfect category, which is obsessive-compulsive. Now those of you that are professors will
know exactly what I mean, those of you that have ever been to college
or had to work with us also know that you will never get fired for being obsessive-compulsive in your job as a professor.
The last F is fissioning, where many different kinds of category systems (and I don't mean this in the Maoist sense of letting a hundred flowers bloom), many different kinds of data collection systems, travel side by side and are used side by side.
Skipping over this quickly, I'll just speak to one which is the question
of ethnic data collection and whether it's a good thing or a bad thing.
This year, of our incoming class, 23 percent of students declined to state their race, which is a double-edged sword. For those of us who believe
one should never be identified by race in the best of all possible worlds,
that's good. From the point of view of identity politics and people trying
to figure out what kind of school they're going to be coming to and all
kinds of other things, including humanitarian ones, which are pretty big
on this campus, it's a terrible thing. So how do you balance those two
things? This tradeoff is working everywhere, not just at SCU but NIH and
the Census Bureau, lots of different kinds of places. It's a really serious issue.
Let me finish. I'm not a Catholic, I'm not even a Christian, but here
I am and I work here and I've learned some things from the people around
me. I should attribute most of this to John Staudenmaier, S.J., who spent
a year last year at our center. I've learned a way of thinking that might
be a little bit more on the gray side, a little bit more on the ethics
side, that I do feel awards this particular group the status of community.
What John taught me is that Jesuits believe that the world is good before it is evil; that knowing (knowledge, I guess you could also say information) and initiation are lifelong; that darkness and uncertainty may be holy or obscuring or both; and that discernment with trust keeps knowledge alive, as it has
in this community for more than 400 years, so thank you very much.
Questions and Answers
Audience members presented written questions to the moderator, Kirk Hanson, who fielded them to the panelists.
HANSON: Let me ask one of these questions which is a very broad
one at first, which I think will set up the others and that is, should
we no longer expect to have privacy in the Internet age and the database
age? Or will technology provide us a way, eventually, of indeed achieving
both the benefits of databases and our own personal privacy? How do you
all see the future at this point? Is there an inevitable trade-off between the two?
KAPOR: It's always mystified me that there are a minority of people
who think that we should just get over the idea of having any privacy.
Included in that group would be science fiction author David Brin and
the former CEO of Sun Microsystems, Scott McNealy. I mean it seems to me
that privacy is something worth struggling for because people have a right
to be let alone. It feels like a fundamental freedom not to have to conduct
one's entire life in the sight of goodness knows who.
I guess people differ about what they feel is important, and people also
differ about whether they think there's some technological determinism
here, that, whether we like it or not, would make privacy impossible.
I no longer believe that technology determines anything. It's what we
invent and what we choose to do with it.
I think we want to keep our privacy, even though the boundary of what
is private and not is certainly going to change. I'm not a historian or
a scholar, but I understand in the Middle Ages, everybody lived in one
room and the boundaries between work and family life are not what they
are today. If you owned some sort of, not a business, but a craft or you
had apprentices, everybody lived together. We have boundaries that are
different today. No doubt boundaries will change again, but those people
who feel it's important to be able to have insides and outsides and give
people choices about them, have to do something to be prepared to see that the technology and the policy don't run in a direction that makes privacy impossible.
ARQUILLA: Let me just second that and go to Mitch's earlier point
about laws needing to catch up. We can have privacy if we choose to protect it, and I think that's the key. This point about technological determinism has to be avoided. You can't simply accept it; we do retain choice. I would
also say, in terms of the state of technology today, those who would encrypt
their data have a huge advantage over those who would intrude upon it,
and that means all of you, too, thanks to other people at Sun Microsystems
like Whit Diffie and some others who gave strong encryption to the people.
Even at a technological level, I don't think the game is over either,
but we have to begin, I think, with the choice that privacy is a value
and it's one that we will protect.
STAR: I think like other values it's unbound in the abstract.
It always exists in relationship to something, some action, some community.
We need to think about that. I also think we don't need to be killed by the death of 1,000 cuts. Just this week I got my 11 millionth thing from
Capital One giving me a line of credit, and I finally said, all right
I'm going to do the opt out thing. The opt out thing took well over an
hour of my time, will cost me two stamps, figuring out where an envelope
the right size is, etc., and that's just one opt out. There's no global
delete as we might say in databases for a lot of these things.
HANSON: The second question has to do with the positive aspects
of data mining. John referred to Little Brother as perhaps a model. What
do you all see, each of you, as positive trends or positive uses of data
mining that point us to a better future?
KAPOR: Many of the most interesting new consumer Internet services
use a form of data mining in aggregating together lots and lots and lots
of individual preferences in ways that are more globally useful. I mean
[things like] the book recommendation function at Amazon: "people who bought this book also bought...," and it lists some titles. I think a lot of
people actually find that to be highly useful and that the trade off that
my Amazon purchases go into that database, although they're not known
to be by me because the data is all taken together in the aggregate, is
useful. We're really just at the beginning of that. This whole idea of the wisdom of crowds, that the appropriate aggregation of large numbers of individual preferences yields incredibly valuable information, is really at the heart of all the music recommendation services and a new generation of services that are helpful in doing content discovery on the Internet.
So, I think there's a good future in that with appropriate safeguards
as in, you can opt out of the system entirely and a safeguard that the
individually identifying information is guarded and protected. It's a
win-win, as one example.
ARQUILLA: I try to find good examples of data mining from other fields, and among those I am most impressed with is how a number of medical organizations are mining data. You probably know this better than I, Leigh,
but they are just doing remarkable work in terms of early pattern recognition
of threats to health but also in terms of the spread of effective practices
for dealing with pernicious diseases in many parts of the world.
I think another area a little closer to the world I live in has to do
with police, and this is not just American police, but international police,
who are becoming outstanding data miners. They provide a much better model
for how to gather intelligence about networks because they've been dealing
with criminal networks for quite some time. We all hear about how China
tries to control the Internet in their country, but the constabulary in
China is among the best in the world at data mining because they've had
to deal with these criminal networks, the Asian triads who've been around
for centuries and know all about hiding and lateral connection and small cells.
So there is a great deal to share with each other, and if we remembered that the war on terror has a word at the beginning, "global," and if we reached out, we'd find there are many examples of how to do things quite effectively.
STAR: I agree with all of that very much. There's another kind
of case, which is that if you are, for any reason, in a rare situation
and you know that there are others around the world, but you don't know
how to find them and they're not to be found in your locality, using types
of data mining with everybody's permission and correct precautions I think
can be really good. My mother passed on some years ago from a cancer of which there were only 400 cases in the U.S., and she found enormous comfort in
finding people from around the world who could speak to the minutia of
what that condition was.
HANSON: This goes back, John, to your comment just now about how other police forces may be doing a better job. There were several questions focused on how we defend ourselves from a network. Do we need to collect data on the behavior of all citizens? Or can we draw the line at something less than that, at those who are in some sense suspect or potentially involved in networks? Or is it inevitable that we have to collect data on all citizens?
ARQUILLA: No we don't need data on all citizens and in fact gathering
data on all citizens makes the job of finding terrorists harder. That's
my biggest problem with what's going on with this list of everybody's
phone calls. The real trick to beating a network is to build your own.
We're trying to defeat a network by using the national model. The President
made it very clear, April 13th, 2004, at a press conference when he said,
"We are going to win this war by turning other countries into democracies
by any means necessary." Of course in Iraq we have knocked over a
regime and created a hothouse environment that breeds terror. Again, whether
you're red or blue on this, it's pretty clear that changing nations is
not the way to fight a network, or it is at least an extremely fraught,
very costly, and problematic approach.
Far better, if you're going to gather this data, to make sure you share it around. Right now when I go back to Washington and talk with some senior folks (these are my own remarks, but I'll say it includes General Hayden among others), I say you've got to build your own network. The answer is: I'm already networked; I talk to all the heads. Now anybody who
knows anything about networks knows that it's not about talking to the
heads of the organizations; it's about people within the organizations
talking with each other. I won't belabor the point; we have to build our
own networks. Unfortunately since 9/11 the only two organizational changes
we've made have been to create huge, bulky hierarchies: the Department
of Homeland Security, which, already before Katrina, everyone knew didn't work. After Katrina that came with an exclamation point, and the world looked at DHS aghast. I think people already realize that the NID is equally
bulky and is not going to work either. It takes a network to fight a network.
HANSON: Do you want to make any comments, Mitch or Leigh about
whether we need to collect data on all citizens? Are there some types
of data that you believe are legitimately collected on all citizens?
KAPOR: People who know about IT and large scale systems can tell
you that one thing that would help is not collecting data on everybody,
but is getting the basic IT in government working properly, which it doesn't; the kind of ossified, hierarchical, and bureaucratic environments there make that a virtual impossibility.
You see, somehow there's a sleight of hand when the question on the table is: Do we have to listen in on everybody? It fails to ask the prior question, which is: Where does security reside? Some of it resides
in alternative types of structures-less bureaucratic, less hierarchical,
more networked, more distributed, for gathering, collecting, and sharing
intelligence, different ways of doing things that don't even cross the
boundary necessarily of this: we all have to give up our privacy.
We're the suckers in a con game if we find ourselves defending privacy against
the claim that this is what's necessary to fight the war on terror. I
would love to get to a point where really that is the question because
we've picked the low hanging fruit, we've done all of the important reforms
that we should be doing and we're being as effective as possible in a
realistic way. But until we get to that point, please, I say to people,
don't give me false dichotomies; that's political spin.
HANSON: There are a couple of questions about the realities of
engaging in war against terrorists and the potential for obviating law
or skirting the law based upon the war making powers of the President.
John, maybe you're more familiar with this. No matter what laws we pass,
will we always be subject to the exercise of the prerogative of the presidency
or of the war making authority to put those laws aside?
ARQUILLA: I think this comes and goes in history. Woodrow Wilson
made his academic reputation on a book called Congressional Government
that said Congress was too strong and the president was too weak. Then
along came Teddy Roosevelt, who was a very strong president. We've had
a century of quite strong presidents in the main. I think it's time for
the wheel to be moving back, the pendulum to swing back a little bit.
It's pretty clear, I think, that the presidency exercises too much authority.
I would say this in a qualified way though, and again beyond red and blue,
a lot of Democrats supported the invasion of Iraq. Many of them still do, just saying, well, we've got to do it better. They want to get that bigger hammer to go after the ball of quicksilver. I think this is a bipartisan
problem, and this is one where civil society is going to have to speak
up and call for this change, this redress, first elevating the debate
and secondly redressing the business.
I think the War Powers Act needs to be revisited. Congress has always
asserted more authority than it has been able to exercise. The president
has never accepted the authority of War Powers. If you look at that act,
you'll find that it allows the president to deploy force for only very
short periods of time. That's never been followed. I would say also, as an Italian American, it offends me that the last country the United States ever declared war on was Italy, and that was nearly 65 years ago. Why are we not declaring war on people if we're going to war against them?
The larger question of course is, can you follow the rules in a war where the enemy breaks all the rules, targets civilians, and hides in the vast sea of the global population? The answer is that sometimes (I think this
is somewhere in Thomas Aquinas) you can make a decision that [breaks the
rules] as long as it does more good than harm; you can take that action,
even if it is on the edge of that gray area in ethics.
For me, I'm not a Jesuit by proclivity; I'm much more drawn to Augustine,
who led a life of great debauchery before finding the light. I find that
this war is very much an Augustinian one and everyday at some point I
say his great, great prayer: "Save me Lord, but not yet."
STAR: It's actually, "Make me chaste, but not yet."
Just to pick up on the topic of whether data needs to be collected
on all individuals. First of all, what makes us think that that's possible?
The Census Bureau hasn't been able to do it.
Second of all, all information collecting is relative to a purpose. I think Mitch was saying about spin that it's just a false dichotomy, and what happens is that you erase purposes when you think you can collect data on all people, and that's just morally wrong.
HANSON: There are a couple of questions about the mindset that
has created this crisis at the moment. I'd be interested in your assumptions
about what motivates people in Washington or in the national security
environment to say we have to scoop up everything? Or, to add another
question here, what motivates those who abuse databases, commercial databases
or other non intelligence databases, and use data in abusive ways? What
are the mindsets behind the abuse in a national security environment and
in a commercial environment?
KAPOR: One assumption is that sometimes people with a responsibility
to go and solve a problem, whether it's in government services or in business,
take the attitude of, "I know what tools I need to get the job done,
so let me make those decisions. What you want me to do is get the job
done, and I will get the job done for you."
I just think in a democratic society that does not wash. I mean, what
are we fighting for and what are we fighting to save? Sometimes it's a
disagreement about the conditions under which the work is performed, and
I think citizens have a right to have confidence and understanding in
the main about what tools are being used and where the lines are. Sometimes
they do need to be crossed into a gray zone, but there needs to be a kind of accountability.
What I found and continue to find distressing is that that sense of the importance of accountability is not there. It seems to have vanished completely in
the current administration. People say, "But we're security professionals.
This isn't new to the Bush administration. Let us do our job and we'll
make you safe." I think that is just completely a terrible bargain
we should not have taken up.
ARQUILLA: I think a lot of these people are actually high-minded.
They're trying to do the right thing, they haven't a clue how to do it,
and the intelligence professionals of long experience, like General Hayden, come out of a world of traffic analysis, where more data is better. In that previous world, that's right, but in the world and the infosphere as it exists today, more data can actually obscure rather than clarify.
That's the biggest conceptual problem they have. I'm less inclined to
impute evil motives I think, and competence is right up there too.
KAPOR: It's very important that the intelligence community be clued
in. They are very smart people and they work very hard, but if the mindset
is rule based and bureaucratic and has the wrong kind of model of just
needing to talk to the other heads and compartmentalize everything and
everything has to flow up to the top to be vetted, then it is just going
to produce bad results; it's going to miss things and it's not going to
serve whatever good purposes it might serve.
HANSON: There are a series of questions I'd like to pursue about
how we can get some leverage on this problem of dealing with the abuse
of databases, and there are at least six questions here which deal with
different parties that might play a role. One of the questions is: What
really is preventing us from using the hackers? Is it simply distrust?
Does Mitch think hackers might be used to help us with terrorism and used
in the database world in a positive way?
KAPOR: Not casually, but potentially. Let me just say, it would
be interesting to think seriously about how you might do that. The best
hackers I know, they run their own show and don't answer to anybody. If
you have a specific purpose or project in mind, I guess the question is:
Would a commercial company or an open source project hire an incredibly
brilliant hacker to work on this before we figure out if the CIA or NSA
should? I'd say, well, it's interesting but you'd have to have the right
kind of deal and think about it and really believe that the understanding
was going to hold up. Under some circumstances, yes, but I'd proceed cautiously. I'd do some experiments. I'd try to learn something about it before I relied on it heavily.
ARQUILLA: I think the key is to approach it in a serious and deliberate manner. Again I can't say a whole lot because I don't
want those of you who are left to be detained, but the fact of the matter
is we have reached out a little bit. We need to do so far, far more.
Here's my little story. As I say, part of my job is to try to understand
the world of the master hackers. One we had took years to coax, like a shy woodland animal, into even having any kind of contact, because to these people I'm the Prince of Darkness. They're very cautious. We got to a point where it's all based on trust. It's social; it's not a technology story.
I asked him some questions based on information that I already knew the
answers to or felt that I did, based on highly classified information;
again I can't give you details. He came back to me with answers to these
questions in a very, very short period of time measured in days, not weeks,
with all the right answers, at least what I believed to be the right answers,
and then some additional information.
I handled it as a kind of controlled experiment, and that's not the only time that was done. My belief is that there's great value here. It is
something that is being considered seriously and very, very cautiously
because it goes against all the habits of mind and institutional interests
of the intelligence community. Frankly, the law enforcement community is the most firmly opposed to this because they simply see hackers as criminals.
I think they need to be redefined in a larger sense before we're going
to be able to proceed in a more systematic way.
STAR: I just want to say I've tamed a couple of hackers. I worked
at the MIT AI lab for eight years and I did it by telling them respectfully
about a world outside their own world. I worked there as an ethnographer,
and I would suggest that hiring a hacker and an ethnographer would be
the really radical thing to do.
KAPOR: I think there are things that we could do besides hiring
hackers. My office is South of Market in San Francisco, and within six
blocks there are 50 companies with young people doing amazingly interesting
things and they don't have as many socialization problems as the hackers.
Virtually none of them have ever given a thought to government service, and probably for good reason, because of the conditions they would find. Why do we live in a world where we take that as a given? Is it possible
to live in a world where the best and the brightest, not necessarily the
hackers, have more interest, where they feel it's the right thing to do
to go and help on the big problems of the day, where they would be welcome?
That's the bigger challenge in all seriousness. Somehow we've gotten to
a state where the majority of the really smart, ambitious, creative, IT
people would no more give thought to helping with some of the kinds of
issues we've been talking about, than they would go into a seminary, which
is a topic for another day. I'd like to live in a world where [working
on these issues] is seen much more as an interesting honorable alternative,
not something for a very small minority of people to do.
ARQUILLA: We do a little bit of this. The school where I teach is home to the Cyber Corps. Some of you may remember that President
Clinton put this legislation into effect, and we do have some people who
end up working full time in this area. They get their tuition paid and
a little stipend. We're trying to expand it to include part timers.
I think one of the more interesting efforts underway today is to try to
broaden our approach to the reserves. You all know the reserves are very,
very overstretched in Iraq, and now for some crazy reason they're going
to be driving trucks down on the Mexican border.
We are redefining specialties and trying to reach out to the IT community
as well. These things aren't mutually exclusive. Hackers, yes, because they're the best top guns, but there is also, as Mitch says, an entire generation of the brightest people who are very skilled in IT, and we're trying to reach some of them and appeal to their patriotism, and finding that it's an appeal that resonates with them.
HANSON: I'm going to ask three questions in one here as the last
question, and that is: How much can politicians, members of government who were all born before the Internet revolution, who perhaps are not very sophisticated in regard to computers, how much help are they going to be in resolving the privacy versus security issue, or the database abuse question? How much help is the press going to be in this struggle? Finally, what can private citizens like us do to help in this ongoing debate and ongoing need for some resolution to the privacy versus security and database usefulness versus abuse controversies?
KAPOR: When the politicians are bought and paid for by defense
contractors and other special interests, the big telecommunications firms
as they are today by and large, they're not going to be of any help. The
real question is: Is it possible? Can we imagine getting politicians that
actually are elected on the basis of representing the people's interest?
So that's the problem that we have to solve because in that case they
could be very, very effective.
ARQUILLA: I agree. I think it's time for a new progressive era.
A century ago politicians were in the pocket of large corporations not
unlike today, and the people rose up in a series of progressive reforms,
particularly here in California. I think it's time for that again. We
have the technologies for social networking today that go way beyond what
was available back then. If they could have progressivism that came from
the citizens a century ago, shame on us if we don't replicate that today.
STAR: I just want to say that, yes, if I could do it in such a way that I didn't have to leave my networks or traduce myself, I would consider it. Put it that way.
HANSON: If these average citizens in the room want to join your progressive movement and make use of the IT at our disposal to organize networks of progressives, what should they do?
ARQUILLA: Well, they're already out there in many places. I would
connect with them. People don't realize the reds in this country already
do this. One of the reasons there's still something like half the country
supporting policies that are hugely ineffective and terribly costly is
that they're highly networked. Karl Rove was a master of mass-mail marketing early in his life, and he has applied many of those skills and a lot of networking skills in this area, and it's used for all sorts of
red causes. On the blue side, Howard Dean was actually quite good at this
until the Democratic establishment I guess decided to line up and execute
his candidacy the last time around. The point is that in the political
realm they're already doing this. Organizations exist out here. Hook up
with them, start your own. Who knows how Critical Mass started in San
Francisco?... Has anybody ever watched that? I just go there to watch,
or I used to when I had the time to do it. It's magnificent, that's people
power. It shuts down the downtown completely. "Why are you driving?"
is what it says to people, and it's civil and it's powerful.
Civil society almost stopped the Iraq invasion. It delayed it for about
six months. If they decide they want to go to war with Iran (which I also
think would be a terrible, terrible mistake), I think civil society will
get up on its feet and stop a war with Iran. We have the technology today.
That would, I think be a real litmus test.
Don't feel disempowered. This is a technology that empowers everybody. Don't
just let the folks at the top of the hierarchy tell you that they're going
to gather all the information and have all the power. Progressivism has
been possible in the past; it is imperative that we pursue it today.
Traces of Ourselves was the third in a series on the politics and
ethics of information technologies, co-sponsored by the Center for Science,
Technology, and Society and the Markkula Center for Applied Ethics. Other
panels from the series are:
Games With Ethics: A Panel on Video Games
Ethics and Politics of Search Engines