Privacy and Innovation
Must we choose between preserving our privacy and encouraging innovation?
At a Business and Organizational Ethics Partnership meeting, two speakers addressed this question in a panel titled "Internet Ethics – Privacy vs. Innovation: A False Choice?"
Irina Raicu, Internet Ethics Program manager at the Ethics Center, described online privacy as an "intractable problem" and noted that privacy and innovation are often viewed as being in opposition.
"I think that's a false choice. Privacy is a necessary precursor for innovation," Raicu said.
The issue isn't confined to the online world: cameras on public buses, store mannequins with cameras in their eyes that track customer behavior, and e-readers that track what and how we read all record what used to be relatively anonymous behavior.
Some may ask, "What's the harm?" And indeed, big data has benefits. For example, Google was able to track the spread of the flu using data from user searches faster than the CDC could through reports from doctors and other medical professionals. An app that uses data about where people are currently driving and their intended destinations could help users avoid traffic – saving time, money, and even lives.
Still, it's easier than ever for companies to determine customers' demographic information online, which can harm the customer. "We still have race, gender, and age discrimination," Raicu said.
Even when no discrimination takes place, Raicu said, people may be less creative if they know their every move is being tracked – and fear of what may be done with the data could chill speech and research.
Companies that gather and analyze unprecedented amounts of information about consumer habits and demographics are starting to realize that they need to regulate the use of the data themselves, or state and federal legislators may start doing it for them.
"There are definitely very important benefits to be garnered," Raicu said. "But we can't let the benefits run rampant without looking at the drawbacks as well."
For an on-the-ground look at conflicts between privacy and innovation, Raicu turned to Travis LeBlanc.
LeBlanc is special assistant attorney general of California. In addition to advising the state attorney general on significant appellate and constitutional matters, he oversees the California Department of Justice's work on technology, high-tech crime, privacy, anti-trust, and health care issues.
LeBlanc said privacy and innovation are not mutually exclusive.
"They're both values that we have, and they work together to constrain each other," LeBlanc said. "We're seeing that the innovations of today are transforming our society, and they're fundamentally transforming the way we think about privacy."
California is in a unique position to explore this intersection, LeBlanc said, because privacy is a right enshrined in the state constitution, and the state is also at the forefront of technological innovation.
LeBlanc said the traditional way of enforcing privacy rights – starting with legislation, moving to administrative enforcement and ending with litigation if necessary – moves too slowly for the digital age. The iPhone is less than 6 years old, yet it has fundamentally transformed the way people use computing devices.
"The velocity of our innovation is outpacing the inertia of our regulatory system," LeBlanc said. "We are finding ourselves as government regulators having to fundamentally rethink how we do what we do."
In addition to bringing in specialists in privacy and technology, both to advise on new legislation and to determine how current privacy laws apply to new technology, LeBlanc said the attorney general's office is finding it useful to collaborate with industry. An agreement with the larger companies that sell the vast majority of mobile apps, for example, resulted in clearer, more user-friendly explanations of the apps' privacy policies. The hope is that this approach will become a de facto industry standard for all app developers.
LeBlanc outlined best practices that act as the key values for protecting both privacy and innovation for developers of mobile apps:
- Don't be greedy. The best practices from the attorney general's office recommend that developers limit data collection solely to information needed to make the app function appropriately.
- Allow users to exercise their individual autonomy. Allow consumers to adjust the privacy controls that mobile app developers build into their products.
- Be accountable. "A business must take responsibility for what activity is facilitated by its online presence," LeBlanc said. Outsourcing a function does not absolve the company of responsibility.
Raicu previously worked as an attorney in private practice. She is a graduate of Santa Clara University's School of Law and has a master's degree from San Jose State University.
LeBlanc previously worked for the U.S. Department of Justice, Williams & Connolly LLP, and Keker & Van Nest LLP, as well as clerking for a federal appeals court judge. He is a graduate of Yale Law School and also holds an MPA from the John F. Kennedy School of Government at Harvard University and an LL.M. in International Law from the University of Cambridge.
Margaret Steen is a freelance author.
Feb 1, 2013