
Kicking Our Smartphone Addiction


Corporate social responsibility in tech

Ann Skeet

This article was originally published in MarketWatch on February 3, 2018.

Ann Skeet is the director of leadership ethics at the Markkula Center for Applied Ethics at Santa Clara University. Views expressed here are her own.

Most technology companies have the best of intentions about their mission and purpose.

For example, Google’s registration statement with the SEC to take the company public came with these assurances:

• Our goal is to develop services that improve the lives of as many people as possible — to do things that matter. We make our services as widely available as we can by supporting over 97 languages and by providing most services for free. Advertising is our principal source of revenue, and the ads we provide are relevant and useful rather than intrusive and annoying. We strive to provide users with great commercial information. 

• We are proud of the products we have built, and we hope that those we create in the future will have an even greater positive impact on the world.

Google also promised not to be evil and to make the world better. But who decides what is better? Who decides whether an ad is intrusive or annoying, or whether a technology is truly addictive? What can we expect of tech executives under increasing pressure to acknowledge the harms their products inflict and to correct for them? What moral obligations do these companies, and others like them, have to identify, publicize, and address the risks their products pose to consumers?

These questions are important now that social media has become indispensable for many people around the world, especially young people. Investors are now asking Apple to accept responsibility for, and take action on, smartphone addiction in children.

Last month, Facebook acknowledged that social media use can be bad for users’ mental health, leaving people who passively consume information “feeling worse.” A former Facebook vice president for user growth said, “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.” Even Facebook’s founding president, Sean Parker, has said he is now “something of a conscientious objector” to social media.

These companies and their leaders could take their cue from bioethicist Mildred Cho. Last spring, the Markkula Center for Applied Ethics at Santa Clara University, where I am the director of leadership ethics, hosted Cho to talk about biohacking — weekend warriors adjusting their own bodies to enhance human abilities.

In describing how biohackers weigh the risks they take, in this case largely to themselves, Cho implored the audience to consider their “positive obligations.” The term, useful when considering human rights and sustaining civil society, asks inventors creating things that will affect humans to consider at least these four principles:

• Promoting well-being and public benefit

• Fairness in distribution of risks and benefits

• Due care, proceeding with awareness of implications

• Public participation and input from stakeholders

The National Academy of Sciences adds its own positive obligations regarding the governance of human genome editing:

• Transparency

• Respect for persons

• Transnational cooperation: collaborative approaches while respecting different cultural contexts

• Responsible science

In this vein, the investment firm Jana Partners LLC and the institutional investor California State Teachers’ Retirement System (CalSTRS) have made specific requests of Apple toward fulfilling some of these obligations. They have asked Apple to convene a committee of experts and specialists to study these issues, and to give parents more tools for age-appropriate device setup.

Apple says it’s always looked out for kids and already provides parents with such protective tools. Facebook says it is working to improve its algorithms and to bring more human moderators back into the mix to address some of the concerns raised about its product. Google, for its part, is testing a set of trust indicators, developed by the Trust Project, hosted by the Markkula Center, for display to users and for use in its algorithms, according to Sally Lehrman, who led the effort at the Ethics Center. (Google was a funder of this project.)

Are these efforts sufficient? The fact that “impact investing” has now reached mainstream players like Jana and CalSTRS suggests that broad responsibility to society is a force that must be weighed in assessing companies’ long-term value creation. Leaders in tech-industry C-suites can say they have done enough: that they have set worthy missions and made good-faith attempts to be positive forces for change. But company directors, with their own duty-of-care obligations, should also pay attention.

Committing to these positive obligations to society is one way for the tech industry to be proactive, and ultimately to keep Big Tech from being compared to Big Tobacco.


Feb 9, 2018