Ethical Pledges for Individuals and Collectives

By Andrew McConachie

[Ed. note: As a follow-up to Robyn’s explanation of the proposed I School Pledge, Andrew McConachie provides some challenges regarding the effectiveness of pledges, and individual vs. collective action for ethical behavior in software development. We’re pleased to see this conversation continue and welcome further input; it will also be a topic of discussion in this week’s Catalyzing Privacy-by-Design workshop in Washington, DC. —npd]

I am conflicted about how effective individualized ethics are at creating ethical outcomes, and about the extent to which individuals can be held accountable for the actions of a group. The I School Pledge is for individuals to take. It asks individuals to hold themselves accountable. However, most technology and software is produced by teams, usually within large organizations. Or, in the case of most open source software, it is produced through a collaborative effort, with contributors acting both as individuals and as members of contributing organizations. The structures of these organizations and communities play a fundamental role in what kind of software gets produced (cf. Conway’s Law, which focuses on internal communications structures), and in what kinds of ethical outcomes eventuate.

The I School Pledge can still be relevant to software development. To some extent we all exhibit free will, and our actions impact the world around us. On the other hand, our actions, especially the available options for actions we can perform, are limited by the environment we operate in. The subject of shared vs. individualized responsibility has kept moral philosophers busy since at least the invention of writing, and it’s not going to be resolved in this post. My intention instead is to tease out some of the specific nuances of this subject as it relates to software development and the role of the I School Pledge therein.

The line of the pledge which calls for technologists to be “inquisitive and not condescending” is especially relevant to the individual creator. This is something that we as contributors could probably all do better at. It’s a behavior that someone who cares about ethical outcomes can express by themselves, and that I believe can engender better outcomes.

However, if we’re asking technologists to be accountable for the artifacts they create, we’re not asking for a personal conduct change. We’re asking them to not create artifacts that they deem unethical. We’re asking them to say “No!” sometimes, based not on original intention, but on eventual ethical outcomes that may not even reveal themselves in a design phase. Putting aside the inability of developers to predict future uses of what they design, we’re still asking a lot of software developers here.

In labor terms, if their contribution to a software artifact contributes to their livelihood, we’re asking them to put that livelihood at risk. We’re asking them to engage in a work stoppage, or a strike. And one thing we know about strikes from history is that they are rarely effective if carried out by individuals.

History gives us few examples of effective individualized work stoppages. Almost all effective work stoppages of the past have been effective because workers got together, organized in some fashion, and collectively stopped working. Without organization and collective action, an individual worker is too easily replaced with another that is more compliant.

Mike Monteiro’s example of designing Facebook’s privacy settings (How Designers Destroyed the World) is a good illustration of a call for ineffective individual action. Facebook intentionally obfuscates its privacy settings; obfuscation is part of its business model. It is the designer’s job to make them incomprehensible. A designer who refuses isn’t doing their job and will be replaced. For any refusal to be effective, a significant percentage of designers would have to say “No!” together. Another example is Volkswagen’s scheme to defeat EPA emissions standards. This was a coordinated effort by numerous people, and stopping it would have required a significant percentage of the capable programmers collectively refusing to build it.

Naming and shaming individual contributors responsible for immoral software artifacts produced by a group is a waste of time. It also distracts us from reasonable conversations about how Facebook’s business model aligns with user privacy, or how Volkswagen’s business model aligns with environmental destruction.

The requirement of collective action is why I believe it is important to socialize discussions of ethics and technology. Programmers need to become more comfortable with these discussions, so that when a programmer is asked to create something they deem unethical, they’re comfortable speaking out about it, to colleagues, to their boss, to everyone. We can’t expect every person who sees something unethical to become a whistleblower. However, by socializing this discussion we make it easier for individuals to converse with their colleagues about the ethical implications of their work. From this community perspective, Mike Monteiro’s talk is an exercise in generating the discourse necessary to address ethical issues in interface design.

This post was meant to problematize the proposed I School Pledge along the axis of individualized vs. collective responsibility. It offers no solutions, and I do not believe any single comprehensive solution exists. It’s not clear to me where an individualized ethical pledge makes sense, and where we should instead look to collective action for change. Creating software is work, whether we do it for fun or for money. Thus, the question of when an individual is responsible for their work product, and of how best to maximize the creation of ethical work products, should draw on both the vast philosophical literature on the matter and the history of the labor movement’s influence on the values embedded in the artifacts workers produce.
