A Pledge of Ethics for I School Graduates
When you hear about Volkswagen engineers cheating emissions tests, or face recognition software that can’t “see” Black people, you start to wonder who is in charge here. Or more to the point, who is to blame?
Well, I just graduated from UC Berkeley’s School of Information Master of Information Management and Systems program (MIMS for short). My colleagues and I are the kind of people who will be making decisions about this stuff in all sorts of industries.
This post is about one way we might hold ourselves accountable to an ethical standard that we agree to by means of a pledge.
As you might imagine, we spend a significant part of our coursework thinking about how people think about technology, how people use technology, and how what we design can do a better job of not destroying the world.
What is a MIMS degree, you ask? At the I School, we go into many different careers, and many of them have names that aren’t yet familiar to the average person the way “lawyer” or “biologist” are. Our job titles edge toward the obscure, the new, the bizarre. Take for example UX Designer (less cryptically known as User Experience Designer), Product Manager, or Growth Hacker. (Join me in snickering at the last one.)
As commencement speaker at graduation in May, I wanted to crystallize what my fellow graduates and I spent all this time thinking about into an actual commitment. So I delivered this speech and ended it by asking my fellow graduates to take a pledge. This pledge is inspired by similar oaths in related disciplines. You may have heard of the Hippocratic Oath for doctors. Members of the Order of the Engineer wear a ring to remind themselves of an oath they take, known as the Obligation. As I learned in Applied Behavioral Economics, it’s hard to beat wearing a physical object as a commitment device. However, making a pledge together, to each other, and in public, seemed like a good place to start.
Here’s the text of the pledge. You may recognize the model I used for its basis. It’s short but pithy.
On my honor, I will try:
To serve people and society,
to create technology for important human needs,
and to abide by the I School pledge.
I will do my best to be
responsible for what I design and code,
inquisitive and not condescending,
ever mindful of privacy and security,
to combat technological determinism,
to remember that artifacts have politics,
to beware the power of defaults,
use and abide by technical standards,
design with the tussle in mind,
and be accountable for the things I create in the world.
Hopefully, the first several lines speak for themselves. For the rest, some explanation of each line ought to help:
- We pledged to be inquisitive and not condescending. Anyone who remembers the “Nick Burns, your company’s computer guy” skits from Saturday Night Live can recall the caricature of the condescending tech geek. The School of Information student body is diverse, but we all have a level of technical expertise that may endow us — deserved or undeserved — with cultural clout in our jobs and even in our lives. Beyond that popular stereotype of the Tech Geek as a condescending jerk we put up with in exchange for tech support, anyone with technical expertise has some degree of power in choosing how they use language and translate concepts for their audience. Here, I intended for us to pledge to use curiosity as our first line of attack on a problem, rather than leading by establishing how much we know.
- Being mindful of privacy and security is part of our coursework in the MIMS program. Two of our core courses, Information Law and Policy and Distributed Computing Applications and Infrastructure, acquaint us with the complexities of ensuring privacy and security in designing and developing technical systems. These are incredibly difficult challenges, and in some cases, students enroll in the School of Information specifically to learn to effect policy change in these areas.
- Technological determinism is fancy terminology for the belief that technology follows an inevitable course, and every new development is “progress”. It’s believing that the course it takes is out of our hands. We pledged to remember that it doesn’t have a mind of its own — we have the power to make choices about it. In the words of Tarleton Gillespie in The Relevance of Algorithms: “A sociological analysis…must unpack the warm human and institutional choices that lie behind these cold mechanisms.” Many sophisticated thinkers have written brilliantly about technological determinism. See Winner, Heilbroner, Cockburn, Gillespie, and others for more.
- “Artifacts have politics” is a classic line, swiped directly from the title of Langdon Winner’s piece. We learn at the I School that the choice to introduce particular technologies may be politically or economically motivated in ways that disenfranchise certain groups. Winner describes the introduction of the tomato harvester that wiped out small California tomato farms, and the pneumatic molding machine at Chicago’s McCormick factory that dealt a blow to the skilled laborers organizing the union. The choice of technological systems is a product of contextual cultural, social, and political pressures, combined with technological developments. The same can be said of every other piece of technology you might encounter. Winner says it better.
- A default is any predetermined setting on a device or technical system that may be changed. We learn about the incredible power of defaults in Applied Behavioral Economics — most people don’t change the out-of-the-box default settings that technologies come with! So whatever settings the designers and developers choose to be default will influence the experience of most people who use their technology. FTC settlements have turned on defaults (as in FTC v. Frostwire), as we learn in Information Law and Policy.
- We pledged to design in accordance with technical standards — seriously, this is not the time to nod off. Your life would be a lot more of a sweaty struggle without widely used technical standards. They are the reason you can charge an appliance in any socket trusting that it won’t explode, or type coherently on any US computer, or make sense of information on the Internet. Standards help you and your date arrive at the same place at a pre-established time. Information Organization and Retrieval and Technology and Delegation do their part to drum this into our little skulls.
- Perhaps the most jargony part of the pledge, the “tussle” is a frequent buzzword in conversations around South Hall. Nothing beats reading the paper it comes from by Clark, Sollins, Braden, and Wroclawski. But in essence, the tussle is the “ongoing contention between parties with conflicting interests.” This plays out in the design and development of technology and information systems. There is a tussle anywhere that a design process must negotiate competing goals. Not only must we understand the competition, but we should design to preserve the tension, rather than to resolve it in favor of one party or another.
- And indeed, to seal the deal, it was fitting that we remembered that we are responsible for the things we create in the world. Technology and information systems are not neutral and never have been. The person who says it best is Mike Monteiro in his talk “How Designers Destroyed the World.” Sometimes, the more invisible a system or technology is, the more influence it can have in our lives. We pledged to remember that whatever we create may have unintended consequences, and we need to tread carefully and be prepared to take responsibility for our work, just as healthcare providers, engineers, and other professionals do.
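The power of defaults described above can be made concrete in a few lines of code. This is a toy sketch — the settings names are hypothetical and not from any real product — showing how the values designers pick as defaults become the experience of most users, since most people never change them:

```python
from dataclasses import dataclass

# Toy illustration (all setting names hypothetical): whatever the
# designers write on the right-hand side below is what most users
# will live with, because few ever open the settings screen.
@dataclass
class ShareSettings:
    # Privacy-protective defaults: users opt *in* to sharing, not out.
    share_usage_data: bool = False
    public_profile: bool = False
    search_indexing: bool = False

# The out-of-the-box experience for the majority of users:
out_of_the_box = ShareSettings()
print(out_of_the_box.share_usage_data)  # False unless the user opts in
```

Flipping any of those defaults to `True` would change the experience of the majority of users without changing a single other line — which is exactly why defaults deserve the scrutiny the pledge asks for.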
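As for technical standards, one everyday example is date and time notation. ISO 8601 is the standard that lets a timestamp mean the same thing to every conforming reader — it is, quite literally, how you and your date arrive at the same place at a pre-established time. A minimal sketch using Python’s standard library:

```python
from datetime import datetime, timezone

# ISO 8601 (a widely adopted technical standard) makes this string
# unambiguous: year, month, day, time, and UTC offset, in that order,
# regardless of the reader's locale or date conventions.
stamp = "2024-05-18T14:30:00+00:00"
when = datetime.fromisoformat(stamp)

print(when.year, when.month, when.day)   # 2024 5 18
print(when.tzinfo == timezone.utc)       # the offset pins it to UTC
```

Without the standard, "05/18" versus "18/05" is a genuine ambiguity; with it, any two systems can exchange the string and agree on the moment it names.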
The pledge is becoming a living document. I’m going to go out on a limb and suggest that my first rendition isn’t perfect. Do you notice something important missing from it? What do you think the value is of pledging to each other? Should we take our commitments further? If so, how?
We welcome your comments and discussion towards a pledge that represents the highest ethics in the information profession.