Citizen Technologist

The CTSP Blog

Rough cuts on the incredibly interesting implications of Facebook’s Reactions

By Galen Panger, CTSP | Permalink

How do we express ourselves in social media, and how does that make other people feel? These are two questions at the very heart of social media research, including, of course, the ill-fated Facebook experiment. Facebook Reactions are fascinating because they are, even more explicitly than the Facebook experiment, an intervention into our emotional lives.

Let me be clear that I support Facebook’s desire to overcome the emotional stuntedness of the Like button (don’t even get me started on the emotional stuntedness of the Poke button). I support the steps the company has taken to expand the Like button’s emotional repertoire, particularly in light of the company’s obvious desire to maintain its original simplicity. But Reactions are a choice about which emotional expressions and reactions to officially reward and sanction on Facebook, and that choice is consequential. They explicitly present the company with the knotty challenge of determining the shape of Facebook’s emotional environment, and they have wide implications for the 1.04 billion of us who visit Facebook each day. Here are a few rough reactions to Facebook Reactions.

READ MORE

The need for interdisciplinary tech policy training

By Nick Doty, CTSP, with Richmond Wong, Anna Lauren Hoffman and Deirdre K. Mulligan | Permalink

Conversations about substantive tech policy issues — privacy-by-design, net neutrality, encryption policy, online consumer protection — frequently turn into questions of training and staffing. “How can we encourage privacy earlier in the design process?” becomes “How can we train and hire engineers and lawyers who understand both technical and legal aspects of privacy?” Or: “What can the Federal Trade Commission do to protect consumers from online fraud scams?” becomes “Who could we hire into an FTC bureau of technologists?” Over the past month, members of the I School community have participated in several events where these tech policy conversations have occurred:

  • Catalyzing Privacy by Design: fourth in a series of NSF-sponsored workshops, organized with the Computing Community Consortium, to develop a privacy by design research agenda
  • Workshop on Problems in the Public Interest: hosted by the Technology Science Research Collaboration Network at Harvard to generate new research questions
  • PrivacyCon: an event to bridge academic research and policymaking at the Federal Trade Commission

READ MORE

Citizen Drones: delivering burritos and changing public policy

By Charles Belle, CTSP Fellow and CEO of Startup Policy Lab | Permalink

It’s official: drones are now mainstream. The Federal Aviation Administration (FAA) estimates that consumers purchased 1 million drones — or “Unmanned Aerial Systems” (UAS), if you prefer the more cumbersome technical term — during the past holiday season alone. Fears about how government agencies might use data collected by drones, however, have led to bans on public agencies operating drones across the country. These concerns about individual privacy are important, but they are obstructing a needed discussion about the benefits drones can bring to government operations. A more constructive approach to policymaking begins by asking: how do we want government agencies to use drones?

Policymakers’ reluctance to allow public agencies to operate drones is understandable. There are legitimate concerns about how government agencies will collect, stockpile, mine, and retain the data that drones gather. To complicate matters, the FAA has clear jurisdictional primacy but has offered little direction on future regulations. Nonetheless, policymakers and citizens should keep in mind that drones are more than just a series of challenges to privacy and to “being under the thumb” of Federal agencies. Drones also offer local public agencies exciting opportunities to expand ambulatory care, deliver other government services more effectively, and support local innovation.

READ MORE

Reviewing Danielle Citron’s Hate Crimes in Cyberspace

By Richmond Wong, UC Berkeley School of Information | Permalink

Earlier this year, CTSP sponsored a reading group, in conjunction with the School of Information’s Classics Reading Group, on Danielle Citron’s Hate Crimes in Cyberspace. Citron’s book exposes the problems caused by, and related to, cyber harassment. She provides a detailed and emotionally resonant description of the harms of cyber harassment, followed by a well-structured legal argument that offers several suggestions on how to move forward. It is a timely piece that allows us to reflect on the ways the Internet can be a force multiplier for both good and ill, and on how actions online interact with the law.

Cyber harassment is disproportionately experienced by women and other often-disadvantaged groups (see Pew’s research on online harassment). Citron brings the emotional and personal toll of cyber harassment to life through three profiles of harassment victims. These victims experienced harm not only online, but in the physical world as well. Cyber harassment can destroy professional relationships and employment opportunities, particularly when a victim’s name becomes linked to harassing speech via search engines. Victims may be fired when nude photos of them are published online. The spread of victims’ personal information, such as addresses, may lead dangerous and unwanted visitors to their homes.

READ MORE

Ethical Pledges for Individuals and Collectives

By Andrew McConachie | Permalink

[Ed. note: As a follow-up to Robyn’s explanation of the proposed I School Pledge, Andrew McConachie raises some challenges regarding the effectiveness of pledges and the role of individual vs. collective action in ethical software development. We’re pleased to see this conversation continue and welcome further input; it will also be a topic of discussion at this week’s Catalyzing Privacy-by-Design workshop in Washington, DC. —npd]

I am conflicted about how effective individualized ethics are at creating ethical outcomes, and about the extent to which individuals can be held accountable for the actions of a group. The I School Pledge is for individuals to take. It asks individuals to hold themselves accountable. However, most technology and software is produced as part of a team effort, usually in large organizations. Or, in the case of most open source software, it is produced through a collaborative effort, with contributors acting both as individuals and as members of contributing organizations. The structures of these organizations and communities play a fundamental role in what kind of software gets produced (cf. Conway’s Law, which focuses on internal communications structures) and in what kinds of ethical outcomes eventuate.

READ MORE

Introducing CTSP’s Inaugural Collaborative Projects & Fellows

By Galen Panger and Nick Doty, CTSP Co-Directors | Permalink

We are extremely pleased to announce the selection of 11 collaborative projects and to welcome 28 fellows to the Center for our inaugural year. We received a total of 50 proposals from over 125 collaborators and were impressed by the diversity of ideas and teams, and the potential they hold to make a difference.

The 11 projects we selected, however, stood out for their focus and ambition, for their potential to advance the state of affairs in our four areas, and for their ability to speak to larger, not solely academic, audiences. From investigating how technology can amplify existing social biases to understanding users’ views of algorithmic decision-making, these projects will tackle important issues with smart, practical approaches.

READ MORE

Should Facebook watch out for our well-being?

By Galen Panger, CTSP | Permalink

Last year, when Facebook published the results of its emotional contagion experiment, it triggered a firestorm of criticism in the press and launched a minor cottage industry within academia around the ethical gray areas of Big Data research. What should count as ‘informed consent’ in massive experiments like Facebook’s? What are the obligations of Internet services to seek informed consent when experimentally intervening in the lives, emotions and behaviors of their users? Is there only an obligation when they want to publish in academic journals? These are not easy questions.

Perhaps more importantly, what are the obligations of these Internet services to users and their well-being more broadly?

Credit: ‘Facebook’s Infection’ by ksayer1

READ MORE

Miscalculating the risk of crypto “backdoors”

By Deirdre K. Mulligan, Associate Professor | Permalink | Cross-posted from the Christian Science Monitor

As Britain continues to debate its revised Investigatory Powers bill, which opponents deride as the “snoopers’ charter,” it seems increasingly evident that Prime Minister David Cameron is in lockstep with many US law enforcement officials when it comes to the encryption debate.

While Mr. Cameron’s government claims that the bill doesn’t mandate so-called “backdoors” into encryption on consumer devices, the bill suggests otherwise. It currently states that communication service providers must maintain the capability to remove “electronic protection” they apply to protect communications or data.

Sounds familiar, right? FBI Director James Comey recently testified that the FBI is working with the tech sector (which has publicly opposed weakening encryption standards) to find ways to decrypt communications so that investigators can more easily access them during criminal or terrorist investigations.

READ MORE

A Pledge of Ethics for I School Graduates

By Robyn Perry, I School MIMS ’15 | Permalink

When you hear about Volkswagen engineers cheating emissions tests, or face recognition software that can’t “see” Black people, you start to wonder who is in charge here. Or more to the point, who is to blame?

Well, I just graduated from UC Berkeley’s School of Information Master of Information Management and Systems program (MIMS for short). My colleagues and I are the kind of people who will be making decisions about this stuff in all sorts of industries.

This post is about one way we might hold ourselves accountable to an ethical standard: by agreeing to it through a pledge.

As you might imagine, we spend a significant part of our coursework thinking about how people think about technology, how people use technology, and how what we design can do a better job of not destroying the world.

READ MORE

Participants from 26 departments and organizations attend #CTSPhackathon

By Galen Panger, CTSP | Permalink

On Saturday, we welcomed 42 participants from 26 departments and organizations to the CTSP Proposal Hackathon, the Center’s first official event and the kick-off for our annual request for proposals, which opened today for applications.

READ MORE