Title Image

Blog

The CTSP Blog

Backstage Decisions, Front-stage Experts: Non-technical Experiences and the Political Engagement of Scientists

by Santiago Molina and Gordon Pherribo, CTSP Fellows

This is the second in a series of posts on the project “Democratizing” Technology: Expertise and Innovation in Genetic Engineering.

See the first post in the series: Backstage Decisions, Front-stage Experts: Interviewing Genome-Editing Scientists.

Since 2015, scientists, ethicists, and regulators have attempted to address the ethical, moral, and social concerns raised by genetic modifications to the human germline. Discourse around these concerns has focused on advancing a culture of responsibility and precaution within the scientific community, rather than on creating new institutional policies and laws. Confidence in scientists’ ability to self-regulate has become increasingly tenuous since the news, on November 26th, 2018, of the birth of genome-edited twins, despite the scientific consensus that such experiments are medically and ethically unwarranted. In response, journalists, social scientists, and critical researchers in the life sciences have posed the question: Who should be involved in deciding how genome-editing technologies should be used, and for what aims?

In this post, we complicate the idea that technical expertise, which is usually narrowly defined on the basis of professional experience or knowledge, should be the main criterion for having a seat at the table during scientific decision-making. Drawing from eight interviews with scientists who participated in a small meeting held in Napa Valley in 2015, we highlight the role of non-technical experiences in shaping scientists’ views of decision-making about genome editing.

We identify three experiences that have influenced scientists’ views and deliberations about the application and potential consequences and benefits of genetic engineering technologies: 1) reading and group discussions outside of their academic disciplines, 2) direct engagement with patient communities, and 3) involvement in social movements. To wrap up, we make some modest suggestions for what these might mean in the context of STEM education.

1. Reading Outside of the Discipline and Group Discussions.

During our interviews we asked scientists how they formed their viewpoints about biotechnology and its relationship to society. Respondents described their exposure to new viewpoints and reflected on the effect this exposure had on their decision-making. One source of this exposure was reading outside of their academic discipline. We were surprised to hear how the work of philosophers and sociologists of science informed the decision-making of one senior scientist at the Napa Valley meeting. This faculty member described their interest in finding opportunities to supplement their laboratory training with philosophical discussions about issues tangential to the science they were working on. With other graduate students, they created a small group that met regularly to discuss concepts and theories in the philosophy of science, ethics, and the sociology of science.

We met- I don’t remember whether it was once a month or once every two weeks to discuss issues around the philosophy and societal issues of science. So we would find books, read books, um from you know – from Bertrand Russell, the philosopher, to Jacob Bronowski to Alfred Lord Whitehead, you know books on the philosophy and the applications of science.

The scientist described how this work added additional layers to their understanding of societal issues related to science. Even though this reading group was instrumental in developing their awareness of the relationship between science and broader social, political, and cultural issues, the respondent also lamented that the opportunity to delve into topics outside of a graduate student’s normal routine “was not encouraged by any of [their] mentors.” This theme came up in several of our interviews, reinforcing the importance of mentors in shaping how scientists make meaning of their discipline in relation to society, and what educational and professional development opportunities graduate students feel comfortable pursuing outside of their formal training.

2. Direct engagement through service.

The most distinctly communicated experiences our interviewees engaged in outside of their formal training were service-related learning experiences that involved direct interaction with communities that would medically benefit from the technology. These experiences appeared to give individuals a greater sense of civic responsibility, and afforded them a more expansive understanding of the relationship between their work and broader communities. For genome-editing researchers, this crucially meant being aware of the social and medical realities of patients that might be research subjects in clinical trials for CRISPR-based therapies.

In our interviews, scientists had direct engagement with people outside of their discipline within national scientific boards, federal organizations, health clinics, and the biotech and pharmaceutical industry. These types of experiences provide an opportunity to collaborate with stakeholders on pressing issues, learn and benefit from industry and market knowledge, and ensure that the outcome of decisions are both relevant and meaningful to community stakeholders outside of the lab.

One of our respondents reflected on how they learned important skills, such as active listening, through professional experiences with indigenous patient communities–which helped this respondent better serve the community’s needs.

I’ve learned a whole lot from the patients I’ve taken care of and the people I’ve met. I certainly learned a great deal from going to the Navajo reservation. I’m – just to be able to sit down in a very different culture and listen and I think it’s very important for doctors to listen to their patients.

This interviewee was additionally committed to modeling the listening behavior of physicians and teaching these listening skills to others. When we asked further, “What do you think was specific about the way that [your mentors] spoke with patients and interacted with them?” the interviewee responded with clarity:

Sitting back and not speaking and letting them talk about what’s important to them.

The interviewee conveyed that if you listen, people will tell you what is most important to them. They further argued that as decision-makers guiding the usage of far-reaching technologies, it is important to not make assumptions about what a particular community needs.

Similarly, in another interview, a molecular biologist described their experience setting up clinical trials and discussing the risks and benefits of an experimental treatment. This experience not only gave them a more concrete sense of what was at stake in the discussions held at the Napa Meeting, but also helped sensitize them towards the lived experiences of the patient communities that may be affected (for better or worse) by genome editing technology. When asked if experiences during their doctoral program, postdoc or work at a biotech firm, had prepared them for discussing genome editing and its implications, the molecular biologist responded:

Having been involved in therapeutic programs in which you’re discussing the pluses and minuses of therapies that can have side effects can prepare you for that. […] To me that was very helpful because it was a very concrete discussion. That conversation was not a like, “oh, I’m an academic and I wanna write a paper and someone’s going to read it and then enough.” […] [In a therapeutic program] the conversation was like, “we have a molecule, are we going to put it in people?” And if the answer is “yes,” like there is a living person on the other end that is going to take that molecule and [they are] going to have to live with the consequences positive and negative. […] 

The distinction being drawn here, between scientific work with concrete outcomes for people and work with solely academic outcomes, suggests that there are practical experiences, important for understanding how the products of science may affect others, of which university researchers may have only indirect knowledge. As the interviewee further explained, the stakes of being unfamiliar with patients’ experiences are particularly high:

[My work at a biotech firm] has sort of prepared me at least a little bit for some of the discussion around therapeutic editing because different patient populations have wildly different ideas about gene editing. There are certain forms of inherited blindness where people are frankly insulted that you would call it a genetic disease, right? And I think rightly so. That’s their experience. It’s their disease. Why should we call this something that should be quote-unquote “corrected,” right?

In this case, prior experience with clinical trials alerted the researcher towards the heterogeneity of experiences of different patient populations. They further described how, in other interactions with patient advocates through public engagement, they were able to learn a great deal about the uniqueness of each patient group and their different views about genome editing. Here, the researcher additionally conveyed concern over the ableism that is often implicit in medical views of difference. They recounted how listening to perspectives from different patient communities led them to reflect on how procedurally safe genome editing can still cause harm in other ways.

3. Involvement in social movements.

The third non-technical form of expertise came from researchers’ political participation. While the recent fervor against the GOP’s “war on science” may seem to offer ample evidence that politics and science don’t mix well, the role of social movements in the creation of scientific knowledge has been extensively documented by sociologists. For example, post-World War II environmental movements changed the content, form, and meaning of ecological research (Jamison 2006), and gay rights and AIDS activists helped steer the direction of biomedical research (Epstein 1996). What is less emphasized in these studies, though, is how scientists’ participation in social movements can shape their worldview and decision-making. When asked what personal experiences shaped how they thought about decision-making around new biotechnology, one interviewee mentioned their engagement with political movements during the anti-Vietnam War protests of the late 1960s:

So I was in Berkeley in the late 60s…This is a time of a lot of social activity. Protests that went on against the Vietnam War in favor of civil rights. There was a lot of protest activity going on and I was involved in that to some extent, you know, I went on marches. I went door-to-door one summer in opposition to the Vietnam War…Um, so I had to you know- I had sort of a social equity outlook on life. All the way from my upbringing from college- and then at Berkeley you really couldn’t avoid being involved in some of these social issues.

This respondent went on to discuss how their commitments towards social equity shaped their decision-making around emerging technologies. In another interview, a respondent described how taking time off of their graduate program to work on a local election campaign motivated them to participate in science policy forums later in their career.

However, these examples also suggest that how a scientist chooses to engage with social movements can have lasting effects on how they think of themselves as part of a larger community. If scientists participate unreflexively, social movements can fail to challenge individuals to consider how the network building and activism they are doing affects themselves and may exclude others from different communities.

To give a contemporary example, the March for Science (MfS) in April 2017 protested the Trump administration’s anti-science policies and actions. While the issues around science funding were urgent, MfS organizers failed to address language within the movement that was dismissive of the experiences of marginalized communities in science. Whether or not a participant in MfS chose to critically engage with the movement will influence how they see the world and whether they intentionally or unintentionally reproduce inequities in science. By asking scientists to think about both their role in society and the community of science itself, social movements provide a wealth of knowledge and creativity that scientists can contribute to and draw on when making decisions and reflecting on the implications of emerging technologies.

The Value of Non-technical Expertise in Training

Many of the experiences that shaped our interviewees’ decision-making occurred during their early graduate and professional training. Despite the personal and professional value they found in these experiences, our interviewees noted a lack of support from their graduate mentors for exploring non-technical interests, and a lack of incentives to participate more broadly in political endeavors during their training. While this may be changing for newer generations of scientists, it raises questions about how scientists in the natural and physical sciences are socialized into the broader scientific community, and about the impact of that socialization on what they think their political responsibilities are.

For example, a consensus study by the National Academies of Sciences, Engineering, and Medicine (2018) found that there is a lack of social and institutional support for activities outside the traditional realm of an individual’s discipline, and argued for the creation of novel training pathways that could lead to holistic STEM research training. One way of creating more holistic STEM training programs noted by the study, and supported by our findings, would be to provide resources and structures that connect graduate training in the life sciences with fields such as STS, sociology, and philosophy. Exposure to these disciplines can help aspiring researchers grapple with the social dimensions of their discipline and serve as an additional tool for constructive debate around scientific issues. Promoting interdisciplinary collaboration may also help reduce the stigma associated with non-traditional pathways into scientific training and provide easier channels for integrating professional development and internship opportunities into the curriculum.

The urgency of this gap in training is apparent if you look at who is currently at the decision-making table. The committees and meetings convened to deliberate on the social and ethical issues of genome editing are almost exclusively constituted by senior scientists. These researchers are mainly conscripted into these roles because of their technical expertise and status in disciplinary networks. Historically, the academic institutions these scientists were trained in were not built to prepare them for making political decisions or for appreciating the social complexity and nuance that comes with the introduction of emergent technologies into society. In our third blog post we will explore the political stakes of this form of science governance, which are surprisingly high.


References:

Epstein, S. (1996). Impure science: AIDS, activism, and the politics of knowledge (Vol. 7). University of California Press.

Jamison, A. (2006). Social movements and science: Cultural appropriations of cognitive praxis. Science as Culture, 15(1), 45-59.

National Academies of Sciences, Engineering, and Medicine (2018) Graduate STEM Education for the 21st Century. Washington, DC: The National Academies Press. doi: https://doi.org/10.17226/25038.

Symposium: “Governing Machines – Defining and Enforcing Public Policy Values in AI Systems”

CTSP is proud to be a co-sponsor of the 23rd Annual BCLT/BTLJ Symposium: Governing Machines: Defining and Enforcing Public Policy Values in AI Systems

Algorithms that analyze data, predict outcomes, suggest solutions, and make decisions are increasingly embedded into everyday life. Machines automate content filtering, drive cars and fly planes, trade stocks, evaluate resumes, assist with medical diagnostics, and contribute to government decision-making. Given the growing role of artificial intelligence and machine learning in society, how should we define and enforce traditional legal obligations of privacy, non-discrimination, due process, liability, professional responsibility, and reasonable care?

This symposium will convene scholars and practitioners from law, policy, ethics, computer science, medicine, and social science to consider what roles we should allow machines to play and how to govern them in support of public policy goals.

Co-sponsored by: CTSP, the Center for Long-Term Cybersecurity, and the Algorithmic Fairness and Opacity Working Group (AFOG) at UC Berkeley.

Bonus!

Two 2017 CTSP fellows will be panelists:

  • Amit Elazari on “Trust but Verify – Validating and Defending Against Machine Decisions”
  • Uri Hacohen on “Machines of Manipulation”

Using Crowdsourcing to address Disparities in Police Reported Data: Addressing Challenges in Technology and Community Engagement

This is a project update from a CTSP project from 2017: Assessing Race and Income Disparities in Crowdsourced Safety Data Collection (with Kate Beck, Aditya Medury, and Jesus M. Barajas)

Project Update

This work has led to the development of Street Story, a community engagement tool that collects street safety information from the public, through UC Berkeley SafeTREC.

The tool collects qualitative and quantitative information, and then creates maps and tables that can be publicly viewed and downloaded. The Street Story program aims to collect information that can create a fuller picture of transportation safety issues, and make community-provided information publicly accessible.


The Problem

Low-income groups, people with disabilities, seniors, and racial minorities are at higher risk of being injured while walking and biking, but experts have limited information on what these groups need to reduce these disparities. Transportation agencies typically rely on statistics about transportation crashes aggregated from police reports to decide where to make safety improvements. However, police-reported data is limited in a number of ways. First, crashes involving pedestrians or cyclists are significantly under-reported to police, with reports finding that up to 60% of pedestrian and bicycle crashes go unreported. Second, some demographic groups, including low-income groups, people of color, and undocumented immigrants, have histories of contentious relationships with police, and may therefore be less likely to report crashes when they do occur. Third, crash data doesn’t include locations where near-misses have happened, or locations where individuals feel unsafe but no incident has yet occurred. In other words, the data allow professionals to react to safety issues, but don’t necessarily allow them to be proactive.

One solution for improving and augmenting the data agencies use to make decisions and allocate resources is to provide a way for people to report transportation safety issues themselves. Some public agencies and private firms are developing apps and websites where people can report issues for this purpose. But one concern is that the people most likely to use these crowdsourcing platforms are those who have access to smartphones or the internet and who trust that government agencies will use the data to make changes, biasing the data toward the needs of these privileged groups.

Our Initial Research Plan

We chose to examine whether crowdsourced traffic safety data reflected similar patterns of underreporting and potential bias as police-reported safety data. To do this, we created an online mapping tool that people could use to report traffic crashes, near-misses, and general safety issues. We planned to work with a city to release this tool to, and collect data from, the general public, and then work directly with a historically marginalized community, under-represented in police-reported data, to target data collection in a high-need neighborhood. We planned to reduce barriers to entry for this community by meeting participants in person to explain the tool, providing in-person and online training, providing participants with cell phones, and compensating their data plans for the month. By crowdsourcing data from the general public and from this specific community, we planned to analyze whether there were any differences in the types of information reported by different demographics.

This plan seemed to align well with the research question and with community engagement best practices. However, we ran into a number of challenges. Although many municipal agencies and community organizations found our work interesting and were addressing similar transportation safety issues, many seemed daunted by the prospect of using technology to address underlying issues of under-reporting. We also found that a year was not enough time to build trusting relationships with the organizations and agencies we had hoped to work with. Nevertheless, we were able to release a web-based mapping tool to collect some crowdsourced safety data from the public.

Changing our Research Plan

To better understand how more well-integrated digital crowdsourcing platforms perform, we pivoted our research project to explore how different neighborhoods engage with government platforms to report non-emergency service needs. We assumed some of these non-emergency service reports would mirror the negative perceptions of bicycle and pedestrian safety we had been interested in collecting via our crowdsourcing platform. The City of Oakland relies on SeeClickFix, a smartphone app, to allow residents to request service for several types of issues: infrastructure issues, such as potholes, damaged sidewalks, or malfunctioning traffic signals; and non-infrastructure issues, such as illegal dumping or graffiti. The city also provides phone, web, and email-based platforms for reporting the same types of service requests; these alternative platforms are collectively known as 311 services. We looked at 45,744 SeeClickFix reports and 35,271 311 reports made between January 2013 and May 2016, and classified Oakland neighborhoods by their status as communities of concern. In the city of Oakland, 69 neighborhoods meet the definition of a community of concern, while 43 do not. Because we did not have data on the characteristics of each person reporting a service request, we assumed that people reporting requests lived in the neighborhood where the service was needed.
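The kind of comparison described above can be sketched in a few lines of pandas. Note that the column names, categories, and the community-of-concern lookup below are illustrative assumptions for the sketch, not the project’s actual data schema:

```python
import pandas as pd

# Hypothetical report-level data: one row per service request.
# Column names and values are illustrative, not the actual export schema.
reports = pd.DataFrame({
    "platform": ["seeclickfix", "311", "seeclickfix", "311", "seeclickfix"],
    "neighborhood": ["A", "A", "B", "C", "C"],
    "category": ["illegal_dumping", "pothole", "graffiti", "pothole", "graffiti"],
})

# Hypothetical classification: True marks a community of concern (COC).
coc_status = {"A": True, "B": False, "C": True}
reports["is_coc"] = reports["neighborhood"].map(coc_status)

# Share of each platform's reports that come from COCs.
share_by_platform = reports.groupby("platform")["is_coc"].mean()
print(share_by_platform)
```

With real data, the same groupby could be split further by issue category to surface the infrastructure vs. non-infrastructure difference the analysis found.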

How did communities of concern interact with the SeeClickFix and 311 platforms to report service needs? Our analysis highlighted two main takeaways. First, communities of concern were more engaged in reporting than other communities, but their reporting dynamics differed by the type of issue reported. About 70 percent of service issues came from communities of concern, even though they represent only about 60 percent of the communities in Oakland. They were nearly twice as likely to use SeeClickFix as to report via the 311 platforms, but only for non-infrastructure issues. Second, even though communities of concern were more engaged, the level of engagement was not equal for everyone within them. For example, neighborhoods with higher proportions of limited-English-proficient households were less likely to report any type of incident via 311 or SeeClickFix.

Preliminary Findings from Crowdsourcing Transportation Safety Data

We deployed the online tool in August 2017. The crowdsourcing platform was aimed at collecting transportation safety-related concerns pertaining to pedestrian and bicycle crashes, near misses, perceptions of safety, and incidents of crime while walking and bicycling in the Bay Area. We disseminated the link to the crowdsourcing platform primarily through Twitter and some email lists. Examples of organizations that were contacted through Twitter-based outreach and subsequently interacted with the tweet (through likes and retweets) include Transform Oakland, Silicon Valley Bike Coalition, Walk Bike Livermore, California Walks, Streetsblog CA, and Oakland Built. By December 2017, we had received 290 responses from 105 respondents. Half of the responses corresponded to perceptions of traffic safety concerns (“I feel unsafe walking/cycling here”), while 34% corresponded to near misses (“I almost got into a crash but avoided it”). In comparison, 12% of responses reported an actual pedestrian or bicycle crash, and 4% reported a crime while walking or bicycling. The sample size is too small to report any statistical differences.

Figure 1 shows the spatial patterns of the responses in the Bay Area aggregated to census tracts. Most of the responses were concentrated in Oakland and Berkeley. Oakland was specifically targeted as part of the outreach efforts since it has significant income and racial/ethnic diversity.

Figure 1 Spatial Distribution of the Crowdsourcing Survey Responses



In order to assess the disparities in the crowdsourced data collection, we compared responses between census tracts that are classified as communities of concern or not. A community of concern (COC), as defined by the Metropolitan Transportation Commission, a regional planning agency, is a census tract that ranks highly on several markers of marginalization, including proportion of racial minorities, low-income households, limited-English speakers, and households without vehicles, among others.

Table 1 shows the comparison between census tracts that received at least one crowdsourcing survey response. The average numbers of responses received in COCs and non-COCs across the entire Bay Area were similar and statistically indistinguishable. However, when focusing on Oakland-based tracts, the average number of crowdsourced responses in non-COCs was statistically higher. To compare the trends in self-reported pedestrian/cyclist concerns with police-reported crashes, we also examined police-reported pedestrian and bicycle crashes from 2013-2016: more such crashes were observed on average in COCs, both across the Bay Area and in Oakland. The difference between the trends in crowdsourced concerns and police-reported crashes suggests either that walking/cycling concerns are greater in non-COCs (and thus underrepresented in police crash data), or that participation from COCs is relatively underrepresented.
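A tract-level comparison like the one summarized in Table 1 can be sketched as a two-sample test on per-tract response counts. The counts below are made-up illustrative values, not the study’s data, and the study’s actual test statistic is not specified in the post; this sketch simply uses Welch’s t-test as one reasonable choice:

```python
from scipy import stats

# Hypothetical per-tract response counts (illustrative values only).
coc_responses = [1, 2, 1, 3, 2]        # tracts classified as COCs
non_coc_responses = [4, 5, 3, 6, 4]    # tracts not classified as COCs

# Welch's two-sample t-test: do mean response counts differ between groups?
# equal_var=False avoids assuming the two groups have equal variance.
t_stat, p_value = stats.ttest_ind(coc_responses, non_coc_responses,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A statistically significant result here would mirror the Oakland finding (higher average response counts in non-COC tracts), while a large p-value would mirror the Bay Area-wide result of statistically indistinguishable averages.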

Table 1 Comparison of crowdsourced concerns and police-reported pedestrian/bicycle crashes in census tracts that received at least 1 response


Table 2 compares the self-reported income and race/ethnicity characteristics of the respondents with the locations where the responses were reported. For reference purposes, Bay Area’s median household income in 2015 was estimated to be $85,000 (Source: http://www.vitalsigns.mtc.ca.gov/income), and Bay Area’s population was estimated to be 58% White, per the 2010 Census, (Source: http://www.bayareacensus.ca.gov/bayarea.htm).

Table 2 Distribution of all Bay Area responses based on the location of response and the self-reported income and race/ethnicity of respondents

The results reveal that White, medium-to-high-income respondents reported more walking/cycling-related safety issues in our survey, and more so in non-COCs. This trend is consistent with the definition of COCs, which tend to have a higher representation of low-income people and people of color. However, if digital crowdsourcing without widespread community outreach is more likely to attract responses from medium-to-high-income groups, and, more importantly, if those groups live, work, or play in only a small portion of the region being investigated, the aggregated results will paint a biased picture of a region’s transportation safety concerns. Thus, while the scalability of digital crowdsourcing provides an opportunity to capture underrepresented transportation concerns, it may require greater collaboration with low-income, diverse neighborhoods to ensure uniform adoption of the platform.

Lessons Learned

From our attempts to work directly with community groups and agencies and our subsequent decision to change our research focus, we learned a number of lessons:

  1. Develop a research plan in partnership with communities and agencies. This would have ensured that we began with a research plan that community groups and agencies could more readily partner with us on, and that our partners were on board with both the topic of interest and the methods we hoped to use.
  2. Recognize the time it takes to build relationships. We found that building relationships with agencies and communities was more time-intensive and took longer than we had anticipated. These groups often have limits on the time they can dedicate to unfunded projects. Next time, we will account for this in our initial research plan.
  3. Use existing data sources to supplement research. We found that using SeeClickFix and 311 data was a way to collect and analyze information that added context to our research question. Although the data did not include all the demographic information we had hoped to analyze, this source added valuable context to the data we collected.
  4. Speak in a language that the general public understands. We found that when we used the term self-reporting, rather than crowdsourcing, when talking with potential partners and members of the public, they were more willing to consider the use of technology to collect safety information from the public as legitimate. Using vocabulary and phrasing that people are familiar with is crucial when attempting to use technology for the social good.

CTSP Alumni Updates

We’re thrilled to highlight some recent updates from our fellows:

Gracen Brilmyer, now a PhD student at UCLA, has published a single-authored article in Archival Science, one of the leading journals in archival studies: “Archival Assemblages: Applying Disability Studies’ Political/Relational Model to Archival Description.” They have also presented their work on archives, disability, and justice at a number of events over the past two years, including the Archival Education and Research Initiative (AERI), the Allied Media Conference, the International Communication Association (ICA) Preconference, and Disability as Spectacle, and their research will be presented at the upcoming Community Informatics Research Network (CIRN) conference.

CTSP Funded Project 2016: Vision Archive


Originating in the 2017 project “Assessing Race and Income Disparities in Crowdsourced Safety Data Collection,” by Fellows Kate Beck, Aditya Medury, and Jesus Barajas, the Safe Transportation Research and Education Center (SafeTREC) will launch a new project, Street Story, in October 2018. Street Story is an online platform that allows community groups and agencies to collect community input about transportation collisions, near-misses, general hazards, and safe locations to travel. The platform will be available throughout California and is funded through the California Office of Traffic Safety.

CTSP Funded Project 2017: Assessing Race and Income Disparities in Crowdsourced Safety Data Collection


Fellow Roel Dobbe has begun a postdoctoral scholar position at the new AI Now Institute. Inspired by his 2018 CTSP project, he has co-authored a position paper with Sarah Dean, Tom Gilbert and Nitin Kohli titled A Broader View on Bias in Automated Decision-Making: Reflecting on Epistemology and Dynamics.

CTSP Funded Project 2018: Unpacking the Black Box of Machine Learning Processes


We are also looking forward to a CTSP-Fellow-filled Computer Supported Cooperative Work (CSCW) conference in November this year! CTSP-affiliated papers include:

We also look forward to seeing CTSP affiliates presenting other work, including 2018 Fellows Richmond Wong, Noura Howell, Sarah Fox, and more!

 

October 25th: Digital Security Crash Course

Thursday, October 25, 5-7pm, followed by reception

UC Berkeley, South Hall Room 210

Open to the public!

RSVP is required.

Understanding how to protect your personal digital security is more important than ever. Confused about two-factor authentication options? Which messaging app is the most secure? What happens if you forget your password manager password, or lose the phone you use for two-factor authentication? How do you keep your private material from being shared or stolen? And how do you help your friends and family consider the potential dangers and work to prevent harm, especially given increased threats to vulnerable communities and unprecedented data breaches?

Whether you are concerned about snooping family and friends, bullies and exes who are out to hack and harass you, thieves who want to impersonate you and steal your funds, or government and corporate spying, we can help you with this fun, straightforward training in how to protect your information and communications.

Join us for a couple of hours of discussion and hands-on setup. We’ll go over various scenarios you might want to protect against, talk about good tools and best practices, and explore trade-offs between usability and security. This training is designed for people at all levels of expertise, and for those who want both personal and professional digital security protection.

Refreshments and hardware keys provided! Bring your laptop or other digital device. Take home a hardware key and better digital security practices.

This crash course is sponsored by the Center for Technology, Society & Policy and generously funded by the Charles Koch Foundation. Jessy Irwin will be our facilitator and guide. Jessy is Head of Security at Tendermint, where she excels at translating complex cybersecurity problems into relatable terms, and is responsible for developing, maintaining and delivering comprehensive security strategy that supports and enables the needs of her organization and its people. Prior to her role at Tendermint, she worked to solve security obstacles for non-expert users as a strategic advisor, security executive and former Security Empress at 1Password. She regularly writes and presents about human-centric security, and believes that people should not have to become experts in technology, security or privacy to be safe online.

RSVP here!

Backstage Decisions, Front-stage Experts: Interviewing Genome-Editing Scientists

by Santiago Molina and Gordon PherriboCTSP Fellows

This is the first in a series of posts on the project “Democratizing” Technology: Expertise and Innovation in Genetic Engineering.

See the second post in the series: Backstage Decisions, Front-stage Experts: Non-technical Experiences and the Political Engagement of Scientists

When we think about who is making decisions that will impact the future health and wellbeing of society, we would hope that these individuals wield their expertise in a way that addresses the social and economic issues affecting our communities. Scientists often fill this role: for example, an ecologist advising a state environmental committee on river water redistribution [1], a geologist consulting for an architectural team building a skyscraper [2], an oncologist discussing the best treatment options based on a patient’s diagnosis and values [3], or an economist brought in by a city government to help develop a strategy for allocating grants to elementary schools. Part of the general contract between technical experts and their democracies is that they inform relevant actors so that decisions are made with the strongest possible factual basis.

The three examples above describe scientists going outside the boundaries of their disciplines to present to people outside of the scientific community, “on stage” [4]. But what about decisions made by scientists behind the scenes about new technologies that could affect more than daily laboratory life? In the 1970s, genetic engineers used their technical expertise to make a call about an exciting new technology, recombinant DNA (rDNA). This technology allowed scientists to mix and add DNA from different organisms; it later gave rise to engineered bacteria that could produce insulin, and eventually to transgenic crops. The expert decision-making process and outcome, in this case, had little to do with the possibility of commercializing biotechnology or the economic impacts of GMO seed monopolies: it preceded the patenting of whole biological organisms [5] and the use of rDNA in plants in 1982. Instead, the emerging issues surrounding rDNA were treated as a technical issue of containment. Researchers wanted to ensure that anything tinkered with genetically stayed not just inside the lab, but inside specially marked and isolated rooms within the lab, eventually giving rise to the well-established institution of biosafety. A technical fix, for a technical issue.

Today, scientists are similarly engaged in a process of expert decision making around another exciting new technology, the CRISPR-Cas9 system. This technology allows scientists to make highly specific changes, “edits,” to the DNA of virtually any organism. Following the original publication showing that CRISPR-Cas9 could be used to modify DNA in a “programmable” way, scientists have developed the system into a laboratory toolbox, and laboratories across the life sciences are using it to tinker away at bacteria, butterflies, corn, frogs, fruit flies, human liver cells, nematodes, and many other organisms. Perhaps because most people do not have strong feelings about nematodes, most of the attention this technology has received, in both popular news coverage and expert circles, has concerned whether modifications that could affect human offspring (i.e., germline editing) are moral.

We have been interviewing faculty members directly engaged in these critical conversations about the potential benefits and risks of new genome editing technologies. As we continue to analyze these interviews, we want to better understand the nature of these backstage conversations and learn how the experiences and professional development activities of these experts influenced their decision-making. In subsequent posts we’ll share some of our findings from these interviews, which so far have highlighted the wide range of technical experiences and skills among the individuals engaged in these discussions, the strength of personal social connections and reputation in securing a seat at the table, and the dynamic nature of expert decision making.

[1]  Scoville, C. (2017). “We Need Social Scientists!” The Allure and Assumptions of Economistic Optimization in Applied Environmental Science. Science as Culture, 26(4), 468-480.

[2] Wildermuth and Dineen (2017) “How ready will Bay Area be for next Quake?” SF Chronicle. Available online at: https://www.sfchronicle.com/news/article/How-ready-will-Bay-Area-be-for-next-big-quake-12216401.php

[3] Sprangers, M. A., & Aaronson, N. K. (1992). The role of health care providers and significant others in evaluating the quality of life of patients with chronic disease: a review. Journal of clinical epidemiology, 45(7), 743-760.

[4] Hilgartner, S. (2000). Science on stage: Expert advice as public drama. Stanford University Press.

[5] Diamond v. Chakrabarty (1980) upheld the first patent on a whole living organism (a bacterium that could digest crude oil).

Standing up for truth in the age of disinformation

Professor Deirdre K. Mulligan and PhD student (and CTSP Co-Director) Daniel Griffin have an op-ed in The Guardian considering how Google might consider its human rights obligations in the face of state censorship demands: If Google goes to China, will it tell the truth about Tiananmen Square?

The op-ed advances a line of argument developed in a recent article of theirs in the Georgetown Law Technology Review: “Rescripting Search to Respect the Right to Truth”

Social Impact Un-Pitch Day 2018

On Thursday, October 4th at 5:30pm the Center for Technology, Society & Policy (CTSP) and the School of Information’s Information Management Student Association (IMSA) are co-hosting their third annual Social Impact Un-Pitch Day!

Join CTSP and IMSA to brainstorm ideas for projects that address the challenges of technology, society, and policy. We welcome students, community organizations, local municipal partners, faculty, and campus initiatives to discuss discrete problems that project teams can take on over the course of this academic year. Teams will be encouraged to apply to CTSP to fund their projects.

Location: Room 202, in South Hall.

RSVP here!

Agenda

  • 5:40 Introductions from IMSA and CTSP
  • 5:45 Example Projects
  • 5:50 Sharing Un-Pitches

We’ve increased the time for Un-Pitches! (Still 3-minutes per Un-Pitch)

  • 6:40 Mixer (with snacks and refreshments)

 

Un-Pitches

Un-Pitches are meant to be informal and brief introductions of yourself, your idea, or your organization’s problem situation. Un-Pitches can include designing technology, research, policy recommendations, and more. Students and social impact representatives will each have 3 minutes to present their Un-Pitch. To un-pitch, please send 1-3 slides as a PDF and/or a description of fewer than 500 words to ctsp@berkeley.edu. You can share slides and/or a description of your ideas even if you aren’t able to attend. Deadline to share materials: midnight, October 1st, 2018.

Funding Opportunities

The next application round for fellows will open in November. CTSP’s fellowship program will provide small grants to individuals and small teams of fellows for 2019. CTSP also has a recurring offer of small project support.

Prior Projects & Collaborations

Here are several examples of projects that members of the I School community have pursued as MIMS final projects or CTSP Fellow projects (see more projects from 2016, 2017, and 2018).

 

Skills & Interests of Students

The above projects demonstrate the range of interests and skills in the I School community. Students here, and more broadly on the UC Berkeley campus, are interested and skilled in all aspects of where information and technology meet people, from design and data science to user research and information policy.

RSVP here!

August 30th, 5:30pm: Habeas Data Panel Discussion

Location: South Hall Rm 202

Time: 5:30-7pm (followed by light refreshments)

CTSP’s first event of the semester!

Co-Sponsored with the Center for Long-Term Cybersecurity

Please join us for a panel discussion featuring award-winning tech reporter Cyrus Farivar, whose new book, Habeas Data, explores how the explosive growth of surveillance technology has outpaced our understanding of the ethics, mores, and laws of privacy. Habeas Data explores ten historic court decisions that defined our privacy rights and matches them against the capabilities of modern technology. Mitch Kapor, co-founder, Electronic Frontier Foundation, said the book was “Essential reading for anyone concerned with how technology has overrun privacy.”

The panel will be moderated by 2017 and 2018 CTSP Fellow Steve Trush, a MIMS 2018 graduate and now a Research Fellow at the Center for Long-Term Cybersecurity (CLTC). He was on a CTSP project starting in 2017 that provided a report to the Oakland Privacy Advisory Commission—read an East Bay Express write-up on their work here.

The panelists will discuss what public governance models can help local governments protect the privacy of citizens—and what role citizen technologists can play in shaping these models. The discussion will showcase the ongoing collaboration between the UC Berkeley School of Information and the Oakland Privacy Advisory Commission (OPAC). Attendees will learn how they can get involved in addressing issues of governance, privacy, fairness, and justice related to state surveillance.

Panel:

  • Cyrus Farivar, Author, Habeas Data: Privacy vs. the Rise of Surveillance Tech
  • Deirdre Mulligan, Associate Professor in the School of Information at UC Berkeley, Faculty Director, UC Berkeley Center for Law & Technology
  • Catherine Crump, Assistant Clinical Professor of Law, UC Berkeley; Director, Samuelson Law, Technology & Public Policy Clinic.
  • Camille Ochoa, Coordinator, Grassroots Advocacy; Electronic Frontier Foundation
  • Moderated by Steve Trush, Research Fellow, UC Berkeley Center for Long-Term Cybersecurity

The panel will be followed by a reception with light refreshments. The building is wheelchair accessible: wheelchair users can enter through the ground floor level and take the elevator to the second floor.

This event will not be taped or live-streamed.

RSVP here to attend.

 

Panelist Bios:

Cyrus [“suh-ROOS”] Farivar is a Senior Tech Policy Reporter at Ars Technica, and is also an author and radio producer. His second book, Habeas Data, about the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America, is out now from Melville House. His first book, The Internet of Elsewhere—about the history and effects of the Internet on different countries around the world, including Senegal, Iran, Estonia and South Korea—was published in April 2011. He previously was the Sci-Tech Editor, and host of “Spectrum” at Deutsche Welle English, Germany’s international broadcaster. He has also reported for the Canadian Broadcasting Corporation, National Public Radio, Public Radio International, The Economist, Wired, The New York Times and many others. His PGP key and other secure channels are available here.

Deirdre K. Mulligan is an Associate Professor in the School of Information at UC Berkeley, a faculty Director of the Berkeley Center for Law & Technology, and an affiliated faculty member of the Center for Long-Term Cybersecurity. Mulligan’s research explores legal and technical means of protecting values such as privacy, freedom of expression, and fairness in emerging technical systems. Her book, Privacy on the Ground: Driving Corporate Behavior in the United States and Europe, a study of privacy practices in large corporations in five countries, conducted with UC Berkeley Law Prof. Kenneth Bamberger, was recently published by MIT Press. Mulligan and Bamberger received the 2016 International Association of Privacy Professionals Leadership Award for their research contributions to the field of privacy protection.

Catherine Crump: Catherine Crump is an Assistant Clinical Professor of Law and Director of the Samuelson Law, Technology & Public Policy Clinic. An experienced litigator specializing in constitutional matters, she has represented a broad range of clients seeking to vindicate their First and Fourth Amendment rights. She also has extensive experience litigating to compel the disclosure of government records under the Freedom of Information Act. Professor Crump’s primary interest is the impact of new technologies on civil liberties. Representative matters include serving as counsel in the ACLU’s challenge to the National Security Agency’s mass collection of Americans’ call records; representing artists, media outlets, and others challenging a federal internet censorship law; and representing a variety of clients seeking to invalidate the government’s policy of conducting suspicionless searches of laptops and other electronic devices at the international border.

Prior to coming to Berkeley, Professor Crump served as a staff attorney at the ACLU for nearly nine years. Before that, she was a law clerk for Judge M. Margaret McKeown at the United States Court of Appeals for the Ninth Circuit.

Camille Ochoa: Camille promotes the Electronic Frontier Foundation’s grassroots advocacy initiative (the Electronic Frontier Alliance) and coordinates outreach to student groups, community groups, and hacker spaces throughout the country. She has very strong opinions about food deserts, the school-to-prison pipeline, educational apartheid in America, the takeover of our food system by chemical companies, the general takeover of everything in American life by large conglomerates, and the right to not be spied on by governments or corporations.

Data for Good Competition — Showcase and Judging

The four teams in CTSP’s Facebook-sponsored Data for Good Competition will be presenting today in CITRIS and CTSP’s Tech & Data for Good Showcase Day. The event will be streamed through Facebook Live on the CTSP Facebook page. After deliberations from the judges, the top team will receive $5000 and the runner-up will receive $2000.

Data for Good Judges:

Joy Bonaguro, Chief Data Officer, City and County of San Francisco

Joy Bonaguro is the first Chief Data Officer for the City and County of San Francisco, where she manages the City’s open data program. Joy has spent more than a decade working at the nexus of public policy, data, and technology. Joy earned her Master’s from UC Berkeley’s Goldman School of Public Policy, where she focused on IT policy.

Lisa García Bedolla, Professor, UC Berkeley Graduate School of Education and Director of UC Berkeley’s Institute of Governmental Studies

Professor Lisa García Bedolla is a Professor in the Graduate School of Education and Director of the Institute of Governmental Studies. Professor García Bedolla uses the tools of social science to reveal the causes of political and economic inequalities in the United States. Her current projects include the development of a multi-dimensional data system, called Data for Social Good, that can be used to track and improve organizing efforts on the ground to empower low-income communities of color. Professor García Bedolla earned her PhD in political science from Yale University and her BA in Latin American Studies and Comparative Literature from UC Berkeley.

Chaya Nayak, Research Manager, Public Policy, Data for Good at Facebook

Chaya Nayak is a Public Policy Research Manager at Facebook, where she leads Facebook’s Data for Good Initiative on how to use data to generate positive social impact and address policy issues. Chaya received a Master of Public Policy from the Goldman School of Public Policy at UC Berkeley, where she focused on the intersection of public policy, technology, and using data for social impact.

Michael Valle, Manager, Technology Policy and Planning for California’s Office of Statewide Health Planning and Development

Michael D. Valle is Manager of Technology Policy and Planning at the California Office of Statewide Health Planning and Development, where he oversees the digital product portfolio. Michael has worked since 2009 in various roles within the California Health and Human Services Agency. In 2014 he helped launch the first statewide health open data portal in California. Michael also serves as Adjunct Professor of Political Science at American River College.

Judging:

As detailed in the call for proposals, the teams will be judged on the quality of their application of data science skills, how well the proposal or project addresses a social good problem, and how it advances the use of public open data, all while demonstrating how potential pitfalls are mitigated.