
Projects 2017

Collaborative Projects from Fellows @ CTSP

Below are the collaborative projects for the 2016-2017 cycle. Learn more about proposing a collaborative project and becoming a fellow at CTSP.

The Bioethics of Elective Oocyte Cryopreservation

Fellow: Allyn Benintendi

Blog: Bodily Integrity in the Age of Dislocated Human Eggs

Until 2012, oocyte cryopreservation was an unregulated scientific and technological endeavor. After the American Society for Reproductive Medicine lifted the procedure’s experimental label in 2012, an industry grew up around ‘fertility preservation’: commercial egg storage for those willing to pay to postpone pregnancy. The global implications of this industry are far reaching. Human eggs have become a currency for women who sell their eggs for research, for those who sell to or are trafficked for infertile reproductive tourists, and now for the eugenic underbelly of companies selling ‘fertility preservation’ across the world. What factors motivate women to donate, sell, or store their eggs, and what makes them vulnerable to an industry working to attract them? Who are the industry agents managing global businesses of medical migration and medical transplant tourism? This research explores the policy implications of the unregulated human egg economy and of using technological interventions in women’s bodies to mitigate social problems.

Actuarial Justice in the Twenty First Century

Fellows: Johann Koehler, Gil Rothschild

Court actors increasingly rely on statistical predictions about an accused’s future behavior to inform judgments, as part of an administrative style called “actuarial justice.” Actuarial instruments now inform every decision point, such that actuarial justice and the logic of data science have become an organizing principle for the administration of penality as a whole. Technological shifts in the way we collect and analyze data give rise to important questions about the kinds of knowledge that legitimate penal practice, with far-reaching implications for individuals, communities, and society at large. Actuarial justice, it has been argued, tends to view individuals as little more than representations of a statistical population. And once populations become the direct object of state action, the very meaning of individual rights becomes potentially destabilized and open to contestation. This project will ask how actuarial justice developed, inquire into its scope, and explore its promises and pitfalls.
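
To make concrete what court actors see from such an instrument, here is a deliberately toy, purely illustrative sketch of a points-based risk scale in Python; the factors, weights, and cutoffs are invented for the example and do not reproduce any real instrument.

```python
# Hypothetical illustration only: a toy points-based risk instrument in the
# style of actuarial tools. Factor names, weights, and cutoffs are invented
# for this sketch and do not correspond to any real tool.

def risk_score(prior_convictions: int, age_at_first_arrest: int,
               employed: bool) -> int:
    """Sum points for each factor, as simple actuarial scales often do."""
    score = 0
    score += min(prior_convictions, 5)            # cap the contribution of priors
    score += 2 if age_at_first_arrest < 18 else 0
    score += 0 if employed else 1
    return score

def risk_band(score: int) -> str:
    """Map the numeric score to the categorical label a decision maker sees."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "medium"
    return "high"

print(risk_band(risk_score(prior_convictions=3, age_at_first_arrest=17,
                           employed=False)))
# -> "high" (score = 3 + 2 + 1 = 6)
```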

Assessing Race and Income Disparities in Crowdsourced Safety Data Collection

Fellows: Kate Beck, Aditya Medury, Jesus M. Barajas

Planners and public agencies are increasingly turning to crowdsourcing to gather data about travel patterns, which help decision makers prioritize pedestrian and bicycle improvements. Concerned road users have extended the idea to their own websites, allowing others to record “close calls” or other perceptions of danger while walking and cycling. A major concern with using crowdsourced data to understand safety, however, is bias. Apps that work in real time require access to smartphones and the knowledge and desire to participate in data-gathering activities, which may exclude many low-income, Black, and Latino residents. Working with an underserved community in the Bay Area, we will test how a crowdsourcing app can be used to collect data about perceived pedestrian and bicycle safety and whether there are biases in reporting. We expect to inform planners and public agencies about the equity implications of using crowdsourced data to allocate resources.

Research from this project informed the creation of Street Story: A Platform for Community Engagement at UC Berkeley’s Safe Transportation Research and Education Center (SafeTREC).

Project writeup: Using Crowdsourcing to address Disparities in Police Reported Data: Addressing Challenges in Technology and Community Engagement
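
One way to probe the kind of reporting bias the project describes is to compare each group’s share of crowdsourced reports with its share of the underlying population. The sketch below uses invented report counts and population shares purely for illustration.

```python
# Illustrative sketch of one bias check: compare each group's share of
# crowdsourced reports against its share of the population.
# The counts and population shares below are invented placeholders.

reports_by_group = {"white": 420, "black": 55, "latino": 70, "asian": 95}
population_share = {"white": 0.40, "black": 0.20, "latino": 0.25, "asian": 0.15}

total_reports = sum(reports_by_group.values())
for group, count in reports_by_group.items():
    report_share = count / total_reports
    ratio = report_share / population_share[group]
    print(f"{group:>6}: {report_share:.0%} of reports vs "
          f"{population_share[group]:.0%} of population (ratio {ratio:.2f})")
# Ratios well below 1.0 suggest a group is underrepresented in the crowdsourced data.
```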

Increasing Transparency into the Capabilities of Surveillance and Policing Technologies: A Resource for Cities and Citizens

Fellows: Shazeda Ahmed, Peter Rowland, Steve Trush, Emily Witt

Affiliates: Kimberly Fong, Michelle Carney

Over the past decade many cities, including Oakland, have installed a growing range of surveillance technologies under the guise of preventing crime and terrorism and improving law enforcement response rates. As cities undergo these often imperceptible changes to enhance policing capabilities and introduce ‘smart city’ initiatives that may infringe upon civil liberties, it is imperative for citizens to have a reliable resource for understanding the functions and implications of these technologies. The Oakland Privacy Advisory Commission (OPAC) is making strides to provide accountability for, and transparency into, the procurement and use of these technologies. In collaboration with OPAC and local civil liberties organizations, we plan to create an online resource that serves as a model for giving citizens and local government officials the information needed to advocate for and make informed policy decisions about the privacy-conscious acquisition and deployment of these technologies, in Oakland and other jurisdictions.

Press: East Bay Express

Video: YouTube

Report to City of Oakland Privacy Advisory Commission: An Assessment of Potential Privacy Problems of the Consolidated Records Information Management System (pp. 4-11)

Preparing for Blockchain: Policy Implications and Challenges for the Financial Industry

Fellows: Ritt Keerati, Chloe Brown

Blog: Preparing for Blockchain

Whitepaper: Preparing for Blockchain

Blockchain, a distributed ledger technology that maintains a continuously growing list of records, is an emerging technology that has captured the imagination and investment of Silicon Valley and Wall Street. The technology originally propelled the emergence of virtual currencies such as Bitcoin, and it now holds promise to revolutionize a variety of industries, most notably the financial sector. This project endeavors to understand the potential policy implications of blockchain technology, particularly as it relates to the financial industry. It seeks to evaluate current trends in the technology and its adoption, the benefits and risks for consumers and industry participants, and relevant regulatory considerations and potential policy responses. The project aspires to help policymakers address their strategic planning and risk-assessment needs as they prepare for the rise of blockchain technology in the near future.
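
As a rough illustration of the underlying data structure (not of any particular platform’s implementation), the sketch below chains records together through hashes so that altering an earlier record invalidates every later block; all names and values are made up.

```python
# Minimal sketch of the core data structure: each block stores the hash of the
# previous block, so tampering with any earlier record breaks every later link.
# This shows only the "continuously growing list of records"; real blockchains
# add consensus, networking, and much more.
import hashlib, json, time

def make_block(records, previous_hash):
    block = {"timestamp": time.time(), "records": records,
             "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block(["genesis"], previous_hash="0" * 64)]
chain.append(make_block(["Alice pays Bob 5"], previous_hash=chain[-1]["hash"]))
chain.append(make_block(["Bob pays Carol 2"], previous_hash=chain[-1]["hash"]))

def is_valid(chain):
    """Recompute each hash and check the links between consecutive blocks."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != block["hash"]:
            return False
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

print(is_valid(chain))  # True; editing any earlier block makes this False
```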

Cultivating Ethical Silicon Valley Cultures through Diversity of Narratives

Fellows: Morgan G. Ames, Jenna Burrell, Anne Jonas

We analyze the stories about “how I got into computers” that professionals in computer science and engineering tell one another. We document common themes in these stories that have motivated this culture’s dominant groups for the last several decades, explore the reasons alternative stories have not gained traction, and look for generational shifts in these stories as the technology they are based on has changed substantially to be more pervasive, complex, networked, and media-rich. Our goal is to neutralize the symbolic power of masculine, white, middle-class computer science culture and to introduce more inclusive narratives. By identifying elements of the culture which inspire some in the field but exclude others, we aim to balance one-sided ideas about who ‘should’ be in the technology industry, and what roles they are ‘meant’ to play. Having more diverse voices in computer science and engineering will cultivate more ethical STEM practitioners, researchers, and educators.

Tools and Methods for Inferring Demographic Bias in Social Media Datasets

Fellow: Sam Maurer

Social media posts from smartphones are an increasingly useful data source for researchers and policymakers. For example, place-based posts can help city planners assess how infrastructure or public space is being used, and the needs of different communities. But it’s important to know who is represented in these data streams and who may be missing. This project develops practical tools and methods for inferring demographic biases. The basic approach is to use rule-based algorithms to determine the neighborhoods where frequent posters live, and then compare the demographic characteristics of these places with the population at large. This is particularly effective for identifying biases in characteristics like race, income, or education that vary substantially across neighborhoods. The research code will be adapted into a robust, open-source Python package.
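
As a minimal sketch of the rule-based approach described above, the example below infers a “home” census tract for each frequent poster from invented nighttime post locations and compares a summary statistic of those tracts with a region-wide figure; the actual open-source package will be considerably more sophisticated.

```python
# Sketch of the rule-based approach, with invented data and a deliberately
# simple home-location rule (the modal census tract of nighttime posts).
from collections import Counter

# Hypothetical input: for each frequent poster, the census tracts of their
# geotagged posts made between 10pm and 6am.
night_posts = {
    "user_a": ["tract_101", "tract_101", "tract_205"],
    "user_b": ["tract_205", "tract_205", "tract_205", "tract_101"],
    "user_c": ["tract_330", "tract_330"],
}

# Hypothetical tract-level census statistic, e.g. median household income.
tract_median_income = {"tract_101": 42000, "tract_205": 95000, "tract_330": 61000}
regionwide_median_income = 58000  # placeholder for the population at large

# Rule: a user's home tract is the most frequent tract among nighttime posts.
home_tracts = {user: Counter(tracts).most_common(1)[0][0]
               for user, tracts in night_posts.items()}

incomes = sorted(tract_median_income[t] for t in home_tracts.values())
sample_median = incomes[len(incomes) // 2]
print(f"Median income of inferred home tracts: {sample_median}")
print(f"Regionwide median income:              {regionwide_median_income}")
# A large gap would suggest the posting population skews relative to the region.
```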

I Regret To Inform You That Your Private Information Has Been Compromised

Fellow: Naniette H. Coleman

Affiliates: Amanda Lee, Tiffany Lo, Andrew Yang

Fellow Naniette Coleman and her team will expand the information available and accessible to the public on privacy, data protection, and cybersecurity. Specifically, the team will collaborate with the Wiki Education Foundation and the Berkeley American Cultures Librarian to establish a “Wikipedia Student Working Group on Privacy Literacy” at Berkeley and hold regular Wikipedia edit-a-thons. In addition, the team will work to build an interdisciplinary privacy community at Berkeley, including hosting a privacy-focused workshop for academics as part of the “Engaging Contradictions: Research, Action, and Justice” series with the American Cultures Center, the Haas Institute for a Fair and Inclusive Society, the Public Service Center, and the Graduate Assembly. Finally, the team will launch a website highlighting the innovative work being done in the privacy, data protection, and cybersecurity space.

Mapping Sites of Politics: Values at Stake in Mitigating Toxic News

Fellow: Daniel Griffin

Many have questioned the influence that “fake news,” a poorly defined concept, may have had on the 2016 US presidential election. There have been calls for major internet platforms to intervene with technical countermeasures. While one may be sympathetic to these calls, there are reasons to believe that various approaches proposed to remedy the situation could censor or chill online discourse, separate online communities, destroy legitimate debate, unintentionally legitimize unwarranted truth-claims, and create centralized points of media control. Such byproducts would largely result from incompletely imagining how these countermeasures play out in practice. This project will inform and support the design and critique of proposals to mitigate “fake news” by inserting values and a long-term view into the conversation. The project will develop metrics from a framework of democratic values implicated in proposed countermeasures and will run scenario-thinking and value-fictions workshops to explore the implications of proposals and test the values framework.

A Penny for Their Thoughts – An Empirical Study of Social Media Users’ Awareness of Rights in Uploaded Creations

Fellows: Amit Elazari Bar On, Uri Hacohen

Affiliate: Talia Schwartz-Maor

Are users aware of the extent of the license they grant to a social platform when they upload their creations? And if they are, do they change their behavior? In the information age, user-creators are engaged in an endless quid pro quo, “paying” for “free” services with their intellectual creations under the platforms’ terms of use. In this project, we conduct empirical research to understand whether users of popular social platforms such as Instagram, YouTube, Twitter, and Reddit are unaware of the scope of the license and waiver each of them grants to exploit and modify their uploaded creations. We aim to evaluate users’ attitudes toward what has arguably become the broadest waiver of copyright in the history of humanity. We hypothesize that the disclosure given under the platforms’ contracts is ineffective, and that if users were aware of the scope of the rights they waive, they would change their behavior.

Banner Photo Credit: “UC Berkeley South Hall” by I School IMSA is licensed under CC BY 2.0