The Responsibility to Not Design and the Need for Citizen Professionalism

Citation

Graeff, E. 2020. The Responsibility to Not Design and the Need for Citizen Professionalism. Computing Professionals for Social Responsibility: The Past, Present and Future Values of Participatory Design. Retrieved from https://pdc2020cpsr.pubpub.org/pub/vizamy14

Link

https://pdc2020cpsr.pubpub.org/pub/vizamy14/release/1

Abstract

I advise two programs at Olin College of Engineering that invite undergraduate students to conduct community-engaged design work. In the fall of 2019, project teams in both of those programs decided not to design systems requested by their outside collaborators based on ethical concerns about the harm they might cause. This paper briefly describes how those decisions came to be, the need to educate for and celebrate design refusal, and how this exemplifies the need to develop the next generation of designers and technologists to be citizen professionals.

Race After Technology book review

Race After Technology: Abolitionist Tools for the New Jim Code

Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin
My rating: 5 of 5 stars

In Race After Technology, Ruha Benjamin delivers a powerful synthesis of STS and critical race theory. She diagnoses a series of problems plaguing the creation and effects of design and technology, drawing throughlines from historical to contemporary examples. If pressed for time, read the lengthy introduction chapter: it summarizes her core argument, both the existence of the “New Jim Code” and the need for abolitionist tools to dismantle it, and the need for analyses and lenses like the one she offers in this book, bringing together STS and critical race theory.

She offers us four dimensions to the New Jim Code: engineered inequity, default discrimination, coded exposure, and technological benevolence/beneficence, each of which is described in Chapters 1-4, respectively. As she summarizes on page 47 of the Introduction: 1) “engineered inequity explicitly works to amplify social hierarchies that are based on race, class, and gender and how the debate regarding ‘racist robots’ is framed in popular discourse”; 2) “default discrimination grows out of design processes that ignore social cleavages” and when “tech developers do not attend to the social and historical context of their work”; 3) coded exposure is about the ways that technologies enable differential visibility and surveillance that often fall along racial lines; and 4) technological benevolence interrogates the problematic efforts and claims of “tech products and services that offer fixes for social bias.”

In her last chapter, Benjamin tries to imagine what an abolitionist toolkit would require to address the New Jim Code. As this is primarily a book of theoretical synthesis, her toolkit leans heavily on other frameworks, like the Design Justice principles and the recommendations Virginia Eubanks articulates in her essential book Automating Inequality, and it also celebrates the work of those developing approaches for auditing technologies such as algorithms. The key work done by the last chapter, though, is articulating the need for new “social imaginaries.” Building on the work of Black science fiction writers, Black feminist theory, and the prophetic speech tradition integral to the Civil Rights Movement (think “I have a dream”), Benjamin’s call for abolitionist and liberatory design and technology is one of narrative. What is the future we want to live in, and how do we describe it to ourselves and those around us? How do we see ourselves bringing that future into being? How do we dramatically redefine or expand inclusion in design practice, widening who is called a “designer” to include those situated in society in ways, and with expertise, not usually invited into design, explicitly or implicitly?

While this is definitely a scholarly manuscript, it reads easily and quickly for social theory. If you are working in the areas of UX design/research, AI, tech ethics, tech criticism, technology and politics, or race and technology, this is going to be essential reading. I expect to be recommending this to my engineering students for years to come.


Fostering the Good, Responsible Technologist to Face any Dilemma

Link

Abstract

Educational institutions producing graduates who design and implement the technology changing our world should be thinking deeply about how they are fostering good, responsible technologists. This paper introduces a workshop activity meant to start a conversation among faculty, staff, and students about which attributes we most want our graduates to develop and how those attributes might help them address dilemmas of technology’s negative consequences.

Everyone Should Be Involved in Designing, Implementing, and Evaluating Digital Surveillance Technology

Citation

Graeff, E. 2019. Everyone Should Be Involved in Designing, Implementing, and Evaluating Digital Surveillance Technology. In Levinson, M. and Fay, J., eds., Democratic Discord in Schools: Cases and Commentaries in Educational Ethics. Cambridge, MA: Harvard Education Press.

Link to Proof of my Chapter

https://www.academia.edu/43683257/Everyone_Should_Be_Involved_in_Designing_Implementing_and_Evaluating_Digital_Surveillance_Technology

Link to Book

https://www.hepg.org/hep-home/books/democratic-discord-in-schools

Excerpt

Although digital surveillance technologies may seem to be purely technical innovations, how technology is designed always also involves political choices. Technology design embodies the values of its designers and those who commission the design, as well as the values embedded in the underlying structures it often abstracts and amplifies. When digital surveillance technologies are used in schools without being subject to appropriate political discussion and contestation, they threaten democratic education in several ways. First, they impose a set of policies that affect the rights of students and parents without consulting them in their design and implementation. Second, they may chill legitimate student inquiry or even criminalize students who are researching topics or personal questions deemed taboo or dangerous by administrators. Building on participatory design and “popular technology” principles, I thus recommend that schools involve students, parents, teachers, and administrators in collective deliberation about the design, scope, and use of digital surveillance technologies.