Graeff, E. 2020. The Responsibility to Not Design and the Need for Citizen Professionalism. Computing Professionals for Social Responsibility: The Past, Present and Future Values of Participatory Design. Retrieved from https://pdc2020cpsr.pubpub.org/pub/vizamy14
I advise two programs at Olin College of Engineering that invite undergraduate students to conduct community-engaged design work. In the fall of 2019, project teams in both programs decided not to design systems requested by their outside collaborators, based on ethical concerns about the harm those systems might cause. This paper briefly describes how those decisions came to be, argues that we should educate for and celebrate design refusal, and shows how these cases exemplify the need to develop the next generation of designers and technologists as citizen professionals.
In Race After Technology, Ruha Benjamin delivers a powerful synthesis of STS and critical race theory. She diagnoses a series of problems plaguing the creation and effects of design and technology, drawing throughlines from historical to contemporary examples. If pressed for time, read the lengthy introduction chapter, which summarizes her core argument: both the existence of the “New Jim Code” and the need for abolitionist tools to dismantle it, and the need for analyses and lenses like the one she offers in this book, bringing together STS and critical race theory.
She offers us four dimensions to the New Jim Code: engineered inequity, default discrimination, coded exposure, and technological benevolence/beneficence, each of which is described in Chapters 1-4, respectively. As she summarizes on page 47 of the Introduction: 1) “engineered inequity explicitly works to amplify social hierarchies that are based on race, class, and gender and how the debate regarding ‘racist robots’ is framed in popular discourse”; 2) “default discrimination grows out of design processes that ignore social cleavages” and when “tech developers do not attend to the social and historical context of their work”; 3) coded exposure is about the ways that technologies enable differential visibility and surveillance that often fall along racial lines; and 4) technological benevolence interrogates the problematic efforts and claims of “tech products and services that offer fixes for social bias.”
In her last chapter, Benjamin tries to imagine what an abolitionist toolkit would require to address the New Jim Code. Because this is primarily a book of theoretical synthesis, her toolkit leans heavily on other frameworks, like the Design Justice principles and the recommendations Virginia Eubanks articulates in her essential book Automating Inequality, and it celebrates the work of those developing approaches for auditing technologies like algorithms. The key work done by the last chapter, though, is articulating the need for new “social imaginaries.” Building on the work of Black science fiction writers, Black feminist theory, and the prophetic speech tradition integral to the Civil Rights Movement (think “I have a dream”), Benjamin’s call for abolitionist and liberatory design and technology is one of narrative. What is the future we want to live in, and how do we describe it to ourselves and those around us? How do we see ourselves bringing that future into being? How do we dramatically redefine or expand our efforts at inclusion in design practice, extending who is called a “designer” to those situated in society in ways, and with expertise, not usually invited explicitly or implicitly into design?
While this is definitely a scholarly manuscript, it reads easily and quickly for a work of social theory. If you work in UX design/research, AI, tech ethics, tech criticism, technology and politics, or race and technology, this is essential reading. I expect to be recommending this book to my engineering students for years to come.
Monitorial citizenship is a form of civic engagement in which people collect information about their surroundings or track issues of local or personal interest in order to improve their communities and pursue justice. Common activities of the monitorial citizen include collecting information, sharing stories and insights, coordinating with networks of other civic actors, and holding institutions and elite individuals accountable for their perceived responsibilities. The term originates in Michael Schudson’s 1998 book The Good Citizen. Schudson proposes monitorial citizenship as a successor to the “informed citizenship” paradigm to better account for our current age of information overload, arguing that “the obligation of citizens to know enough to participate intelligently in governmental affairs be understood as a monitorial obligation” (p. 310). This original concept positioned monitorial citizens as “defensive rather than proactive” (p. 311). The idea of citizens paying attention to public affairs and serving a monitorial role predates Schudson and, of course, the Internet. What is different now is that technologies like the Internet and smartphones enable the average person to monitor topics of interest and powerful actors in society more effectively, through distributed networks and ongoing campaigns that can leverage sophisticated narrative strategies with data to hold those actors to account. Some contemporary scholars believe monitorial citizenship may be one answer to revitalizing civics in an age of mistrust (Zuckerman, 2014), an effort media literacy can support.
Educational institutions producing graduates who design and implement the technology changing our world should be thinking deeply about how they are fostering good, responsible technologists. This paper introduces a workshop activity meant to start a conversation among faculty, staff, and students about which attributes we most want our graduates to develop and how those attributes might help them address dilemmas of technology’s negative consequences.
Graeff, E. 2019. ‘Everyone Should Be Involved in Designing, Implementing, and Evaluating Digital Surveillance Technology.’ In Levinson, M. & Fay, J., eds., Democratic Discord in Schools: Cases and Commentaries in Educational Ethics. Cambridge, MA: Harvard Education Press.
Although digital surveillance technologies may seem to be purely technical innovations, how technology is designed always also involves political choices. Technology design embodies the values of its designers and those who commission the design, as well as the values embedded in the underlying structures it often abstracts and amplifies. When digital surveillance technologies are used in schools without being subject to appropriate political discussion and contestation, they threaten democratic education in several ways. First, they impose a set of policies affecting the rights of students and parents without consulting those groups on their design and implementation. Second, they may chill legitimate student inquiry or even criminalize students researching topics or personal questions that administrators deem taboo or dangerous. Building on participatory design and “popular technology” principles, I thus recommend that schools involve students, parents, teachers, and administrators in collective deliberation about the design, scope, and use of digital surveillance technologies.