The use of technology platforms for civic engagement is transforming the practices and experience of democracy, as well as the processes of political empowerment and civic education. This demands that such technology be designed more consciously to support citizen empowerment and civic learning. I have developed new metrics and methods to evaluate technology platforms in terms of citizen empowerment, in order to replace more common metrics that merely look at a technology’s efficiency. This can help establish citizen empowerment as a design goal for civic technology. However, empowerment is not only a goal; it is a process and an experience. True empowerment requires that the creation of technology be a collaboration among stakeholders, in which power and agency are shared throughout the design process. Still, I believe that articulating a goal of, and metrics for, empowerment can be a helpful milestone toward making civic technology in empowering ways.
Monitorial citizenship is a form of civic engagement in which people collect information about their surroundings or track issues of local or personal interest in order to improve their communities and pursue justice. Common activities of the monitorial citizen include collecting information, sharing stories and insights, coordinating with networks of other civic actors, and holding institutions and elite individuals accountable for their perceived responsibilities. The term originates in Michael Schudson’s 1998 book The Good Citizen. Schudson proposes monitorial citizenship as a successor to the “informed citizenship” paradigm, to better account for our current age of information overload, arguing that “the obligation of citizens to know enough to participate intelligently in governmental affairs be understood as a monitorial obligation” (p. 310). This original concept positioned monitorial citizens as “defensive rather than proactive” (p. 311). The idea of citizens paying attention to public affairs and serving a monitorial role predates Schudson and, of course, the Internet. What is different now is that technologies like the Internet and smartphones enable the average person to monitor topics of interest and powerful actors in society more effectively, by constructing distributed networks and ongoing campaigns that combine sophisticated narrative strategies with data to hold those actors to account. Some contemporary scholars believe monitorial citizenship may be one answer to revitalizing civics in an age of mistrust (Zuckerman, 2014), an effort media literacy can support.
Educational institutions producing graduates who design and implement the technology changing our world should be thinking deeply about how they are fostering good, responsible technologists. This paper introduces a workshop activity meant to start a conversation among faculty, staff, and students about which attributes we most want our graduates to develop and how those attributes might help them address the dilemmas posed by technology’s negative consequences.
Graeff, E. 2019. ‘Everyone Should Be Involved in Designing, Implementing, and Evaluating Digital Surveillance Technology.’ In Levinson, M. & Fay, J., eds., Democratic Discord in Schools: Cases and Commentaries in Educational Ethics. Cambridge, MA: Harvard Education Press.
Although digital surveillance technologies may seem to be purely technical innovations, how technology is designed always also involves political choices. Technology design embodies the values of its designers and those who commission the design, as well as the values embedded in the underlying structures it often abstracts and amplifies. When digital surveillance technologies are used in schools without being subject to appropriate political discussion and contestation, they threaten democratic education in several ways. First, they impose a set of policies that affect the rights of students and parents without consulting them in their design and implementation. Second, they may chill legitimate student inquiry or even criminalize students who are researching topics or personal questions deemed taboo or dangerous according to administrators. Building on participatory design and “popular technology” principles, I thus recommend that schools involve students, parents, teachers, and administrators in collective deliberation about the design, scope, and use of digital surveillance technologies.
Bringing together partners from the US (POPVOX) and South Africa (Grassroot), we tested a new format to co-develop metrics and civic technology in an iterative design process.
This year’s largest gathering of civic technology practitioners and researchers (#TICTeC) took a hard look at the state of civic technology (e.g., Are we now in the fourth wave? Is it a sea-change movement, or has civic tech lost its relevance?). Focusing on questions of measuring impact, our team, including Erhardt Graeff from Olin College and Alisa Zomer and Kelly Zhang from MIT GOV/LAB, put together a design sprint workshop with partners POPVOX (U.S.) and Grassroot (South Africa). The aim of our session was to kick-start an iterative design process that integrates social impact metrics into platform design from the very beginning, not as an afterthought.
What does this mean in practice?
As Erhardt discussed in recent work, impact for civic technology is often measured using basic descriptive data from the platform itself, such as the number of users, repeat users, and time spent on the platform; however, these engagement metrics mostly fail to speak to the social impact goals at the heart of civic technology. In some cases, surveys are conducted in which users rate their experiences on a platform, which can be helpful for troubleshooting user experience design problems. But answering the real question of social impact (i.e., has the platform changed outcomes, whether beliefs or behavior?) demands a different kind of metric.
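To make the contrast concrete, the descriptive engagement metrics above can be computed in a few lines from a platform's event log. This is a minimal sketch with invented data; the log format and user names are assumptions for illustration, not any real platform's API:

```python
from collections import Counter

# Hypothetical event log: (user_id, session_minutes) pairs. All data invented.
events = [
    ("alice", 5.0), ("bob", 2.5), ("alice", 3.0),
    ("carol", 10.0), ("bob", 1.5), ("alice", 4.0),
]

sessions_per_user = Counter(user for user, _ in events)

total_users = len(sessions_per_user)                               # unique users
repeat_users = sum(1 for n in sessions_per_user.values() if n > 1) # users with >1 session
avg_session = sum(m for _, m in events) / len(events)              # mean minutes per session

print(total_users, repeat_users, round(avg_session, 2))  # → 3 2 4.33
```

Note that nothing in these numbers says whether anyone's beliefs or behavior changed; that is precisely the gap the workshop set out to address.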
In our workshop, we introduced an approach that starts by articulating what change you want to see—do you want to empower ordinary citizens to take a particular action (e.g., petition, protest, vote)? Or, do you want to see local government have a specific response (e.g., budget allocation, new legislation, improved service provision)?
Depending on the intended change, there is often relevant research to help in the design process. A wealth of research exists on what incentivizes or prevents citizens from engaging with their government, how to build strong grassroots movements, and also what barriers or opportunities present on the government side. Building on this knowledge, our goal is to bridge theory with design in a way that aligns with achieving and measuring social impact.
POPVOX, a platform to improve dialogue between citizens and government in the US, asks: How might we measure participants’ understanding of government processes, comfort engaging, and sense that their voice matters? One approach taken by Erhardt previously is to measure perceived political efficacy (or the belief that you can influence or affect political change) in his work with SeeClickFix. He adapted a number of oft-used national survey questions for internal and external political efficacy to examine the context of city residents requesting their local governments fix the things they care about, finding correlations between government responsiveness on the platform and active users’ perceptions that their local governments were listening to them.
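The responsiveness–efficacy correlation described above can be sketched in a few lines. The numbers here are invented for illustration; a real study would use validated survey items, proper sampling, and significance testing:

```python
# Hypothetical per-user data: share of a user's requests that received a
# government response, paired with that user's mean political-efficacy
# score (1-5 Likert scale). All values are made up for illustration.
responsiveness = [0.2, 0.5, 0.6, 0.8, 0.9]
efficacy       = [2.0, 3.0, 3.5, 4.0, 4.5]

# Pearson correlation coefficient, computed from first principles.
n = len(responsiveness)
mean_r = sum(responsiveness) / n
mean_e = sum(efficacy) / n
cov = sum((r - mean_r) * (e - mean_e) for r, e in zip(responsiveness, efficacy))
var_r = sum((r - mean_r) ** 2 for r in responsiveness)
var_e = sum((e - mean_e) ** 2 for e in efficacy)
pearson = cov / (var_r * var_e) ** 0.5

print(round(pearson, 3))
```

A coefficient near 1 would be consistent with the finding that users who see their governments respond also report feeling heard, though correlation alone cannot establish which way the influence runs.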
Grassroot, which provides low-tech, low-cost tools for grassroots organizers working primarily with low-income groups in South Africa, posed the following: How might we measure the ways in which social movement organizations grow and build capacity? Their main design constraints include a consistently unresponsive government and social movement organizations that have strong initial momentum but then steeply decline.
Dividing into small groups, we tackled these design prompts, coming up with relevant and feasible metrics, explaining why they matter, and identifying what data would be needed to measure them.
Sprinting for impact
Design sprinting in 70 minutes is not for the faint of heart, but we forewarned participants, and the end results were a good start to a longer process. For Grassroot, suggestions included looking at leadership development within organizations over time, assessing media visibility as a proxy for reach beyond the movement, and measuring interactions among organizers across multiple policy issues. For POPVOX, suggestions included design features that notify citizens when officials read their comments, in order to build in a measure of responsiveness and trust-building.
In the coming months, both teams plan to move these pilots forward, building on the outcomes of the TICTeC workshop.