Educational institutions producing graduates who design and implement the technology changing our world should be thinking deeply about how they are fostering good, responsible technologists. This paper introduces a workshop activity meant to start a conversation among faculty, staff, and students about which attributes we most want our graduates to develop and how those attributes might help them address dilemmas of technology’s negative consequences.
In a June 2017 post, Mark Zuckerberg introduced a change in Facebook’s mission from “make the world more open and connected” to “give people the power to build community and bring the world closer together.” Facebook may not be able to give people power, but the goal of empowering people and building community echoes the language of civic engagement and participatory democracy, similar to the core idea of relational organizing—building interpersonal relationships that can be mobilized for collective action. In an earlier post, from February 2017, Zuckerberg first articulated this new thinking: “In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.” Companies like Facebook often claim to serve the public good through their products; however, this particular language and the depth of explanation in Zuckerberg’s posts imply a recognition of ethical responsibility and at least an intention to design for true citizen empowerment.
I believe it is fair to insist that if the creators of a technology platform seek to make claims about empowering users, they must set explicit design goals for citizen empowerment and evaluate their platform against those goals. Facebook continues to face steep challenges in providing equal access to its platform. To aim for communities that are effective and serve the public good is an even loftier goal. How Facebook will know whether it is actually making progress on its mission remains to be seen. However, technology companies have a reputation for religiously articulating goals and measuring them empirically. In fact, one of the architects of the data science team at Facebook claims that they invented the term “data scientist” to describe this important role (Hammerbacher 2009).
Democracy that values citizen-centered governance requires citizen empowerment (sometimes called “civic agency”), and empowered citizens need certain skills, knowledge, attitudes, and habits that lead to effective civic engagement (Boyte 2009; Levinson 2012; Gibson and Levine 2003). Empowering experiences and learning opportunities can promote a virtuous cycle of reinforcing citizen empowerment and strengthening democracy. Spaces like town hall meetings, protest marches, the voting booth, and the civic education classroom traditionally represent where these experiences and opportunities take place. The emergence of networked digital media has created new, pervasive civic spaces—the networked public sphere. Whereas public spaces offline have seen a decline (Zick 2009), their online replacements, largely private spaces like Facebook, have grown to astounding size and influence with limited accountability to governments and the public.
Social media platforms like Facebook, government communication tools like We the People, and smaller civic technology platforms like SeeClickFix are increasingly the spaces through which citizens seek empowerment in the form of direct response from their government on key issues. As important actors in U.S. democracy (as well as other polities), the creators of these spaces have a responsibility to design for citizen empowerment and to evaluate whether their platforms actually advance empowering processes and outcomes for citizens. These creators of digital technology used for civic engagement should be understood as stewards of democracy with an ethical obligation to serve the public good.
Can drones be fully accepted as civic technologies? Are there values embodied by drones that undermine their ability to perform in a civic capacity? What design principles might make drones more civic? Where does responsibility lie between civil society actors, drone designers, and policymakers in pursuing this goal while balancing privacy, security, and innovation? Although drones have several proposed civic use cases, particularly involving practices described as monitorial citizenship, drones are different from other civic technologies. Civic technologies are about shifting power away from corrupt actors and toward virtuous actors. And a motivating concept and ethic for civic technologies, whether used for interacting with governments or against them, is participatory practice. If we aspire to a definition of civic action that is fundamentally participatory and we hope for our civic technologies to embody that value of participatory practice, we must investigate whether drones can be fully accepted as civic technologies. This paper will address these questions and issues, problematizing the use of drones for civic purposes by defining a set of values and design principles for civic technologies and by showing where drones may play a role, situating contemporary cases among relevant political and ethical questions.
To inform policy, curricula, and future research on cyberbullying through an exploration of the moral reasoning of digitally active 10–14-year-olds (tweens) when they witness digital abuse.
Conducted interviews with 41 tweens, asking participants to react as witnesses to two hypothetical scenarios of digital abuse. Through thematic analysis of the interviews, I developed and applied a new typology for classifying “upstanders” and “bystanders” in cyberbullying situations.
Identified three types of upstander and five types of bystander, along with five thinking processes that led participants to react in those different ways. Upstanders were more likely than bystanders to think through a scenario using higher-order moral reasoning processes like disinterested perspective-taking. Moral reasoning, emotions, and contextual factors, as well as participant gender and home school district, all appeared to play a role in determining how participants responded to cyberbullying scenarios.
Hypothetical scenarios posed in interviews cannot substitute for case studies of real events, but this qualitative analysis has produced a framework for classifying upstanding and bystanding behavior that can inform future studies and approaches to digital ethics education.
This study contributes to the literature on cyberbullying and moral reasoning through in-depth interviews with tweens that record the complexity and context-dependency of thinking processes like perspective-taking among an understudied but critical age group.
Researchers have tested a variety of personal informatics systems to encourage diversity in the political leaning, geography, and demographics of information sources, often with a belief in the normative value of exposure to diverse information sources. Approaches have included information labeling of media sources, personalized metrics of reading behavior, personalized visualization of social media behavior, recommendation systems, and social introductions. Although some of these systems demonstrate positive results for the metrics they define, substantial questions remain about the interpretation of those results and their implications for future design. We identify challenges in defining normative values of diversity, the potential for algorithmic exclusion of some groups, and the role of personal tracking as surveillance. Furthermore, we outline challenges for evaluating systems and defining meaningful social impact for information diversity systems operating at scale.