Making Drones Civic

Link to Conference Paper

graeff-matias_makingdronescivic_isa2015

Abstract

Can drones be fully accepted as civic technologies? Are there values embodied by drones that undermine their ability to perform in a civic capacity? What design principles might make drones more civic? Where does responsibility lie between civil society actors, drone designers, and policymakers in pursuing this goal while balancing privacy, security, and innovation? Although drones have several proposed civic use cases, particularly involving practices described as monitorial citizenship, drones are different from other civic technologies. Civic technologies are about shifting power away from corrupt actors and toward virtuous actors. And a motivating concept and ethic for civic technologies, whether used for interacting with governments or against them, is participatory practice. If we aspire to a definition of civic action that is fundamentally participatory and we hope for our civic technologies to embody that value of participatory practice, we must investigate whether drones can be fully accepted as civic technologies. This paper will address these questions and issues, problematizing the use of drones for civic purposes by defining a set of values and design principles for civic technologies and by showing where drones may play a role, situating contemporary cases among relevant political and ethical questions.

Governing the Ungovernable

Announcement

http://www.tprcweb.com/tprc42-friday-panel-sessions/

Description

Laws, norms, policies, and institutions have failed to keep up with advances in artificial intelligence. Popularly, we still think about the governance of these systems through quotes and metaphors from science fiction authors. Public awareness of the sophistication and capabilities of current systems is also skewed, often toward extremes: predicting robot warfare and mind control, or reflecting complete naivete.

The reality is that intelligent systems are embedded in more and more everyday products and services. The so-called “internet of things” represents a kind of ubiquitous computing that anticipates our needs and provides us information or adjusts the room temperature based on usage patterns. Smarter algorithms power seemingly neutral services like Google’s search engine or Facebook’s news feed.

This panel brings together domain experts researching the impact of intelligent systems in a variety of arenas including household products, civics, and cyberwarfare. The panel will explore gaps in our existing framework of regulation around these technologies, identify challenges common to the deployment of different intelligent systems in a broad range of contexts, and suggest a common set of research goals to advance the cause of effective governance, mapping out the role different constituencies can play in this effort.

Challenges for Personal Behavior Change Research on Information Diversity

Link

http://personalizedchange.weebly.com/1/post/2014/03/challenges-for-personal-behavior-change-research-on-information-diversity.html

Abstract

Researchers have tested a variety of personal informatics systems to encourage diversity in the political leaning, geography, and demographics of information sources, often with a belief in the normative value of exposure to diverse information sources. Attempted methods include information labeling of media sources, personalized metrics of reading behavior, personalized visualization of social media behavior, recommendation systems, and social introductions. Although some of these systems demonstrate positive results on the metrics they define, substantial questions remain about the interpretation of these results and their implications for future design. We identify challenges in defining normative values of diversity, potential algorithmic exclusion of some groups, and the role of personal tracking as surveillance. Furthermore, we outline challenges for evaluating systems and defining meaningful social impact for information diversity systems operating at scale.