Action Path is a location-based survey platform for Android smartphones that crowdsources feedback from citizens in a way that fosters civic learning through reflective political practice. Existing platforms for civic engagement, whether online or offline, are inconvenient and disconnected from the sources of the issues they are meant to address: they require that citizens leave the places they normally inhabit, physically or virtually, and commit to a separate space and set of processes. Action Path is designed to answer the challenge: how do you address barriers to effective engagement in community projects and ensure all citizens can have their voices heard on how to improve their local communities? It does so by converting individual actions into collective action and by providing context and a sense of efficacy, which may help citizens become more effective through regular practice and feedback.
Related Talks and Publications
- Graeff, E. 2014. ‘Crowdsourcing as Reflective Political Practice: Building a Location-based Tool for Civic Learning and Engagement.’ Presented at Internet, Politics, and Policy 2014: Crowdsourcing for Politics and Policy, Oxford Internet Institute, Oxford, UK, Sep 26.
- Graeff, E. 2014. ‘Action Path: a location-based tool for civic reflection and engagement.’ S.M. Thesis, Massachusetts Institute of Technology.
- Graeff, E. 2014. ‘Action Path: A Location-Based Tool for Civic Reflection and Engagement.’ To be presented at Place, (Dis)Place and Citizenship, Wayne State University, Detroit, MI, Mar 22.
Research Assistant at the MIT Center for Civic Media in partnership with the Berkman Center for Internet & Society at Harvard, studying how a major media controversy changes over time and through the involvement of different actors in its media ecosystem, December 2009 – March 2012.
Controversy Mapper at civic.mit.edu
Details of Work
- Lead-authored a case study of the Trayvon Martin controversy in spring 2012
- Advanced the Controversy Mapper network research methodology, using the HITS algorithm to score the authority of media sources
- Normalized and visualized multiple disparate sources of media content along a time series to chart the ebb and flow of a story
- Presented findings at multiple venues
- Prepared slides for presentations of findings by the PI on multiple occasions
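The HITS scoring mentioned above can be illustrated with a minimal sketch. This is illustrative Python rather than the project's actual code, and the link graph and source names are hypothetical, not data from the study:

```python
# Minimal HITS (hubs and authorities) iteration over a directed link graph.
# Nodes are media sources; an edge A -> B means source A links to source B.
links = {
    "blogA": ["paper1", "paper2"],   # hypothetical sources
    "blogB": ["paper1"],
    "paper1": [],
    "paper2": ["paper1"],
}

nodes = list(links)
hub = {n: 1.0 for n in nodes}
auth = {n: 1.0 for n in nodes}

for _ in range(50):  # power iteration until scores stabilize
    # Authority score: sum of hub scores of sources linking to you.
    auth = {n: sum(hub[m] for m in nodes if n in links[m]) for n in nodes}
    # Hub score: sum of authority scores of sources you link to.
    hub = {n: sum(auth[t] for t in links[n]) for n in nodes}
    # Normalize so scores don't grow without bound.
    a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
    h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
    auth = {n: v / a_norm for n, v in auth.items()}
    hub = {n: v / h_norm for n, v in hub.items()}

# The most heavily linked-to source earns the top authority score.
top_authority = max(auth, key=auth.get)
```

In this toy graph, `paper1` receives the most inbound links, so it surfaces as the top authority; applied to a media ecosystem, the same iteration scores which sources the rest of the network treats as authoritative.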
Rewrote the Web Ecology Project's Ruby Twitter scraper to be more efficient, and designed and implemented a normalized database schema for storing tweets on the project's server.
The original Twitter scraper prototype was coded in Perl by Ethan Zuckerman (mentioned in a blog post on Apr 13, 2009). The script used Twitter's URL-based API (since changed) to scrape tweets matching a simple search query on a particular term, such as a hashtag. The script was ported to Ruby by Web Ecology Project member Dave Fisher, who also set up the initial database.
I rewrote Dave's code to make the scraper more efficient both in how it handled the initial scraping of tweets and in how it wrote to the database. I also designed a normalized schema that organized the various metadata attached to each tweet into specific tables and columns that could be indexed for faster and easier queries across the Web Ecology Project's growing dataset.
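A normalized layout along those lines might look like the following sketch. Python's stdlib sqlite3 stands in here for the project's actual database, and the table names, column names, and sample hashtag are illustrative assumptions, not the schema I shipped:

```python
import sqlite3

# Illustrative normalized schema: users and tweets live in separate tables,
# joined by a foreign key, with an index on the search term for fast queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    user_id     INTEGER PRIMARY KEY,
    screen_name TEXT NOT NULL
);
CREATE TABLE tweets (
    tweet_id    INTEGER PRIMARY KEY,
    user_id     INTEGER NOT NULL REFERENCES users(user_id),
    created_at  TEXT NOT NULL,
    text        TEXT NOT NULL,
    search_term TEXT NOT NULL
);
CREATE INDEX idx_tweets_term ON tweets(search_term);
""")

conn.execute("INSERT INTO users VALUES (1, 'example_user')")
conn.execute(
    "INSERT INTO tweets VALUES (101, 1, '2009-08-20', 'sample text', '#example')"
)

# Queries filtered by search term can use the index rather than scanning
# raw per-scrape dumps, and user metadata is stored once instead of per tweet.
row = conn.execute("""
    SELECT u.screen_name, t.text
    FROM tweets t JOIN users u ON u.user_id = t.user_id
    WHERE t.search_term = '#example'
""").fetchone()
```

Splitting users out of the tweet records and indexing the search term is the basic normalization move: repeated metadata is stored once, and cross-dataset queries become simple indexed joins.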
My code was used to collect the tweets analyzed in two studies I co-authored, 'Detecting Sadness in 140 Characters' and 'Afghanistan and its Election on Twitter,' and in another study authored by my Web Ecology Project colleagues, 'The Influentials.'
2008 Summer Internship at Aptivate, an international IT development firm.
My primary project for the summer was re-deploying Aptivate's online manual, Web Design Guidelines for Low Bandwidth. I updated the Guidelines' content and rewrote the HTML and CSS to be cleaner and more efficient. Simultaneously, I maintained the Guidelines' bug tracker, developed a marketing plan for the re-deployed Guidelines, and outlined a possible book version.
Secondarily, I worked on the CSS for a website for Aptivate's client Helvetas.
End of Internship Presentation
I won an undergraduate research grant to build a prototype of a web platform in Ruby on Rails that would enable users to add personal annotations to government documents and share them with other users. I continued work on the prototype and wrote up the documentation in spring 2006 for my honors thesis.
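The core data model behind that prototype can be sketched roughly as follows. This is plain Python rather than the Rails code, and the class names, fields, and sample document are illustrative assumptions, not the thesis implementation:

```python
from dataclasses import dataclass, field

# Sketch of the annotation model: users attach personal notes to spans of a
# government document and can mark individual notes as shared with others.
@dataclass
class Annotation:
    author: str
    start: int          # character offset where the highlighted span begins
    end: int            # character offset where the span ends
    note: str
    shared: bool = False

@dataclass
class Document:
    title: str
    text: str
    annotations: list = field(default_factory=list)

    def annotate(self, author, start, end, note, shared=False):
        self.annotations.append(Annotation(author, start, end, note, shared))

    def visible_to(self, user):
        # A user sees their own annotations plus any marked as shared.
        return [a for a in self.annotations if a.shared or a.author == user]

doc = Document("Clean Air Act", "Section 101. Findings and purposes ...")
doc.annotate("alice", 0, 11, "Key definitions start here", shared=True)
doc.annotate("bob", 12, 20, "private note")
```

Here `doc.visible_to("bob")` would return both Bob's private note and Alice's shared one, while Alice sees only her own; the shared flag per annotation is what turns personal marginalia into something other users can read.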