
RIT Information Technology Honors Thesis

I won an undergraduate research grant to build a prototype of a web platform in Ruby on Rails that would enable users to add personal annotations to government documents and share them with other users. I continued work on the prototype and wrote up the documentation in Spring 2006 for my honors thesis.

Documentation

graeff-2006-honorsthesisdocumentation

Abstract

This undergraduate Honors capstone project involves the creation of a novel web application called .GOVernator. The purpose of this social software tool is to allow users to “markup” government documents, like the Bill of Rights, with XML tags using an interface that does not require in-depth knowledge of XML. The application is programmed using the Ruby on Rails framework, with JavaScript in the form of Asynchronous JavaScript and XML (AJAX), plus XHTML, CSS, XML, and XSLT. The resulting program is a prototype that allows users to create an account via registration, navigate their own home page to select government documents to markup (a.k.a. “scrutinize”), and then use a browser-based interface for adding XML tags to government documents—storing tag names to a database for later analysis. All program functionality, except for the critical markup functionality, is in place. Help documents are also still needed to guide the user through interaction with the application’s interface. The future of the project is further development (programming) of the .GOVernator web application, a case study involving 10-30 users, and a proposal to the Lab for Social Computing to adopt the project for the school year following the publication of this paper.
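The core markup step the abstract describes—wrapping a selected span of a document in a user-chosen XML tag and recording the tag name for later analysis—can be sketched in plain Ruby. This is an illustrative reconstruction, not code from the actual .GOVernator application; the class and method names are hypothetical.

```ruby
# Hypothetical sketch of .GOVernator's markup step: wrap a character
# range of a document in a user-chosen XML tag and log the tag name
# (in the real app, tag names were stored to a database for analysis).
class Annotator
  attr_reader :tag_log

  def initialize
    @tag_log = []   # stands in for the database of tag names
  end

  # Wraps text[first..last] in <tag>...</tag> and records the tag name.
  def annotate(text, first, last, tag)
    @tag_log << tag
    text[0...first] + "<#{tag}>" + text[first..last] +
      "</#{tag}>" + text[(last + 1)..].to_s
  end
end

a = Annotator.new
marked = a.annotate("Congress shall make no law", 0, 7, "institution")
# marked => "<institution>Congress</institution> shall make no law"
```

In the prototype this interaction would have happened through the AJAX interface, with the annotated XML rendered back to XHTML via XSLT.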

condu.it Logo and Site Design

Designed logo and website for a new media publication at RIT.

Logo and website home page mockups were designed in Adobe Illustrator. The logo was used on materials advertising the new publication, but condu.it was never published.

Logo


Site Design Iteration 1


Site Design Iteration 2


Site Design Iteration 3


WWW / Wiki Wacky Web?: Wikis, Authority and the Public Sphere

Slides

http://www.scribd.com/embeds/76382218/content?start_page=1&view_mode=slideshow&access_key=key-21a2jzsdyjf2afkkhn7m&secret_password=597a3ml76v0tr2l0v3g

Coverage

Wiki Communities in the Era of Cultural Individuation

Excerpt

“Essentially, Wikipedia provides an example of poststructuralist principles operating online—an idea impressively illustrated by the “history flow visualizations” of Wikipedia article revisions generated by Fernando B. Viegas, Martin Wattenberg, and Kushal Dave. The original analysis of Wikipedia article evolution by the team “revealed complex patterns of cooperation and conflict” (575). These stem from the community-enabling editing capabilities built-in to the “Talk” and “History” article pages, as well as the “Watch List” option available to registered users providing an alert system for vigilant writer/editors to defend the integrity of specific articles. The goal of these discursive provisions is informal oversight of content, which can be subject to “malicious editing” —one of the strongest criticisms against Wikipedia. The history flow visualizations mapped three categories of wiki article revisions: 1) editing of content on average, 2) a malicious mass deletion of content, and 3) a mass deletion replaced by obscene content. The median survival time of the first category was 90.4 minutes, which broke down to 21% of edits reducing page size, 6% reducing it by only 50 characters—such numbers primarily indicating tightened prose and the elimination of irrelevant information (579, 581). Of course this dynamism is what makes citing Wikipedia problematic. This downside—most apparent when trying to perceive Wikipedia in the vein of a traditional encyclopedia—is balanced by the fact that new content is quickly and easily added to articles as events unfold. For instance, the study refers to how within a week of the invasion of Iraq in 2003 an entry devoted to the topic was written, and had even tripled in size in a few subsequent weeks (581). The fast-responding character of the Wikipedia user community also catches and repairs mass deletions at a median delay of 2.8 minutes—1.7 minutes for those involving obscenities (579). 
The data produced indicates, to at least those versed in poststructuralist insights on language, that Wikipedia’s neoteric authorial/editorial community is attempting to maximize the radical functionality/medium of wiki technology—publishing, editing, and re-publishing content (with self-governing oversight) at a frequency unimaginable in other media.”

Cited: Viegas, F., Wattenberg, M., & Dave, K. (2004). ‘Studying Cooperation and Conflict between Authors with History Flow Visualizations.’ Conference on Human Factors in Computing Systems, Vienna, April 24–29.

Attempting to build the Semantic Web: The Ontological Approach

Award

This paper won the 2007 RIT Institute Writing Contest in Technical Writing.

Full Text

graeff-2004-semanticweb

Introduction

When the father of the World Wide Web, Tim Berners-Lee, first envisioned the Web, he imagined it as “an information space, with the goal that it should be useful not only for human-human communication, but also that machines would be able to participate and help” (1998, Introduction, para. 1). However, what amassed was a mess of poorly formed HTML documents boasting animated GIFs and information displayed without regard for meaning or context. What Berners-Lee was wishing for, and continues to wish for, is a better World Wide Web—a Semantic Web. This ultimate realization of the Internet’s potential is something that Berners-Lee and the World Wide Web Consortium (W3C) are still working on. With millions of users and billions of documents, the web is constantly growing and evolving. The W3C hopes that it evolves into the Semantic Web—and that hope lies in something called an ontology. (Clark, 2002)
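The idea behind an ontology can be made concrete with a toy example (mine, not the paper’s): once facts and class relationships are expressed as machine-readable triples, software can draw simple inferences instead of treating web pages as opaque text. A minimal sketch in Ruby, with invented class names:

```ruby
# Toy triple store: each fact is a [subject, predicate, object] triple,
# loosely in the spirit of RDF. The vocabulary here is invented for
# illustration, not drawn from any real ontology.
triples = [
  ["BillOfRights",       "isA",        "GovernmentDocument"],
  ["GovernmentDocument", "subClassOf", "Document"],
]

# Walks subClassOf links upward to collect all superclasses of a class.
def superclasses(klass, triples)
  ups = triples.select { |s, p, _| s == klass && p == "subClassOf" }
               .map { |t| t[2] }
  ups + ups.flat_map { |c| superclasses(c, triples) }
end

# A machine can now infer that the Bill of Rights is a Document,
# even though no triple states that directly.
def classes_of(subject, triples)
  direct = triples.select { |s, p, _| s == subject && p == "isA" }
                  .map { |t| t[2] }
  direct + direct.flat_map { |c| superclasses(c, triples) }
end

classes_of("BillOfRights", triples)
# => ["GovernmentDocument", "Document"]
```

This kind of inference over shared vocabularies is what the W3C’s ontology languages are designed to enable at web scale.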