Show simple item record

dc.contributor.advisor	Hiroshi Ishii.	en_US
dc.contributor.author	Chung, Keywon	en_US
dc.contributor.other	Massachusetts Institute of Technology. Dept. of Architecture. Program in Media Arts and Sciences.	en_US
dc.date.accessioned	2011-03-24T20:30:33Z
dc.date.available	2011-03-24T20:30:33Z
dc.date.copyright	2010	en_US
dc.date.issued	2010	en_US
dc.identifier.uri	http://hdl.handle.net/1721.1/61943
dc.description	Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2010.	en_US
dc.description	Cataloged from PDF version of thesis.	en_US
dc.description	Includes bibliographical references (p. 142-145).	en_US
dc.description.abstract	Tangible User Interfaces (TUIs) have fueled our imagination about the future of computational user experience by coupling physical objects and activities with digital information. Despite their conceptual popularity, TUIs are still difficult and time-consuming to construct, requiring custom hardware assembly and software programming by skilled individuals. This limitation makes it impossible for end users and designers to interactively build TUIs that suit their context or embody their creative expression. OnObject enables novice end users to turn everyday objects into gestural interfaces through the simple act of tagging. Wearing a sensing device, a user adds a behavior to a tagged object by grabbing the object, demonstrating a trigger gesture, and specifying a desired response. Following this simple Tag-Gesture-Response programming grammar, novice end users are able to transform mundane objects into gestural interfaces in 30 seconds or less. Instead of being exposed to low-level development tasks, users can focus on creating an enjoyable mapping between gestures and media responses. The design of OnObject introduces a novel class of Human-Computer Interaction (HCI): gestural programming of situated physical objects. This thesis first outlines the research challenge and the proposed solution. It then surveys related work to identify the inspirations and differentiations from existing HCI and design research. Next, it describes the sensing and programming hardware and the gesture event server architecture. Finally, it introduces a set of applications created with OnObject and reports observations from user participation sessions.	en_US
dc.description.statementofresponsibility	by Keywon Chung.	en_US
dc.format.extent	146 p.	en_US
dc.language.iso	eng	en_US
dc.publisher	Massachusetts Institute of Technology	en_US
dc.rights	M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.	en_US
dc.rights.uri	http://dspace.mit.edu/handle/1721.1/7582	en_US
dc.subject	Architecture. Program in Media Arts and Sciences.	en_US
dc.title	OnObject : programming of physical objects for gestural interaction	en_US
dc.type	Thesis	en_US
dc.description.degree	S.M.	en_US
dc.contributor.department	Program in Media Arts and Sciences (Massachusetts Institute of Technology)
dc.identifier.oclc	707544879	en_US

