Show simple item record

dc.contributor.advisor: V. Michael Bove, Jr. [en_US]
dc.contributor.author: Novy, Daniel E. (Daniel Edward) [en_US]
dc.contributor.other: Massachusetts Institute of Technology. Department of Architecture. Program in Media Arts and Sciences. [en_US]
dc.date.accessioned: 2013-11-18T19:22:14Z
dc.date.available: 2013-11-18T19:22:14Z
dc.date.copyright: 2013 [en_US]
dc.date.issued: 2013 [en_US]
dc.identifier.uri: http://hdl.handle.net/1721.1/82430
dc.description: Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2013. [en_US]
dc.description: Cataloged from PDF version of thesis. [en_US]
dc.description: Includes bibliographical references (p. 77-79). [en_US]
dc.description.abstract: Immersion is an oft-quoted but ill-defined term used to describe a viewer or participant's sense of engagement with a visual display system or participatory media. Traditionally, advances in immersive quality came at the high price of ever-escalating hardware requirements and computational budgets. But what if one could increase a participant's sense of immersion, instead, by taking advantage of perceptual cues, neuroprocessing, and emotional engagement while adding only a small, yet distinctly targeted, set of advancements to the display hardware? This thesis describes three systems that introduce small amounts of computation to the visual display of information in order to increase the viewer's sense of immersion and participation. It also describes the types of content used to evaluate the systems, as well as the results and conclusions gained from small user studies. The first system, Infinity-by-Nine, takes advantage of the dropoff in peripheral visual acuity to surround the viewer with an extended lightfield generated in realtime from existing video content. The system analyzes an input video stream and outpaints a low-resolution, pattern-matched lightfield that simulates a fully immersive environment in a computationally efficient way. The second system, the Narratarium, is a context-aware projector that applies pattern recognition and natural language processing to an input such as an audio stream or electronic text to generate images, colors, and textures appropriate to the narrative or emotional content. The system outputs interactive illustrations and audio projected into spaces such as children's rooms, retail settings, or entertainment venues. The final system, the 3D Telepresence Chair, combines a 19th-century stage illusion known as Pepper's Ghost with an array of micro projectors and a holographic diffuser to create an autostereoscopic representation of a remote subject with full horizontal parallax. The 3D Telepresence Chair is a portable, self-contained apparatus meant to enhance the experience of teleconferencing. [en_US]
dc.description.statementofresponsibility: by Daniel E. Novy. [en_US]
dc.format.extent: 79 p. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Massachusetts Institute of Technology [en_US]
dc.rights: M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. [en_US]
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582 [en_US]
dc.subject: Architecture. Program in Media Arts and Sciences. [en_US]
dc.title: Computational immersive displays [en_US]
dc.type: Thesis [en_US]
dc.description.degree: S.M. [en_US]
dc.contributor.department: Program in Media Arts and Sciences (Massachusetts Institute of Technology)
dc.identifier.oclc: 862824452 [en_US]

