
dc.contributor.advisor: Charles M. Oman.
dc.contributor.author: Hutchison, William Edward, 1960-
dc.contributor.other: System Design and Management Program.
dc.date.accessioned: 2012-02-28T18:47:44Z
dc.date.available: 2012-02-28T18:47:44Z
dc.date.copyright: 2000
dc.date.issued: 2000
dc.identifier.uri: http://hdl.handle.net/1721.1/69232
dc.description: Thesis (S.M.)--Massachusetts Institute of Technology, System Design & Management Program, 2000.
dc.description: This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
dc.description: Includes bibliographical references (p. 84-89).
dc.description.abstract: The goal of this study is to extend the desktop panoramic static image viewer concept (e.g., Apple QuickTime VR; IPIX) to support immersive real-time viewing, so that an observer wearing a head-mounted display can make free head movements while viewing dynamic scenes rendered in real-time stereo from video data obtained from a set of fixed cameras. Computational experiments by Seitz and others have demonstrated the feasibility of morphing image pairs to render stereo scenes from novel, virtual viewpoints. The user can interact both with morphed real-world video images and with supplementary artificial virtual objects (“Augmented Reality”). The inherent congruence of the real and artificial coordinate frames of this system reduces the registration errors commonly found in Augmented Reality applications. In addition, the user’s eyepoint is computed locally, so any scene lag resulting from head movement will be less than that of alternative technologies using remotely controlled ground cameras. For space applications, this can significantly reduce the apparent lag due to satellite communication delay. This hybrid VR/view-morphing display (“Virtual Video”) has many important NASA applications, including remote teleoperation, crew onboard training, private family and medical teleconferencing, and telemedicine. The technical objective of this study was to develop a proof-of-concept system, on a 3D graphics PC workstation, of one of the component technologies of Virtual Video: Immersive Omnidirectional Video. The management goal was to identify a system process for planning, managing, and tracking the integration, test, and validation of this phased, three-year, multi-university research and development program.
dc.description.statementofresponsibility: by William E. Hutchison.
dc.format.extent: 106 p.
dc.language.iso: eng
dc.publisher: Massachusetts Institute of Technology
dc.rights: M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582
dc.subject: System Design and Management Program.
dc.title: The development of a hybrid virtual reality/video view-morphing display system for teleoperation and teleconferencing
dc.type: Thesis
dc.description.degree: S.M.
dc.contributor.department: System Design and Management Program.
dc.identifier.oclc: 47918922
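
The view-morphing idea cited in the abstract (after Seitz) renders a virtual viewpoint by interpolating between two source camera images. The sketch below is illustrative only and is not taken from the thesis: the function name morph_views, the NumPy data layout, and the assumption that the images have already been prewarped (rectified) so matched points share scanlines are all assumptions made for this example.

    import numpy as np

    def morph_views(pts0, pts1, colors0, colors1, s):
        # Linearly interpolate matched feature positions and cross-dissolve
        # their colors between camera 0 (s = 0) and camera 1 (s = 1).
        pts_s = (1.0 - s) * pts0 + s * pts1
        colors_s = (1.0 - s) * colors0 + s * colors1
        return pts_s, colors_s

    # Toy usage: two matched points observed by two fixed cameras.
    pts0 = np.array([[100.0, 240.0], [180.0, 240.0]])  # pixel (x, y) in camera 0
    pts1 = np.array([[140.0, 240.0], [220.0, 240.0]])  # matching pixels in camera 1
    col0 = np.array([[255.0, 0.0, 0.0], [0.0, 255.0, 0.0]])
    col1 = np.array([[200.0, 0.0, 0.0], [0.0, 200.0, 0.0]])
    print(morph_views(pts0, pts1, col0, col1, s=0.5))

Setting s halfway between the two cameras yields point positions and colors for an intermediate virtual viewpoint; in a full pipeline these interpolated samples would be postwarped back into the desired view before display.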

