Show simple item record

dc.contributor.advisor: Paul DeBitetto and Derek Rowell. [en_US]
dc.contributor.author: Diel, David D., 1979- [en_US]
dc.contributor.other: Massachusetts Institute of Technology. Dept. of Mechanical Engineering. [en_US]
dc.date.accessioned: 2006-03-24T18:39:32Z
dc.date.available: 2006-03-24T18:39:32Z
dc.date.copyright: 2005 [en_US]
dc.date.issued: 2005 [en_US]
dc.identifier.uri: http://hdl.handle.net/1721.1/30317
dc.description: Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2005. [en_US]
dc.description: Includes bibliographical references (p. 107-110). [en_US]
dc.description.abstract: This thesis describes a new method to improve inertial navigation using feature-based constraints from one or more video cameras. The proposed method lengthens the period of time during which a human or vehicle can navigate in GPS-deprived environments. Our approach integrates well with existing navigation systems, because we invoke general sensor models that represent a wide range of available hardware. The inertial model includes errors in bias, scale, and random walk. Any camera and tracking algorithm may be used, as long as the visual output can be expressed as ray vectors extending from known locations on the sensor body. A modified linear Kalman filter performs the data fusion. Unlike traditional Simultaneous Localization and Mapping (SLAM/CML), our state vector contains only inertial sensor errors related to position. This choice allows uncertainty to be properly represented by a covariance matrix. We do not augment the state with feature coordinates. Instead, image data contributes stochastic epipolar constraints over a broad baseline in time and space, resulting in improved observability of the IMU error states. The constraints lead to a relative residual and associated relative covariance, defined partly by the state history. Navigation results are presented using high-quality synthetic data and real fisheye imagery. [en_US]
dc.description.statementofresponsibility: by David D. Diel. [en_US]
dc.format.extent: 110 p. [en_US]
dc.format.extent: 4981248 bytes
dc.format.extent: 4994478 bytes
dc.format.mimetype: application/pdf
dc.format.mimetype: application/pdf
dc.language.iso: eng [en_US]
dc.publisher: Massachusetts Institute of Technology [en_US]
dc.rights: M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. [en_US]
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582
dc.subject: Mechanical Engineering. [en_US]
dc.title: Stochastic constraints for vision-aided inertial navigation [en_US]
dc.type: Thesis [en_US]
dc.description.degree: S.M. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.identifier.oclc: 61103048 [en_US]
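
Note on the abstract above: it outlines a data-fusion step in which a modified linear Kalman filter, whose state holds only position-related inertial error terms, is corrected by stochastic epipolar constraints built from camera ray vectors. The Python sketch below illustrates that general idea only; it is not the thesis implementation, and the residual model, Jacobian, function names, and all numeric values are illustrative assumptions.

    # Minimal sketch (assumed, not the thesis code): correct an IMU position-error
    # state with a scalar epipolar (coplanarity) constraint via a linear Kalman update.
    import numpy as np

    def kalman_update(x, P, H, r, R):
        """Standard linear Kalman measurement update.
        x : (n,)   state estimate (here, position-error states)
        P : (n, n) state covariance
        H : (m, n) measurement Jacobian
        r : (m,)   measurement residual (here, epipolar constraint violation)
        R : (m, m) residual covariance
        """
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x_new = x + K @ r                     # corrected error estimate
        P_new = (np.eye(len(x)) - K @ H) @ P  # covariance update
        return x_new, P_new

    def epipolar_residual(ray_a, ray_b, baseline):
        """Coplanarity constraint: two rays and the translation between their
        viewpoints should be coplanar; the scalar triple product measures violation."""
        return float(np.dot(ray_a, np.cross(baseline, ray_b)))

    # Illustrative usage with made-up values.
    x = np.zeros(3)                           # position-error state
    P = np.eye(3) * 0.5                       # prior covariance (assumed)
    ray_a = np.array([0.0, 0.0, 1.0])         # unit ray at the first viewpoint
    ray_b = np.array([0.05, 0.02, 1.0])
    ray_b /= np.linalg.norm(ray_b)            # unit ray at the second viewpoint
    t_est = np.array([1.0, 0.0, 0.0])         # estimated translation between viewpoints
    # Residual of the constraint evaluated at the estimated baseline; its sensitivity
    # to a baseline error delta is (ray_b x ray_a), since a.((t+delta) x b) = 0.
    r = np.array([-epipolar_residual(ray_a, ray_b, t_est)])
    H = np.cross(ray_b, ray_a)[None, :]
    R = np.array([[1e-4]])                    # measurement noise (assumed)
    x, P = kalman_update(x, P, H, r, R)

In the sketch the constraint never introduces feature coordinates into the state, which mirrors the abstract's point that the filter avoids state augmentation and instead treats image data as constraints on the IMU error states.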

