Show simple item record

dc.contributor.advisor: Karaman, Sertac
dc.contributor.author: Keszler, John Alexander
dc.date.accessioned: 2022-02-07T15:10:38Z
dc.date.available: 2022-02-07T15:10:38Z
dc.date.issued: 2021-09
dc.date.submitted: 2021-09-21T19:54:14.077Z
dc.identifier.uri: https://hdl.handle.net/1721.1/139883
dc.description.abstract: Depth perception is an essential component of autonomous mobile robotics platforms. Due to size and weight limitations, a monocular camera system is typically best suited to collect this data. However, such setups can produce poor depth maps in regions of the frame where the camera encounters visual aberrations (fog, glare, dust). This thesis presents a large-scale dataset with a variety of scenes in both simulated and real-world indoor environments, containing visual and dynamical sensor data measured in the presence of varying levels of localized smoke sources. This dataset aims to facilitate the development of more robust autonomous UAV navigation systems. The size and variety of the data make it a valuable tool for evaluating and testing visual-inertial estimation algorithms and haze/fog suppression methods that identify, locate, and suppress these noise sources in an attempt to improve the accuracy and stability of monocular localization and mapping algorithms.
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright MIT
dc.rights.uri: http://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Localized Visual Aberration Detection and Suppression Dataset for UAV Perception Systems
dc.type: Thesis
dc.description.degree: S.M.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Master
thesis.degree.name: Master of Science in Electrical Engineering and Computer Science
