Localized Visual Aberration Detection and Suppression Dataset for UAV Perception Systems
Author(s)
Keszler, John Alexander
Advisor
Karaman, Sertac
Abstract
Depth perception is an essential component of autonomous mobile robotics platforms. Due to size and weight limitations, a monocular camera system is typically best suited to collect this data. However, such setups can produce poor depth maps in regions of the frame where the camera encounters visual aberrations (fog, glare, dust). This thesis presents a large-scale dataset spanning a variety of scenes in both simulated and real-world indoor environments, containing visual and dynamical sensor data measured in the presence of varying levels of localized smoke sources. The dataset aims to facilitate the development of more robust autonomous UAV navigation systems. Its size and variety make it a valuable tool for evaluating and testing visual-inertial estimation algorithms and haze/fog suppression methods that identify, locate, and suppress these noise sources in order to improve the accuracy and stability of monocular localization and mapping algorithms.
Date issued
2021-09
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology