| dc.contributor.advisor | Karaman, Sertac | |
| dc.contributor.author | Keszler, John Alexander | |
| dc.date.accessioned | 2022-02-07T15:10:38Z | |
| dc.date.available | 2022-02-07T15:10:38Z | |
| dc.date.issued | 2021-09 | |
| dc.date.submitted | 2021-09-21T19:54:14.077Z | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/139883 | |
| dc.description.abstract | Depth perception is an essential component of autonomous mobile robotics platforms. Due to size and weight limitations, a monocular camera system is typically best suited to collect this data. However, using these setups for depth perception can result in poor depth mapping in areas of the frame where the camera encounters visual aberrations (fog, glare, dust). This thesis presents a large-scale dataset comprising a variety of scenes in both simulated and real-world indoor environments, containing both visual and dynamical sensor data measured in the presence of varying levels of localized smoke sources. This dataset aims to facilitate the development of more robust autonomous UAV navigation systems. The size and variety of the data make it a valuable tool for evaluating and testing visual-inertial estimation algorithms and haze/fog suppression methods that identify, locate, and suppress these noise sources in an attempt to improve the accuracy and stability of monocular localization and mapping algorithms. | |
| dc.publisher | Massachusetts Institute of Technology | |
| dc.rights | In Copyright - Educational Use Permitted | |
| dc.rights | Copyright MIT | |
| dc.rights.uri | http://rightsstatements.org/page/InC-EDU/1.0/ | |
| dc.title | Localized Visual Aberration Detection and Suppression Dataset for UAV Perception Systems | |
| dc.type | Thesis | |
| dc.description.degree | S.M. | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
| mit.thesis.degree | Master | |
| thesis.degree.name | Master of Science in Electrical Engineering and Computer Science | |