Simultaneous tracking and rendering: Real-time monocular localization for MAVs
Author(s)
Ok, Kyel; Roy, Nicholas; Greene, William N.
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Abstract
We propose a method for real-time monocular camera-based localization in known environments. With the goal of controlling high-speed micro air vehicles (MAVs), we localize with respect to a mesh map of the environment that can support both pose estimation and trajectory planning. Using only limited hardware that can be carried on a MAV, we achieve accurate pose estimation at rates above 50 Hz, an order of magnitude faster than current state-of-the-art mesh-based localization algorithms. In our simultaneous tracking and rendering (STAR) approach, we render virtual images of the environment and track camera images against them using a robust semi-direct image alignment technique. Our main contribution is the decoupling of camera tracking from virtual image rendering, which drastically reduces the number of rendered images and enables accurate full camera-rate tracking without requiring a high-end GPU. We demonstrate our approach in GPS-denied indoor environments.
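The decoupling described in the abstract can be illustrated with a toy 1-D analogue (a sketch only, not the authors' implementation — the names `photometric_align`, `track_sequence`, and `render_fn` are hypothetical, and real STAR aligns full images over SE(3) poses against mesh renderings): each incoming frame is aligned to the most recent rendered reference by Gauss-Newton minimization of photometric error, and a new reference is rendered only when the estimated pose drifts far from the render pose.

```python
import numpy as np

def photometric_align(ref, cur, max_iters=50):
    """Estimate the 1-D translation (in pixels) that warps `cur` onto
    `ref` by Gauss-Newton minimization of the photometric error.
    Stand-in for the SE(3) semi-direct image alignment in the paper."""
    shift = 0.0
    xs = np.arange(len(ref), dtype=float)
    for _ in range(max_iters):
        warped = np.interp(xs + shift, xs, cur)  # cur resampled at x + shift
        grad = np.gradient(warped)               # Jacobian of warped w.r.t. shift
        residual = warped - ref                  # photometric error
        hessian = grad @ grad                    # 1x1 Gauss-Newton Hessian
        if hessian < 1e-12:
            break
        delta = -(grad @ residual) / hessian     # Gauss-Newton step
        shift += delta
        if abs(delta) < 1e-6:
            break
    return shift

def track_sequence(render_fn, frames, rerender_thresh=5.0):
    """Decoupled tracking-and-rendering loop: every frame is tracked
    against the last rendered reference; `render_fn` (the expensive
    'virtual image' step) is invoked only when the pose has moved more
    than `rerender_thresh` pixels from the pose it was rendered at."""
    render_pose = 0.0
    ref = render_fn(render_pose)                 # initial virtual image
    poses = []
    for frame in frames:
        offset = photometric_align(ref, frame)   # cheap, runs every frame
        pose = render_pose + offset
        poses.append(pose)
        if abs(pose - render_pose) > rerender_thresh:
            render_pose = pose
            ref = render_fn(render_pose)         # re-render only on demand
    return poses
```

A usage sketch: with a synthetic 1-D "world" signal, `render_fn(p) = np.interp(xs - p, xs, world)` plays the role of rendering the map from pose `p`, and twelve frames at poses 1..12 are tracked with only three renders, illustrating how decoupling cuts the rendering load well below the camera rate.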
Date issued
2016-05
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Aeronautics and Astronautics; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
IEEE International Conference on Robotics and Automation, 2016
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Ok, Kyel, W. Nicholas Greene, and Nicholas Roy. "Simultaneous Tracking and Rendering: Real-Time Monocular Localization for MAVs." Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2016, pp. 4522–4529.
Version: Author's final manuscript
ISBN
978-1-4673-8026-3