Simultaneous tracking and rendering: Real-time monocular localization for MAVs
Author(s): Ok, Kyel; Roy, Nicholas; Greene, William N.
We propose a method of real-time monocular camera-based localization in known environments. With the goal of controlling high-speed micro air vehicles (MAVs), we localize with respect to a mesh map of the environment that can support both pose estimation and trajectory planning. Using only limited hardware that can be carried on a MAV, we achieve accurate pose estimation at rates above 50 Hz, an order of magnitude faster than the current state-of-the-art mesh-based localization algorithms. In our simultaneous tracking and rendering (STAR) approach, we render virtual images of the environment and track camera images with respect to them using a robust semi-direct image alignment technique. Our main contribution is the decoupling of camera tracking from virtual image rendering, which drastically reduces the number of rendered images and enables accurate full camera-rate tracking without needing a high-end GPU. We demonstrate our approach in GPS-denied indoor environments.
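The decoupling the abstract describes can be illustrated with a minimal sketch: every incoming frame is tracked against the most recent virtual image, and a new virtual image is rendered only when the estimated pose has drifted far from the pose of that reference. All function names, the scalar "pose", and the re-render threshold below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a decoupled tracking/rendering loop in the spirit of
# STAR. Poses are simplified to scalars; render_virtual_image and track are
# stand-ins for mesh rendering and semi-direct image alignment.

def render_virtual_image(mesh, pose):
    # Stand-in for rendering the mesh map from the given camera pose.
    return ("virtual_image", pose)

def track(frame, reference):
    # Stand-in for semi-direct image alignment of the camera frame against
    # the virtual reference image; here the frame simply carries its pose.
    return frame["pose"]

def star_localize(mesh, frames, rerender_threshold=1.0):
    """Track every frame, but re-render a virtual image only when the
    camera has moved beyond a threshold from the reference pose."""
    reference_pose = frames[0]["pose"]
    reference = render_virtual_image(mesh, reference_pose)
    renders = 1
    poses = []
    for frame in frames:
        pose = track(frame, reference)   # runs at full camera rate
        poses.append(pose)
        if abs(pose - reference_pose) > rerender_threshold:
            # Rendering happens only occasionally, so no high-end GPU
            # is needed to keep up with the camera.
            reference_pose = pose
            reference = render_virtual_image(mesh, reference_pose)
            renders += 1
    return poses, renders
```

With a slowly moving camera, the number of renders stays far below the number of tracked frames, which is the point of the decoupling.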
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Aeronautics and Astronautics; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
IEEE International Conference on Robotics and Automation, 2016
Institute of Electrical and Electronics Engineers (IEEE)
Ok, Kyel, W. Nicholas Greene, and Nicholas Roy. "Simultaneous Tracking and Rendering: Real-Time Monocular Localization for MAVs." IEEE International Conference on Robotics and Automation, 2016, pp. 4522–4529.
Author's final manuscript