Notice
This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/141381.2
Memory-Efficient Gaussian Fitting for Depth Images in Real Time
Author(s)
Karaman, Sertac; Sze, Vivienne; Li, Peter Zhi Xuan
Download: 2022_icra_spgf.pdf (2.970 MB)
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Computing consumes a significant portion of the energy in many robotics applications, especially those involving energy-constrained robots, and memory access accounts for a significant fraction of that computing energy. For mapping a 3D environment, prior approaches reduce the map size but incur a large memory overhead for storing sensor measurements and temporary variables during computation. In this work, we present a memory-efficient algorithm, named Single-Pass Gaussian Fitting (SPGF), that accurately constructs a compact Gaussian Mixture Model (GMM) approximating measurements from a depthmap generated by a depth camera. By incrementally constructing the GMM one pixel at a time in a single pass through the depthmap, SPGF achieves higher throughput and orders-of-magnitude lower memory overhead than prior multi-pass approaches. By processing the depthmap row by row, SPGF exploits intrinsic properties of the camera to efficiently and accurately infer surface geometries, which leads to higher precision than prior approaches while maintaining the same compactness of the GMM. Using a low-power ARM Cortex-A57 CPU on the NVIDIA Jetson TX2 platform, SPGF operates at 32 fps, requires only 43 KB of memory overhead, and consumes only 0.11 J per frame (depthmap). Thus, SPGF enables real-time mapping of large 3D environments on energy-constrained robots.
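The key idea in the abstract — fitting a Gaussian to depth measurements one pixel at a time without buffering them — can be realized with a streaming (one-pass) estimator of the mean and covariance. The sketch below uses Welford's online algorithm in Python; it is an illustrative minimal example under assumed names (`StreamingGaussian`, `add`), not the authors' SPGF implementation, which additionally handles segmentation of the depthmap into multiple Gaussian components.

```python
import numpy as np

class StreamingGaussian:
    """Incrementally fits one 3D Gaussian to a stream of points using
    Welford's one-pass algorithm, so no measurements are buffered.
    (Hypothetical sketch; SPGF itself fits a full mixture model.)"""

    def __init__(self):
        self.n = 0                   # number of points fused so far
        self.mean = np.zeros(3)      # running mean
        self.M2 = np.zeros((3, 3))   # running sum of outer products of deviations

    def add(self, p):
        """Fuse one point (e.g. a back-projected depth pixel)."""
        p = np.asarray(p, dtype=float)
        self.n += 1
        delta = p - self.mean            # deviation from the old mean
        self.mean += delta / self.n      # update the mean in place
        self.M2 += np.outer(delta, p - self.mean)  # uses old and new deviations

    def covariance(self):
        """Unbiased sample covariance of the points fused so far."""
        return self.M2 / (self.n - 1)
```

Because each update touches only the fixed-size state `(n, mean, M2)`, the memory overhead is constant regardless of how many pixels are processed — the property that lets a single-pass approach avoid storing the depthmap's measurements.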
Date issued
2022-05-23
Journal
IEEE International Conference on Robotics and Automation (ICRA)
Citation
Karaman, Sertac, Sze, Vivienne and Li, Peter Zhi Xuan. 2022. "Memory-Efficient Gaussian Fitting for Depth Images in Real Time." IEEE International Conference on Robotics and Automation (ICRA).
Version: Author's final manuscript