Depth Estimation of Non-Rigid Objects For Time-Of-Flight Imaging
Author(s)
Noraky, James; Sze, Vivienne
Terms of use
Open Access Policy: Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Depth sensing is useful for a variety of applications that range from augmented reality to robotics. Time-of-flight (TOF) cameras are appealing because they obtain dense depth measurements with low latency. However, for reasons ranging from power constraints to multi-camera interference, the frequency at which accurate depth measurements can be obtained is limited. To address this, we propose an algorithm that uses concurrently collected images to estimate the depth of non-rigid objects without using the TOF camera. Our technique models non-rigid objects as locally rigid and uses previous depth measurements along with the optical flow of the images to estimate depth. In particular, we show how we exploit the previous depth measurements to directly estimate pose and how we integrate this with our model to estimate the depth of non-rigid objects by finding the solution to a sparse linear system. We evaluate our technique on an RGB-D dataset of deformable objects, where we estimate depth with a mean relative error of 0.37% and outperform other adapted techniques.
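The abstract only summarizes the approach, so the sketch below is a rough, hedged illustration of the general idea: for a single locally rigid patch, estimate a small rigid motion from the previous depth map and the optical flow, then update the patch's depth. This is not the paper's implementation; the function name, the single-focal-length pinhole model, the small-motion linearization, and the independent per-patch least-squares solve are all assumptions made here for brevity, whereas the paper couples the locally rigid regions through one sparse linear system.

import numpy as np

def estimate_patch_depth(depth_prev, flow, f, cx, cy):
    """Estimate depth for one locally rigid patch from its previous TOF
    depth and the optical flow of the accompanying images.

    depth_prev : (H, W) previous depth measurements [m]
    flow       : (H, W, 2) optical flow (u, v) in pixels, previous -> current
    f, cx, cy  : pinhole focal length and principal point (assumed known)
    """
    H, W = depth_prev.shape
    ys, xs = np.mgrid[0:H, 0:W]
    xb = (xs - cx).ravel()              # x - cx
    yb = (ys - cy).ravel()              # y - cy
    Z = depth_prev.ravel()
    u = flow[..., 0].ravel()
    v = flow[..., 1].ravel()

    valid = Z > 0                       # skip pixels without a depth reading
    xb, yb, Z, u, v = xb[valid], yb[valid], Z[valid], u[valid], v[valid]

    # Small-motion rigid model P' = P + w x P + t, projection linearized:
    #   u = -xb*yb/f*wx + (f + xb^2/f)*wy - yb*wz + f/Z*tx          - xb/Z*tz
    #   v = -(f + yb^2/f)*wx + xb*yb/f*wy + xb*wz          + f/Z*ty - yb/Z*tz
    # Two equations per pixel in the six pose unknowns [wx, wy, wz, tx, ty, tz].
    zeros = np.zeros_like(Z)
    A_u = np.stack([-xb * yb / f, f + xb**2 / f, -yb, f / Z, zeros, -xb / Z], axis=1)
    A_v = np.stack([-(f + yb**2 / f), xb * yb / f, xb, zeros, f / Z, -yb / Z], axis=1)
    A = np.vstack([A_u, A_v])
    b = np.concatenate([u, v])

    # Least-squares pose of this patch (solved independently in this sketch).
    wx, wy, wz, tx, ty, tz = np.linalg.lstsq(A, b, rcond=None)[0]

    # Updated depth of each tracked point: Z' = Z + wx*Y - wy*X + tz,
    # with X = xb*Z/f and Y = yb*Z/f from back-projection of the old depth.
    Z_new = Z + wx * yb * Z / f - wy * xb * Z / f + tz

    out = np.full(H * W, np.nan)
    out[valid] = Z_new
    return out.reshape(H, W)            # depth at the tracked (source) pixels

Grouping pixels into locally rigid patches and reassigning the estimated depths to the flow-displaced pixel locations are omitted here; they would be needed to reproduce a full dense depth map.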
Date issued
2018-10
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Microsystems Technology Laboratories
Journal
IEEE International Conference on Image Processing (ICIP)
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Noraky, James and Vivienne Sze. "Depth Estimation of Non-Rigid Objects For Time-Of-Flight Imaging." ICIP 2018: 2925-2929.
Version: Author's final manuscript
ISBN
978-1-4673-9961-6
ISSN
2381-8549