Tracking objects with point clouds from vision and touch
Author(s)
Izatt, Gregory R.; Mirano, Geronimo J.; Adelson, Edward H.; Tedrake, Russell L.
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves the pose accuracy during contact, and provides robustness to occlusions of small objects by the robot's end effector.
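The core idea of the fusion is that both camera points and dense GelSight contact points enter the same signed-distance-function residual, which the tracker minimizes over the object pose. The following sketch (not the authors' code) illustrates this on a deliberately simplified problem: the object model is an analytic sphere SDF and only a translation is estimated by Gauss-Newton, whereas the paper's tracker handles full articulated poses with a modified second-order update.

```python
import numpy as np

def sphere_sdf(points, radius=0.05):
    """Signed distance from each 3-D point to a sphere centered at the origin."""
    return np.linalg.norm(points, axis=1) - radius

def sphere_sdf_grad(points):
    """Gradient of the sphere SDF (unit radial direction) at each point."""
    d = np.linalg.norm(points, axis=1, keepdims=True)
    return points / d

def track_translation(points, n_iters=10):
    """Gauss-Newton on a translation t so that sdf(points - t) ~ 0.

    `points` may mix camera-derived points and dense local contact points;
    both contribute identical SDF residuals, which is the fusion idea.
    """
    t = np.zeros(3)
    for _ in range(n_iters):
        p = points - t
        r = sphere_sdf(p)           # residuals: signed distance to the surface
        J = -sphere_sdf_grad(p)     # Jacobian d r / d t
        t -= np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton step
    return t

# Synthetic data: surface points of a sphere translated by a known offset.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
true_t = np.array([0.01, -0.02, 0.015])
points = 0.05 * dirs + true_t

t_est = track_translation(points)
```

On noise-free data like this, the estimated translation matches the true offset to numerical precision; in the paper's setting the same residual structure lets tactile geometry constrain the pose when the camera view is occluded.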
Date issued
2017-07
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
2017 IEEE International Conference on Robotics and Automation (ICRA)
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Izatt, Gregory et al. “Tracking Objects with Point Clouds from Vision and Touch.” 2017 IEEE International Conference on Robotics and Automation (ICRA), May 29 – June 3, 2017, Singapore. Institute of Electrical and Electronics Engineers (IEEE), July 2017. © 2017 Institute of Electrical and Electronics Engineers (IEEE)
Version: Author's final manuscript
ISBN
978-1-5090-4633-1