Accurate Vision-based Manipulation through Contact Reasoning
Author(s)
Kloss, Alina; Bauza Villalonga, Maria; Wu, Jiajun; Tenenbaum, Joshua B; Rodriguez Garcia, Alberto; Bohg, Jeannette
Accepted version (1.900Mb)
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
© 2020 IEEE. Planning contact interactions is one of the core challenges of many robotic tasks. Optimizing contact locations while taking dynamics into account is computationally costly and, in environments that are only partially observable, executing contact-based tasks often suffers from low accuracy. We present an approach that addresses these two challenges for the problem of vision-based manipulation. First, we propose to disentangle contact from motion optimization. Thereby, we improve planning efficiency by focusing computation on promising contact locations. Second, we use a hybrid approach for perception and state estimation that combines neural networks with a physically meaningful state representation. In simulation and real-world experiments on the task of planar pushing, we show that our method is more efficient and achieves a higher manipulation accuracy than previous vision-based approaches.
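The abstract's first idea, separating contact selection from motion optimization, can be illustrated with a small, hypothetical Python sketch. Nothing below is taken from the paper: the square object, the point-pusher push model, the scoring heuristic, and all names and constants are assumptions made purely to show how a cheap contact-ranking stage can limit where a more expensive motion optimization is run.

import numpy as np

# Hypothetical two-stage planner for planar pushing (illustration only, not the
# authors' implementation). Stage 1 ranks candidate contact points with a cheap
# geometric score; Stage 2 optimizes the push motion only at the best candidates.

GOAL = np.array([0.10, 0.05, 0.0])  # desired object displacement (dx, dy, dtheta), assumed

def simulate_push(contact, direction, distance):
    """Crude quasi-static push model (assumption): the object translates along the
    push direction and rotates in proportion to the torque about its center."""
    translation = distance * direction
    torque = contact[0] * direction[1] - contact[1] * direction[0]
    rotation = 0.5 * distance * torque  # arbitrary gain, illustration only
    return np.array([translation[0], translation[1], rotation])

def score_contact(contact):
    """Stage 1: cheap score of a contact point -- alignment of the inward-pointing
    normal with the desired translation (a stand-in for a learned or model-based score)."""
    normal = -contact / (np.linalg.norm(contact) + 1e-9)
    return float(normal @ GOAL[:2])

def optimize_motion(contact, n_samples=200, seed=0):
    """Stage 2: optimize push direction and distance at a fixed contact point
    (random search stands in for a proper motion optimizer)."""
    rng = np.random.default_rng(seed)
    best_cost, best_motion = np.inf, None
    for _ in range(n_samples):
        angle = rng.uniform(-np.pi, np.pi)
        direction = np.array([np.cos(angle), np.sin(angle)])
        distance = rng.uniform(0.0, 0.2)
        outcome = simulate_push(contact, direction, distance)
        cost = np.linalg.norm(outcome - GOAL)
        if cost < best_cost:
            best_cost, best_motion = cost, (direction, distance)
    return best_motion, best_cost

# Candidate contact points on the boundary of an assumed square object (half-width 5 cm).
candidates = [np.array([x, y]) for x, y in
              [(0.05, 0.0), (-0.05, 0.0), (0.0, 0.05), (0.0, -0.05)]]

# Stage 1: keep only the most promising contacts, so that...
ranked = sorted(candidates, key=score_contact, reverse=True)[:2]

# ...Stage 2 spends its computation only where it is likely to pay off.
plans = [(c, *optimize_motion(c)) for c in ranked]
contact, motion, cost = min(plans, key=lambda p: p[2])
print("contact:", contact, "push (direction, distance):", motion, "residual cost:", cost)

In this toy version the saving comes entirely from pruning: the expensive inner loop runs for two contact points instead of all candidates, which is the same efficiency argument the abstract makes for disentangling contact from motion optimization.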
Date issued
2020-04
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; MIT-IBM Watson AI Lab; Massachusetts Institute of Technology. Department of Mechanical Engineering
Journal
Proceedings - IEEE International Conference on Robotics and Automation
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Kloss, Alina, Bauza, Maria, Wu, Jiajun, Tenenbaum, Joshua B, Rodriguez, Alberto et al. 2020. "Accurate Vision-based Manipulation through Contact Reasoning." Proceedings - IEEE International Conference on Robotics and Automation.
Version: Author's final manuscript
ISSN
1050-4729