Notice
This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/138353.3
Accurate Vision-based Manipulation through Contact Reasoning
Author(s)
Kloss, Alina; Bauza, Maria; Wu, Jiajun; Tenenbaum, Joshua B; Rodriguez, Alberto; Bohg, Jeannette; …
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Metadata
Abstract
© 2020 IEEE. Planning contact interactions is one of the core challenges of many robotic tasks. Optimizing contact locations while taking dynamics into account is computationally costly and, in environments that are only partially observable, executing contact-based tasks often suffers from low accuracy. We present an approach that addresses these two challenges for the problem of vision-based manipulation. First, we propose to disentangle contact from motion optimization. Thereby, we improve planning efficiency by focusing computation on promising contact locations. Second, we use a hybrid approach for perception and state estimation that combines neural networks with a physically meaningful state representation. In simulation and real-world experiments on the task of planar pushing, we show that our method is more efficient and achieves a higher manipulation accuracy than previous vision-based approaches.
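The abstract's first idea, disentangling contact selection from motion optimization, can be illustrated with a minimal sketch. This is not the authors' algorithm: the disc-shaped object, the alignment heuristic for ranking contacts, and the names `candidate_contacts`, `score_contact`, and `plan_push` are all illustrative assumptions. The point is the two-stage structure: cheaply score many candidate contact locations first, then spend expensive motion optimization only on the most promising ones.

```python
import math

# Hypothetical sketch (not the paper's implementation): decouple contact
# selection from motion optimization for pushing an object toward a goal.

def candidate_contacts(n=16):
    """Candidate push points on the boundary of a unit disc, with inward normals."""
    contacts = []
    for i in range(n):
        a = 2 * math.pi * i / n
        point = (math.cos(a), math.sin(a))
        normal = (-math.cos(a), -math.sin(a))  # push direction, toward the centre
        contacts.append((point, normal))
    return contacts

def score_contact(normal, goal_dir):
    """Cheap stage-1 score: alignment of the push direction with the goal direction."""
    return normal[0] * goal_dir[0] + normal[1] * goal_dir[1]

def plan_push(goal, top_k=3):
    """Stage 1: rank all contacts by the cheap score.
    Stage 2: run (placeholder) motion optimization only on the top-k contacts."""
    norm = math.hypot(*goal)
    goal_dir = (goal[0] / norm, goal[1] / norm)
    ranked = sorted(candidate_contacts(),
                    key=lambda c: score_contact(c[1], goal_dir),
                    reverse=True)
    best = None
    for point, normal in ranked[:top_k]:
        # Placeholder for the costly step: here, cost is just the misalignment
        # of a straight-line push; a real planner would roll out push dynamics.
        cost = 1.0 - score_contact(normal, goal_dir)
        if best is None or cost < best[1]:
            best = ((point, normal), cost)
    return best

contact, cost = plan_push(goal=(1.0, 0.0))
```

For a goal straight along +x, the sketch selects the contact on the opposite side of the object (point near (-1, 0), push normal (1, 0)), since pushing through that point moves the disc toward the goal; the expensive stage only ever evaluates `top_k` of the 16 candidates.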
Date issued
2020
Journal
Proceedings - IEEE International Conference on Robotics and Automation
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Kloss, Alina, Bauza, Maria, Wu, Jiajun, Tenenbaum, Joshua B, Rodriguez, Alberto et al. 2020. "Accurate Vision-based Manipulation through Contact Reasoning." Proceedings - IEEE International Conference on Robotics and Automation.
Version: Author's final manuscript