DensePhysNet: Learning Dense Physical Object Representations Via Multi-Step Dynamic Interactions
Author(s)
Xu, Zhenjia; Wu, Jiajun; Zeng, Andy; Tenenbaum, Joshua; Song, Shuran
Accepted version (4.929 MB)
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
We study the problem of learning physical object representations for robot manipulation. Understanding object physics is critical for successful manipulation, but it is also challenging because physical object properties can rarely be inferred from an object's static appearance. In this paper, we propose DensePhysNet, a system that actively executes a sequence of dynamic interactions (e.g., sliding and colliding) and uses a deep predictive model over its visual observations to learn dense, pixel-wise representations that reflect the physical properties of the observed objects. Our experiments in both simulated and real settings demonstrate that the learned representations carry rich physical information and can be used directly to decode physical object properties such as friction and mass. The use of dense representations enables DensePhysNet to generalize well to novel scenes containing more objects than seen during training. With its knowledge of object physics, the learned representation also leads to more accurate and efficient manipulation in downstream tasks than the state of the art.
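To make the abstract's pipeline concrete, the following minimal PyTorch sketch illustrates the general pattern it describes: a fully convolutional encoder produces a dense per-pixel feature map, a forward model predicts post-interaction features from an action, and a small probe decodes physical properties from pooled features. This is not the authors' implementation; all module names, shapes, and the training loss here are illustrative assumptions.

```python
# Hypothetical sketch of the idea in the abstract, NOT the DensePhysNet code:
# dense features + action-conditioned prediction + a property-decoding probe.
import torch
import torch.nn as nn

class DenseEncoder(nn.Module):
    """Maps an RGB image to a per-pixel feature map (stride 1 for simplicity)."""
    def __init__(self, feat_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 1),
        )

    def forward(self, img):            # img: (B, 3, H, W)
        return self.net(img)           # (B, feat_dim, H, W)

class ForwardModel(nn.Module):
    """Predicts the next feature map from the current one plus an action."""
    def __init__(self, feat_dim: int = 16, act_dim: int = 4):
        super().__init__()
        self.net = nn.Conv2d(feat_dim + act_dim, feat_dim, 3, padding=1)

    def forward(self, feats, action):  # feats: (B, C, H, W), action: (B, act_dim)
        B, _, H, W = feats.shape
        a = action.view(B, -1, 1, 1).expand(B, action.shape[1], H, W)
        return self.net(torch.cat([feats, a], dim=1))

encoder, dynamics = DenseEncoder(), ForwardModel()
probe = nn.Linear(16, 2)               # decodes, e.g., (friction, mass)

obs_t = torch.rand(1, 3, 64, 64)       # observation before the interaction
obs_t1 = torch.rand(1, 3, 64, 64)      # observation after the interaction
action = torch.rand(1, 4)              # e.g., a parameterized slide or push

# Self-supervised signal: predicted features should match the encoding of
# the observation recorded after executing the interaction.
pred = dynamics(encoder(obs_t), action)
loss = nn.functional.mse_loss(pred, encoder(obs_t1))

# Physical properties are read out by pooling dense features over an object
# region and applying a small probe (global average pooling here for brevity).
props = probe(encoder(obs_t).mean(dim=(2, 3)))   # shape: (1, 2)
```

Because the features are pixel-wise rather than per-scene, the same probe can in principle be applied to any object region, which is the property the abstract credits for generalization to scenes with more objects.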
Date issued
2019
Department
Center for Brains, Minds, and Machines
Journal
Robotics: Science and Systems XV
Publisher
Robotics: Science and Systems Foundation
Citation
Xu, Zhenjia, Wu, Jiajun, Zeng, Andy, Tenenbaum, Joshua and Song, Shuran. 2019. "DensePhysNet: Learning Dense Physical Object Representations Via Multi-Step Dynamic Interactions." Robotics: Science and Systems XV.
Version: Author's final manuscript