Show simple item record

dc.contributor.author	Bauza, Maria
dc.contributor.author	Bronars, Antonia
dc.contributor.author	Rodriguez, Alberto
dc.date.accessioned	2024-06-12T21:46:50Z
dc.date.available	2024-06-12T21:46:50Z
dc.date.issued	2023-09-11
dc.identifier.issn	0278-3649
dc.identifier.issn	1741-3176
dc.identifier.uri	https://hdl.handle.net/1721.1/155264
dc.description.abstract	In this paper, we present Tac2Pose, an object-specific approach to tactile pose estimation from the first touch for known objects. Given the object geometry, we learn a tailored perception model in simulation that estimates a probability distribution over possible object poses given a tactile observation. To do so, we simulate the contact shapes that a dense set of object poses would produce on the sensor. Then, given a new contact shape obtained from the sensor, we match it against the pre-computed set using an object-specific embedding learned using contrastive learning. We obtain contact shapes from the sensor with an object-agnostic calibration step that maps RGB (red, green, blue) tactile observations to binary contact shapes. This mapping, which can be reused across object and sensor instances, is the only step trained with real sensor data. This results in a perception model that localizes objects from the first real tactile observation. Importantly, it produces pose distributions and can incorporate additional pose constraints coming from other perception systems, multiple contacts, or priors. We provide quantitative results for 20 objects. Tac2Pose provides high-accuracy pose estimates from distinctive tactile observations while regressing meaningful pose distributions to account for those contact shapes that could result from different object poses. We extend and test Tac2Pose in multi-contact scenarios where two tactile sensors are simultaneously in contact with the object, as during a grasp with a parallel jaw gripper. We further show that when the output pose distribution is filtered with a prior on the object pose, Tac2Pose is often able to improve significantly on the prior. This suggests synergistic use of Tac2Pose with additional sensing modalities (e.g., vision) even in cases where the tactile observation from a grasp is not sufficiently discriminative. Given a coarse estimate of an object's pose, even ambiguous contacts can be used to determine an object's pose precisely. We also test Tac2Pose on object models reconstructed from a 3D scanner, to evaluate the robustness to uncertainty in the object model. We show that even in the presence of model uncertainty, Tac2Pose is able to achieve fine accuracy comparable to when the object model is the manufacturer's CAD (computer-aided design) model. Finally, we demonstrate the advantages of Tac2Pose compared with three baseline methods for tactile pose estimation: directly regressing the object pose with a neural network, matching an observed contact to a set of possible contacts using a standard classification neural network, and direct pixel comparison of an observed contact with a set of possible contacts. Website: mcube.mit.edu/research/tac2pose.html	en_US
dc.language.iso	en
dc.publisher	SAGE Publications	en_US
dc.relation.isversionof	10.1177/02783649231196925	en_US
dc.rights	Creative Commons Attribution-Noncommercial	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc/4.0/	en_US
dc.source	SAGE Publications	en_US
dc.title	Tac2Pose: Tactile object pose estimation from the first touch	en_US
dc.type	Article	en_US
dc.identifier.citation	Bauza M, Bronars A, Rodriguez A. Tac2Pose: Tactile object pose estimation from the first touch. The International Journal of Robotics Research. 2023;42(13):1185-1209.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.relation.journal	The International Journal of Robotics Research	en_US
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dc.date.updated	2024-06-12T21:39:10Z
dspace.orderedauthors	Bauza, M; Bronars, A; Rodriguez, A	en_US
dspace.date.submission	2024-06-12T21:39:15Z
mit.journal.volume	42	en_US
mit.journal.issue	13	en_US
mit.license	PUBLISHER_CC
mit.metadata.status	Authority Work and Publication Information Needed	en_US

