
dc.contributor.author	Yuan, Wenzhen
dc.contributor.author	Srinivasan, Mandayam A
dc.contributor.author	Adelson, Edward H
dc.date.accessioned	2017-10-27T16:10:11Z
dc.date.available	2017-10-27T16:10:11Z
dc.date.issued	2016-12
dc.date.submitted	2016-10
dc.identifier.isbn	978-1-5090-3762-9
dc.identifier.issn	2153-0866
dc.identifier.uri	http://hdl.handle.net/1721.1/111989
dc.description.abstract	Hardness sensing is a valuable capability for a robot touch sensor. We describe a novel method of hardness sensing that does not require accurate control of contact conditions. A GelSight sensor is a tactile sensor that provides high-resolution tactile images, which enable a robot to infer object properties such as geometry and fine texture, as well as contact force and slip conditions. The sensor is pressed on silicone samples by a human or a robot, and we measure the sample hardness using only data from the sensor, without a separate force sensor and without precise knowledge of the contact trajectory. We describe the features that indicate object hardness. For hemispherical objects, we develop a model to estimate the sample hardness; the estimation error is about 4% over the range of 8 Shore 00 to 45 Shore A. With this technology, a robot can more easily infer the hardness of touched objects, thereby improving both its object recognition and its manipulation strategy.	en_US
dc.publisher	Institute of Electrical and Electronics Engineers (IEEE)	en_US
dc.relation.isversionof	http://dx.doi.org/10.1109/IROS.2016.7759057	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	MIT Web Domain	en_US
dc.title	Estimating object hardness with a GelSight touch sensor	en_US
dc.type	Article	en_US
dc.identifier.citation	Yuan, Wenzhen et al. “Estimating Object Hardness with a GelSight Touch Sensor.” 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 9-14 2016, Daejeon, South Korea, Institute of Electrical and Electronics Engineers (IEEE), December 2016 © 2016 Institute of Electrical and Electronics Engineers (IEEE)	en_US
dc.contributor.department	Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Mechanical Engineering	en_US
dc.contributor.department	Massachusetts Institute of Technology. Laboratory for Human and Machine Haptics	en_US
dc.contributor.mitauthor	Yuan, Wenzhen
dc.contributor.mitauthor	Srinivasan, Mandayam A
dc.contributor.mitauthor	Adelson, Edward H
dc.relation.journal	2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
eprint.status	http://purl.org/eprint/status/NonPeerReviewed	en_US
dc.date.updated	2017-10-25T16:46:03Z
dspace.orderedauthors	Yuan, Wenzhen; Srinivasan, Mandayam A.; Adelson, Edward H.	en_US
dspace.embargo.terms	N	en_US
dc.identifier.orcid	https://orcid.org/0000-0001-8014-356X
dc.identifier.orcid	https://orcid.org/0000-0003-1347-6502
dc.identifier.orcid	https://orcid.org/0000-0003-2222-6775
mit.license	OPEN_ACCESS_POLICY	en_US
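
The abstract above describes estimating hardness directly from a pressing sequence of GelSight tactile images, without a separate force sensor. The sketch below is purely illustrative of that general idea and is not the authors' model: the feature choices (contact area and peak intensity change), the threshold value, and the slope-based hardness proxy are all assumptions made for illustration.

import numpy as np

def contact_area(diff_frame, threshold=0.1):
    """Rough contact area: number of pixels whose intensity change from the
    no-contact reference frame exceeds a threshold (threshold is assumed)."""
    return float(np.count_nonzero(diff_frame > threshold))

def peak_deformation(diff_frame):
    """Peak intensity change, used here as a crude proxy for indentation
    depth at the center of a hemispherical sample."""
    return float(diff_frame.max())

def hardness_proxy(diff_frames):
    """Fit how peak deformation grows with contact area over a press.
    Intuitively, a soft hemisphere flattens against the gel (large area,
    small peak) while a hard one indents it more sharply (small area,
    large peak), so the slope carries hardness-related information.
    A real system would map such a feature to Shore hardness with a
    calibrated regression; that mapping is not shown here."""
    areas = np.array([contact_area(f) for f in diff_frames])
    peaks = np.array([peak_deformation(f) for f in diff_frames])
    slope, _intercept = np.polyfit(areas, peaks, 1)
    return slope

if __name__ == "__main__":
    # Synthetic pressing sequence: difference images whose contact region
    # and intensity grow over time (a stand-in for real GelSight frames).
    rng = np.random.default_rng(0)
    h, w = 240, 320
    yy, xx = np.mgrid[0:h, 0:w]
    frames = []
    for t in range(1, 11):
        radius = 10 * t
        bump = np.exp(-(((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (2 * radius ** 2)))
        frames.append(0.05 * t * bump + 0.01 * rng.random((h, w)))
    print("hardness-related feature (slope):", hardness_proxy(frames))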

