Show simple item record

dc.contributor.author: Yuan, Wenzhen
dc.contributor.author: Mo, Yuchen
dc.contributor.author: Wang, Shaoxiong
dc.contributor.author: Adelson, Edward H
dc.date.accessioned: 2020-08-19T20:57:44Z
dc.date.available: 2020-08-19T20:57:44Z
dc.date.issued: 2018-09
dc.date.submitted: 2018-05
dc.identifier.isbn: 9781538630815
dc.identifier.issn: 2577-087X
dc.identifier.uri: https://hdl.handle.net/1721.1/126688
dc.description.abstract: Humans represent and discriminate objects in the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that autonomously perceives object properties through touch, working with the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor; it then recognizes 11 properties of the clothing from the tactile data. These properties include physical properties, such as thickness, fuzziness, softness, and durability, and semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and conduct 6616 robot exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for an active tactile perception system that combines vision and touch, and it has the potential to enable robots to help humans with varied clothing-related housework.
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof: http://dx.doi.org/10.1109/icra.2018.8461164
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Active Clothing Material Perception Using Tactile Sensing and Deep Learning
dc.type: Article
dc.identifier.citation: Yuan, Wenzhen et al. "Active Clothing Material Perception Using Tactile Sensing and Deep Learning." IEEE International Conference on Robotics and Automation (ICRA), May 2018, Brisbane, Australia, Institute of Electrical and Electronics Engineers, September 2018. © 2018 IEEE
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Computer Science
dc.relation.journal: IEEE International Conference on Robotics and Automation (ICRA)
dc.eprint.version: Original manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2019-09-27T17:05:01Z
dspace.date.submission: 2019-09-27T17:05:07Z
mit.metadata.status: Complete

