Show simple item record

dc.contributor.author: She, Yu
dc.contributor.author: Liu, Sandra Q.
dc.contributor.author: Yu, Peiyu
dc.contributor.author: Adelson, Edward
dc.date.accessioned: 2021-11-03T18:25:44Z
dc.date.available: 2021-11-03T18:25:44Z
dc.date.issued: 2020-05
dc.identifier.uri: https://hdl.handle.net/1721.1/137294
dc.description.abstract: © 2020 IEEE. Soft robots offer significant advantages in adaptability, safety, and dexterity compared to conventional rigid-body robots. However, it is challenging to equip soft robots with accurate proprioception and tactile sensing due to their high flexibility and elasticity. In this work, we describe the development of a vision-based proprioceptive and tactile sensor for soft robots called GelFlex, which is inspired by previous GelSight sensing techniques. More specifically, we develop a novel exoskeleton-covered soft finger with embedded cameras and deep learning methods that enable high-resolution proprioceptive sensing and rich tactile sensing. To do so, we design features along the axial direction of the finger, which enable high-resolution proprioceptive sensing, and incorporate a reflective ink coating on the surface of the finger to enable rich tactile sensing. We design a highly underactuated exoskeleton with a tendon-driven mechanism to actuate the finger. Finally, we assemble two of the fingers together to form a robotic gripper and successfully perform a bar stock classification task, which requires both shape and tactile information. We train neural networks for proprioception and shape (box versus cylinder) classification using data from the embedded sensors. The proprioception CNN had over 99% accuracy on our testing set (all six joint angles were within 1° of error) and had an average accumulative distance error of 0.77 mm during live testing, which is better than human finger proprioception. These proposed techniques offer soft robots the high-level ability to simultaneously perceive their proprioceptive state and peripheral environment, providing potential solutions for soft robots to solve everyday manipulation tasks. We believe the methods developed in this work can be widely applied to different designs and applications.
dc.language.iso: en
dc.publisher: IEEE
dc.relation.isversionof: 10.1109/icra40945.2020.9197369
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Exoskeleton-covered soft finger with vision-based proprioception and tactile sensing
dc.type: Article
dc.identifier.citation: She, Yu, Liu, Sandra Q., Yu, Peiyu and Adelson, Edward. 2020. "Exoskeleton-covered soft finger with vision-based proprioception and tactile sensing." Proceedings - IEEE International Conference on Robotics and Automation.
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.contributor.department: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
dc.relation.journal: Proceedings - IEEE International Conference on Robotics and Automation
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2021-04-02T12:27:46Z
dspace.orderedauthors: She, Y; Liu, SQ; Yu, P; Adelson, E
dspace.date.submission: 2021-04-02T12:27:49Z
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Authority Work and Publication Information Needed

