Show simple item record

dc.contributor.advisor: Edward H. Adelson
dc.contributor.author: Liu, Sandra Q.
dc.contributor.other: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.date.accessioned: 2020-09-03T17:48:27Z
dc.date.available: 2020-09-03T17:48:27Z
dc.date.copyright: 2020
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/1721.1/127131
dc.description: Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, May, 2020
dc.description: Cataloged from the official PDF of thesis.
dc.description: Includes bibliographical references (pages 69-72).
dc.description.abstract: Over the past decade, the development of soft robots has progressed significantly. Today, soft robots have a variety of uses in fields ranging from surgical robotics to prostheses to human-robot interaction. These robots are more versatile, adaptable, safe, robust, and dexterous than their conventional rigid-body counterparts. However, due to their high dimensionality and flexibility, they still lack a quintessential human ability: the ability to accurately perceive themselves and the environment around them. To maximize their effectiveness, soft robots should be equipped with both proprioception and exteroception that can capture this intricate high dimensionality. In this thesis, an embedded vision-based sensor, which can capture richly detailed information, is used to provide proprioception and tactile sensing concurrently. Three proprioceptive methods are implemented: dot pose tracking, a lookup table, and deep learning.
dc.description.abstract: Although dot pose tracking (0.54 mm average RMSE) and the lookup table (0.91 mm accumulative distance error) both yield accurate proprioception, they are impractical to implement and easily influenced by outside parameters. As such, the deep learning method for soft finger proprioception was implemented on the GelFlex, a novel, highly underactuated, exoskeleton-covered soft finger with embedded cameras. The GelFlex can perform both proprioception and tactile sensing, and upon assembly into a two-finger robotic gripper, it successfully performed a bar stock classification task, which requires both types of sensing. The proprioception CNN was extremely accurate on the testing set (99% accuracy, with all angles within 1° error) and had an average accumulative distance error of 0.77 mm during live testing, which is better than human finger proprioception (8.0 cm ± 1.0 cm error).
dc.description.abstract: Overall, these techniques allow soft robots to perceive their own shape and the surrounding environment, enabling them to potentially solve various everyday manipulation tasks.
dc.description.statementofresponsibility: by Sandra Q. Liu
dc.format.extent: 72 pages
dc.language.iso: eng
dc.publisher: Massachusetts Institute of Technology
dc.rights: MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided.
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582
dc.subject: Mechanical Engineering
dc.title: Vision-based proprioception of a soft robotic finger with tactile sensing
dc.type: Thesis
dc.description.degree: S.M.
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.identifier.oclc: 1191837351
dc.description.collection: S.M. Massachusetts Institute of Technology, Department of Mechanical Engineering
dspace.imported: 2020-09-03T17:48:26Z
mit.thesis.degree: Master
mit.thesis.department: MechE


