dc.contributor.advisor | Edward H. Adelson. | en_US |
dc.contributor.author | Liu, Sandra Q. | en_US |
dc.contributor.other | Massachusetts Institute of Technology. Department of Mechanical Engineering. | en_US |
dc.date.accessioned | 2020-09-03T17:48:27Z | |
dc.date.available | 2020-09-03T17:48:27Z | |
dc.date.copyright | 2020 | en_US |
dc.date.issued | 2020 | en_US |
dc.identifier.uri | https://hdl.handle.net/1721.1/127131 | |
dc.description | Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, May 2020 | en_US |
dc.description | Cataloged from the official PDF of thesis. | en_US |
dc.description | Includes bibliographical references (pages 69-72). | en_US |
dc.description.abstract | Over the past decade, the development of soft robots has progressed significantly. Today, soft robots are used in a variety of fields, ranging from surgical robotics to prostheses to human-robot interaction. These robots are more versatile, adaptable, safe, robust, and dexterous than their conventional rigid-body counterparts. However, due to their high dimensionality and flexibility, they still lack a quintessential human ability: the ability to accurately perceive themselves and the environment around them. To maximize their effectiveness, soft robots should be equipped with both proprioception and exteroception that can capture this intricate high dimensionality. In this thesis, an embedded vision-based sensor, which can capture richly detailed information, is used to provide proprioception and tactile sensing concurrently. Three proprioceptive methods are implemented: dot pose tracking, a lookup table, and deep learning. | en_US |
dc.description.abstract | Although dot pose tracking (average 0.54 mm RMSE) and the lookup table (0.91 mm accumulative distance error) both yield accurate proprioception, they are impractical to implement and easily influenced by outside parameters. As such, the deep learning method for soft finger proprioception was implemented for the GelFlex, a novel, highly underactuated, exoskeleton-covered soft finger with embedded cameras. The GelFlex can perform both proprioception and tactile sensing and, upon assembly into a two-finger robotic gripper, was able to successfully perform a bar stock classification task, which requires both types of sensing. The proprioception CNN was extremely accurate on the testing set (99% accuracy, with all angles within 1° error) and had an average accumulative distance error of 0.77 mm during live testing, which is better than human finger proprioception (8.0 cm ± 1.0 cm error). | en_US |
dc.description.abstract | Overall, these techniques allow soft robots to perceive their own shape and the surrounding environment, enabling them to potentially solve various everyday manipulation tasks. | en_US |
dc.description.statementofresponsibility | by Sandra Q. Liu. | en_US |
dc.format.extent | 72 pages | en_US |
dc.language.iso | eng | en_US |
dc.publisher | Massachusetts Institute of Technology | en_US |
dc.rights | MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. | en_US |
dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
dc.subject | Mechanical Engineering. | en_US |
dc.title | Vision-based proprioception of a soft robotic finger with tactile sensing | en_US |
dc.type | Thesis | en_US |
dc.description.degree | S.M. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Mechanical Engineering | en_US |
dc.identifier.oclc | 1191837351 | en_US |
dc.description.collection | S.M. Massachusetts Institute of Technology, Department of Mechanical Engineering | en_US |
dspace.imported | 2020-09-03T17:48:26Z | en_US |
mit.thesis.degree | Master | en_US |
mit.thesis.department | MechE | en_US |