Vision-based proprioception of a soft robotic finger with tactile sensing

Author(s)
Liu, Sandra Q.
Download: 1191837351-MIT.pdf (6.938 MB)
Other Contributors
Massachusetts Institute of Technology. Department of Mechanical Engineering.
Advisor
Edward H. Adelson.
Terms of use
MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, available at http://dspace.mit.edu/handle/1721.1/7582
Abstract
Over the past decade, the development of soft robots has progressed significantly. Today, soft robots are used in a variety of fields, ranging from surgical robotics to prostheses to human-robot interaction. These robots are more versatile, adaptable, safe, robust, and dexterous than their conventional rigid-body counterparts. However, because of their high dimensionality and flexibility, they still lack a quintessential human ability: the ability to accurately perceive themselves and the environment around them. To maximize their effectiveness, soft robots should be equipped with both proprioception and exteroception that can capture this intricate, high-dimensional state. In this thesis, an embedded vision-based sensor, which can capture richly detailed information, is used to provide proprioception and tactile sensing concurrently. Three proprioceptive methods are implemented: dot pose tracking, a lookup table, and deep learning.
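For readers unfamiliar with marker-based proprioception, the sketch below illustrates the general idea behind dot tracking with an embedded camera: detect marker dots in each frame and return their centroids, which a pose model can then map to finger shape. This is an illustrative sketch assuming OpenCV, not the thesis implementation; the function name, thresholding choices, and synthetic test frame are placeholders for demonstration.

```python
# Illustrative dot-tracking sketch (not the thesis code): find dark marker
# dots in an internal camera frame and return their centroids, which a
# downstream proprioception model could map to finger pose.
import cv2
import numpy as np

def track_dots(frame_bgr: np.ndarray) -> np.ndarray:
    """Return an (N, 2) array of dot centroids detected in a camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Assume dots are darker than the surrounding gel; invert and threshold.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate contours
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return np.array(centroids)

if __name__ == "__main__":
    # Synthetic test frame: three dark dots on a light background.
    frame = np.full((120, 160, 3), 220, np.uint8)
    for x in (40, 80, 120):
        cv2.circle(frame, (x, 60), 5, (30, 30, 30), -1)
    print(track_dots(frame))
```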
 
Although dot pose tracking (0.54 mm average RMSE) and the lookup table (0.91 mm accumulative distance error) both yield accurate proprioception, they are impractical to implement and easily influenced by outside parameters. The deep learning method for soft finger proprioception was therefore implemented for the GelFlex, a novel, highly underactuated, exoskeleton-covered soft finger with embedded cameras. The GelFlex can perform both proprioception and tactile sensing and, once assembled into a two-finger robotic gripper, successfully performed a bar stock classification task that requires both types of sensing. The proprioception CNN was extremely accurate on the testing set (99% accuracy, with all angles within 1° of error) and had an average accumulative distance error of 0.77 mm during live testing, which is better than human finger proprioception (8.0 cm ± 1.0 cm error).
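As a rough illustration of the deep-learning approach described above, the sketch below shows a small convolutional network that regresses joint angles from a single embedded-camera image. It is a minimal sketch assuming PyTorch; the architecture, input resolution, and number of joints are assumptions and do not reflect the actual GelFlex network.

```python
# Illustrative sketch (not the GelFlex code): a small CNN that maps one
# embedded-camera frame to a vector of predicted joint angles.
import torch
import torch.nn as nn

class ProprioceptionCNN(nn.Module):
    def __init__(self, num_joints: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 64, 1, 1)
        )
        self.head = nn.Linear(64, num_joints)  # predicted joint angles

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    model = ProprioceptionCNN(num_joints=4)
    frame = torch.randn(1, 3, 240, 320)        # one camera frame (B, C, H, W)
    angles = model(frame)                      # (1, 4) predicted angles
    loss = nn.functional.mse_loss(angles, torch.zeros_like(angles))
    print(angles.shape, loss.item())
```

Training such a regressor requires paired camera frames and ground-truth angles (e.g., from a motion-capture or marker-based reference), with a standard regression loss as above.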
 
Overall, these techniques allow soft robots to perceive their own shape and the surrounding environment, enabling them to potentially solve various everyday manipulation tasks.
 
Description
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, May 2020
 
Cataloged from the official PDF of the thesis.
 
Includes bibliographical references (pages 69-72).
 
Date issued
2020
URI
https://hdl.handle.net/1721.1/127131
Department
Massachusetts Institute of Technology. Department of Mechanical Engineering
Publisher
Massachusetts Institute of Technology
Keywords
Mechanical Engineering.

Collections
  • Graduate Theses
