Show simple item record

dc.contributor.advisor: Agrawal, Pulkit
dc.contributor.author: Margolis, Gabriel B.
dc.date.accessioned: 2022-01-14T15:04:11Z
dc.date.available: 2022-01-14T15:04:11Z
dc.date.issued: 2021-06
dc.date.submitted: 2021-06-17T20:13:45.036Z
dc.identifier.uri: https://hdl.handle.net/1721.1/139325
dc.description.abstract: Today’s robotic quadruped systems can walk over a diverse set of natural and complex terrains. Approaches to locomotion based on model-based feedback control are robust to perturbations but cannot easily incorporate visual terrain information. Meanwhile, approaches to locomotion based on learning excel at associating visual sensory data with suitable control policies but often fail to generalize across the gap between simulation and deployment settings. This thesis proposes a trajectory-based abstraction for locomotion through which model-free and model-based control layers interface. This approach enables general visually guided locomotion while preserving robustness. We demonstrate that our proposed architecture allows the Mini Cheetah quadruped to match theoretical performance limits in a set of visual tasks. The robustness and practicality afforded by our approach are demonstrated through evaluation on hardware.
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright MIT
dc.rights.uri: http://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Learning Robust Terrain-Aware Locomotion
dc.type: Thesis
dc.description.degree: M.Eng.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Master
thesis.degree.name: Master of Engineering in Electrical Engineering and Computer Science

