
dc.contributor.advisor: Agrawal, Pulkit
dc.contributor.author: Chen, Eric
dc.date.accessioned: 2022-02-07T15:29:34Z
dc.date.available: 2022-02-07T15:29:34Z
dc.date.issued: 2021-09
dc.date.submitted: 2021-11-03T19:25:30.687Z
dc.identifier.uri: https://hdl.handle.net/1721.1/140193
dc.description.abstract: Intrinsic reward-based exploration methods have successfully solved challenging sparse reward tasks such as Montezuma's Revenge. However, these methods have not been widely adopted in reinforcement learning due to inconsistent performance gains across tasks. To better understand the underlying cause of this variability, we evaluate three major families of exploration methods (prediction error, state visitation, and model uncertainty) on a suite of custom environments and video games. Our custom environments allow us to study the effect of different environmental features in isolation. Our results reveal that exploration methods can be biased by spurious features such as color, and prioritize different dynamics in specific environments. In particular, we find that prediction-based methods are superior at solving tasks involving controllable dynamics. Furthermore, we find that partial observability can hinder exploration by setting up "curiosity traps" that agents can fall into. Finally, we investigate how various implementation details such as reward design and generation affect an agent's overall performance.
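The prediction-error family of bonuses described in the abstract can be illustrated with a minimal sketch. This is not the thesis's implementation; it is a generic curiosity-style bonus under simplifying assumptions (a linear forward model trained online, with the squared prediction error used as the intrinsic reward). The class name `PredictionErrorBonus` and the learning-rate choice are hypothetical.

```python
import numpy as np

class PredictionErrorBonus:
    """Hypothetical prediction-error intrinsic reward (a sketch, not the
    thesis's method): a linear forward model predicts the next state from
    (state, action), and its squared error is paid out as a bonus."""

    def __init__(self, state_dim, action_dim, lr=0.01):
        # Linear forward model: next_state ≈ W @ [state; action]
        self.W = np.zeros((state_dim, state_dim + action_dim))
        self.lr = lr

    def bonus(self, state, action, next_state):
        x = np.concatenate([state, action])
        error = next_state - self.W @ x
        # Intrinsic reward: squared prediction error (novel transitions
        # are poorly predicted, so they earn a larger bonus).
        r_int = float(error @ error)
        # One online gradient step, so repeated transitions become
        # familiar and their bonus decays toward zero.
        self.W += self.lr * np.outer(error, x)
        return r_int

# Usage: the bonus for one fixed transition shrinks as it is revisited.
b = PredictionErrorBonus(state_dim=2, action_dim=1)
s, a, ns = np.array([1.0, 0.0]), np.array([1.0]), np.array([0.5, 0.5])
first = b.bonus(s, a, ns)
for _ in range(200):
    last = b.bonus(s, a, ns)
```

The decay of the bonus on familiar transitions is exactly the property the abstract probes: it drives the agent toward unvisited dynamics, but (as the abstract notes) it can also latch onto spurious features or unpredictable "curiosity traps" that never become familiar.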
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright MIT
dc.rights.uri: http://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Understanding Bonus-Based Exploration in Reinforcement Learning
dc.type: Thesis
dc.description.degree: M.Eng.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Master
thesis.degree.name: Master of Engineering in Electrical Engineering and Computer Science

