Efficient POMDP Forward Search by Predicting the Posterior Belief Distribution
Author(s)
Roy, Nicholas; He, Ruijie
Other Contributors
Robotics, Vision & Sensor Networks
Advisor
Nicholas Roy
Abstract
Online forward-search techniques have demonstrated promising results for solving problems in partially observable environments. These techniques depend on the ability to efficiently search and evaluate the set of beliefs reachable from the current belief. However, enumerating or sampling action-observation sequences to compute the reachable beliefs is computationally demanding; coupled with the need to satisfy real-time constraints, this cost limits existing online solvers to searching only to a shallow depth. In this paper, we propose that policies can be generated directly from the distribution of the agent's posterior beliefs. When the underlying state distribution is Gaussian and the observation function is an exponential family distribution, we can calculate this distribution of beliefs without enumerating the possible observations. This property not only enables us to plan in problems with large observation spaces, but also allows us to search deeper by considering policies composed of multi-step action sequences. We present the Posterior Belief Distribution (PBD) algorithm, an efficient forward-search POMDP planner for continuous domains, and demonstrate that deeper forward search yields better policies.
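A minimal sketch of the closed-form property the abstract relies on, in the simplest special case (not the paper's full PBD algorithm): when the belief is Gaussian and the observation model is linear-Gaussian, the posterior belief parameters follow in closed form from the Kalman measurement update, so no enumeration or sampling of observations is required. All variable names here are illustrative assumptions.

```python
import numpy as np

def gaussian_belief_update(mu, Sigma, H, R, z):
    """Closed-form posterior N(mu', Sigma') after observing z,
    under the linear-Gaussian model z = H x + v, v ~ N(0, R)."""
    S = H @ Sigma @ H.T + R              # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)   # Kalman gain
    mu_post = mu + K @ (z - H @ mu)
    Sigma_post = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_post, Sigma_post

# Illustrative example: 2-D state, only the first component is observed.
mu = np.zeros(2)
Sigma = np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
z = np.array([0.3])
mu_p, Sigma_p = gaussian_belief_update(mu, Sigma, H, R, z)
```

Note that `Sigma_post` depends only on `H`, `R`, and the prior covariance, not on the observed value `z`. This observation-independence of the posterior covariance is what allows a forward-search planner to characterize the distribution of reachable posterior beliefs before any observation is received.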
Date issued
2009-09-23

Series/Report no.
MIT-CSAIL-TR-2009-044