6.231 Dynamic Programming and Stochastic Control, Fall 2011
Author(s)
Bertsekas, Dimitri
Download: 6-231-fall-2011/contents/index.htm (33.98 KB)
Alternative title
Dynamic Programming and Stochastic Control
Abstract
The course covers the basic models and solution techniques for problems of sequential decision making under uncertainty (stochastic control). We will consider optimal control of a dynamical system over both a finite and an infinite number of stages. This includes systems with finite or infinite state spaces, as well as perfectly or imperfectly observed systems. We will also discuss approximation methods for problems involving large state spaces. Applications of dynamic programming in a variety of fields will be covered in recitations.
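As a rough illustration of the finite-horizon dynamic programming methods the abstract refers to, the following Python sketch runs a backward value recursion on a small, made-up finite-state problem. The horizon, state and control spaces, stage costs, and transition probabilities are all invented for illustration and are not drawn from the course materials.

```python
# Illustrative sketch only (not part of the course): backward dynamic
# programming for a finite-horizon, finite-state stochastic control problem.
import numpy as np

N = 5          # horizon length (number of stages)
n_states = 3   # finite state space {0, 1, 2}
n_actions = 2  # finite control space {0, 1}

# P[u][i, j] = probability of moving from state i to state j under control u
# (randomly generated here purely for the example).
rng = np.random.default_rng(0)
P = [rng.dirichlet(np.ones(n_states), size=n_states) for _ in range(n_actions)]

# g[i, u] = expected stage cost in state i under control u; gN = terminal cost.
g = rng.uniform(0.0, 1.0, size=(n_states, n_actions))
gN = np.zeros(n_states)

# Backward recursion: J_k(i) = min_u [ g(i, u) + sum_j P(j | i, u) * J_{k+1}(j) ].
J = gN.copy()
policy = np.zeros((N, n_states), dtype=int)
for k in reversed(range(N)):
    Q = np.stack([g[:, u] + P[u] @ J for u in range(n_actions)], axis=1)
    policy[k] = Q.argmin(axis=1)   # minimizing control at stage k, per state
    J = Q.min(axis=1)              # optimal cost-to-go at stage k

print("Optimal expected cost-to-go at stage 0:", J)
print("Optimal policy (stage x state):")
print(policy)
```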
Date issued
2011-12
Other identifiers
6.231-Fall2011
local: 6.231
local: IMSCP-MD5-790c6f8f173f8a939a6f849836a249c6
Keywords
dynamic programming, stochastic control, algorithms, finite-state, continuous-time, imperfect state information, suboptimal control, finite horizon, infinite horizon, discounted problems, stochastic shortest path, approximate dynamic programming