6.231 Dynamic Programming and Stochastic Control, Fall 2008
Author(s)
Bertsekas, Dimitri
Download: 6-231-fall-2008/contents/index.htm (28.29 KB)
Alternative title
Dynamic Programming and Stochastic Control
Abstract
This course covers the basic models and solution techniques for problems of sequential decision making under uncertainty (stochastic control). We will consider optimal control of a dynamical system over both a finite and an infinite number of stages (finite and infinite horizon). We will also discuss some approximation methods for problems involving large state spaces. Applications of dynamic programming in a variety of fields will be covered in recitations.
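As an illustrative sketch not quoted from the course materials, the finite-horizon dynamic programming recursion behind this description can be written in the standard notation of Bertsekas' textbook (the symbols x_k for the state, u_k for the control, w_k for the random disturbance, f_k for the system dynamics, g_k for the stage cost, and J_k for the cost-to-go are assumed names):

J_N(x_N) = g_N(x_N),
J_k(x_k) = \min_{u_k \in U_k(x_k)} \mathbb{E}_{w_k}\!\left[\, g_k(x_k, u_k, w_k) + J_{k+1}\big(f_k(x_k, u_k, w_k)\big) \,\right], \quad k = 0, 1, \ldots, N-1.

Under these assumptions, J_0(x_0) is the optimal expected cost from initial state x_0, and controls attaining the minimum at each stage define an optimal policy; the infinite-horizon and approximate methods mentioned above extend this basic recursion.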
Date issued
2008-12
Other identifiers
6.231-Fall2008
local: 6.231
local: IMSCP-MD5-5c25e9035021832542e5f35f56b312cc
Keywords
dynamic programming, stochastic control, decision making, uncertainty, sequential decision making, finite horizon, infinite horizon, approximation methods, state space, large state space, optimal control, dynamical system, dynamic programming and optimal control, deterministic systems, shortest path, state information, rollout, stochastic shortest path, approximate dynamic programming