Collision Avoidance for Unmanned Aircraft using Markov Decision Processes
Author(s)
Temizer, Selim; Kochenderfer, Mykel J.; Kaelbling, Leslie P.; Lozano-Perez, Tomas; Kuchar, James K.
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Before unmanned aircraft can fly safely in civil airspace, robust airborne collision avoidance systems must be developed. Instead of hand-crafting a collision avoidance algorithm for every combination of sensor and aircraft configuration, we investigate the automatic generation of collision avoidance algorithms given models of aircraft dynamics, sensor performance, and intruder behavior. By formulating the problem of collision avoidance as a Markov Decision Process (MDP) for sensors that provide precise localization of the intruder aircraft, or a Partially Observable Markov Decision Process (POMDP) for sensors that have positional uncertainty or limited field-of-view constraints, generic MDP/POMDP solvers can be used to generate avoidance strategies that optimize a cost function balancing flight-plan deviation against collision risk. Experimental results demonstrate the suitability of such an approach using four different sensor modalities and a parametric aircraft performance model.
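To illustrate the kind of formulation the abstract describes, the following is a minimal sketch (not the authors' model) of a toy collision-avoidance MDP solved by value iteration. The state is the intruder's relative altitude in coarse bins, the actions are climb/level/descend, and the cost trades a large collision penalty against a small maneuver (flight-plan deviation) penalty. All state counts, costs, and the intruder drift model here are illustrative assumptions.

```python
# Toy collision-avoidance MDP solved with value iteration.
# Assumptions (not from the paper): 11 relative-altitude bins, the middle
# bin is a conflict, the intruder drifts one bin up/down/none with equal
# probability, and our action shifts the relative altitude the other way.

N_STATES = 11
ACTIONS = (-1, 0, +1)          # descend, stay level, climb
CONFLICT = N_STATES // 2       # co-altitude bin
COLLISION_COST = 100.0         # penalty for occupying the conflict bin
MANEUVER_COST = 1.0            # penalty for deviating from the flight plan
GAMMA = 0.95                   # discount factor

def step_distribution(s, a):
    """Return {next_state: probability} under intruder drift and action a."""
    dist = {}
    for drift in (-1, 0, 1):
        s2 = min(N_STATES - 1, max(0, s - a + drift))
        dist[s2] = dist.get(s2, 0.0) + 1.0 / 3.0
    return dist

def value_iteration(tol=1e-6):
    """Compute the optimal cost-to-go and greedy policy for the toy MDP."""
    V = [0.0] * N_STATES
    while True:
        new_V, policy = [], []
        for s in range(N_STATES):
            best_q, best_a = None, 0
            for a in ACTIONS:
                cost = (COLLISION_COST if s == CONFLICT else 0.0) \
                       + MANEUVER_COST * abs(a)
                q = cost + GAMMA * sum(p * V[s2]
                                       for s2, p in step_distribution(s, a).items())
                if best_q is None or q < best_q:
                    best_q, best_a = q, a
            new_V.append(best_q)
            policy.append(best_a)
        if max(abs(x - y) for x, y in zip(new_V, V)) < tol:
            return new_V, policy
        V = new_V

values, policy = value_iteration()
print(policy)  # actions maneuver away from the conflict bin
```

In the paper's setting, the same idea scales up: a generic solver consumes the dynamics, sensor, and intruder models and returns a policy, with the POMDP variant maintaining a belief over intruder position instead of a known state.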
Description
URL to conference site
Date issued
2010-08
Department
Lincoln Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
AIAA Guidance, Navigation, and Control Conference, 2-5 August 2010, Toronto, Ontario, Canada
Publisher
American Institute of Aeronautics and Astronautics
Citation
Temizer, Selim, et al. "Collision Avoidance for Unmanned Aircraft using Markov Decision Processes." AIAA Guidance, Navigation, and Control Conference, Toronto, Ontario, Aug. 2-5, 2010.
Version: Original manuscript
Other identifiers
AIAA 2010-8040