Collision Avoidance for Unmanned Aircraft using Markov Decision Processes
Author(s): Temizer, Selim; Kochenderfer, Mykel J.; Kaelbling, Leslie P.; Lozano-Perez, Tomas; Kuchar, James K.
Before unmanned aircraft can fly safely in civil airspace, robust airborne collision avoidance systems must be developed. Instead of hand-crafting a collision avoidance algorithm for every combination of sensor and aircraft configuration, we investigate the automatic generation of collision avoidance algorithms given models of aircraft dynamics, sensor performance, and intruder behavior. We formulate collision avoidance as a Markov Decision Process (MDP) for sensors that provide precise localization of the intruder aircraft, or as a Partially Observable Markov Decision Process (POMDP) for sensors with positional uncertainty or limited field-of-view constraints. Generic MDP/POMDP solvers can then generate avoidance strategies that optimize a cost function balancing flight-plan deviation against collision risk. Experimental results demonstrate the suitability of this approach using four different sensor modalities and a parametric aircraft performance model.
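The cost trade-off in the abstract — penalizing collision heavily while also penalizing deviation from the flight plan — can be illustrated with a toy, fully observable MDP solved by value iteration. This is a hypothetical sketch, not the paper's encounter model: the state is a discretized vertical offset from an intruder assumed to hold level flight, the actions, cost constants, and discount factor are all invented for illustration.

```python
# Toy collision-avoidance MDP (illustrative only; not the paper's model).
# State: vertical offset from the intruder, discretized to [-5, 5].
# Offset 0 means a collision; nonzero offsets incur a flight-plan
# deviation cost proportional to the offset magnitude.

COLLISION_COST = 1000.0   # assumed penalty for zero separation
DEVIATION_COST = 1.0      # assumed per-step cost of flight-plan deviation
GAMMA = 0.95              # assumed discount factor

OFFSETS = range(-5, 6)
ACTIONS = {"descend": -1, "level": 0, "climb": +1}

def clamp(s):
    """Keep the offset inside the discretized state space."""
    return max(min(s, 5), -5)

def cost(offset):
    """Immediate cost: huge for collision, small for deviation."""
    return COLLISION_COST if offset == 0 else DEVIATION_COST * abs(offset)

def value_iteration(iters=200):
    """Bellman backups of the minimization-form value function."""
    V = {s: 0.0 for s in OFFSETS}
    for _ in range(iters):
        V = {s: min(cost(s) + GAMMA * V[clamp(s + d)]
                    for d in ACTIONS.values())
             for s in OFFSETS}
    return V

def greedy_policy(V):
    """Pick, in each state, the action leading to the cheapest successor."""
    return {s: min(ACTIONS, key=lambda a: cost(s) + GAMMA * V[clamp(s + ACTIONS[a])])
            for s in OFFSETS}

V = value_iteration()
pi = greedy_policy(V)
```

The resulting policy shows the intended trade-off: at offset 0 it maneuvers away from the intruder, at a small safe offset (e.g. 1) it holds level rather than maneuver further, and at large offsets it descends or climbs back toward the flight plan. The paper's POMDP variant replaces the exact offset with a belief over it, but the same cost structure drives the avoidance strategy.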
Department: Lincoln Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
AIAA Guidance, Navigation, and Control Conference, 2-5 August 2010, Toronto, Ontario, Canada
American Institute of Aeronautics and Astronautics
Temizer, Selim, et al. "Collision Avoidance for Unmanned Aircraft using Markov Decision Processes." AIAA Guidance, Navigation, and Control Conference, Toronto, Ontario, 2-5 August 2010.