Air-Combat Strategy Using Approximate Dynamic Programming

Author(s)
McGrew, James S.; How, Jonathan P.; Bush, Lawrence; Williams, Brian Charles; Roy, Nicholas
Download: Roy_Air Combat.pdf (1.483 MB)

Open Access Policy

Terms of use
Creative Commons Attribution-Noncommercial-Share Alike 3.0
http://creativecommons.org/licenses/by-nc-sa/3.0/
Abstract
Unmanned Aircraft Systems (UAS) have the potential to perform many of the dangerous missions currently flown by manned aircraft. Yet the complexity of some tasks, such as air combat, has precluded UAS from successfully carrying out these missions autonomously. This paper presents a formulation of a level-flight, fixed-velocity, one-on-one air combat maneuvering problem and an approximate dynamic programming (ADP) approach for computing an efficient approximation of the optimal policy. In the version of the problem formulation considered, the aircraft learning the optimal policy is given a slight performance advantage. This ADP approach provides a fast response to a rapidly changing tactical situation, long planning horizons, and good performance without explicit coding of air combat tactics. The method's success is due to extensive feature development, reward shaping, and trajectory sampling. An accompanying fast and effective rollout-based policy extraction method is used to accomplish on-line implementation. Simulation results are provided that demonstrate the robustness of the method against an opponent beginning from both offensive and defensive situations. Flight results are also presented using micro-UAS flown at MIT's Real-time indoor Autonomous Vehicle test ENvironment (RAVEN).
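
A minimal sketch of the rollout-based policy extraction described in the abstract, written in Python. This is an illustration only, not the authors' implementation: the state representation, fixed-speed dynamics, shaped reward, and value estimate below are hypothetical stand-ins, whereas the paper uses a learned, feature-based value function and the full engagement geometry.

# Sketch of rollout-based policy extraction with an approximate value function.
# All dynamics, rewards, and parameters here are illustrative assumptions.
import math
import random

ACTIONS = ["left", "straight", "right"]        # simplified level-flight maneuvers

def step(state, action, dt=0.25):
    # Hypothetical fixed-velocity, level-flight dynamics; state = (x, y, heading).
    x, y, psi = state
    turn = {"left": math.radians(18), "straight": 0.0, "right": -math.radians(18)}[action]
    psi += turn * dt
    v = 2.5                                    # assumed constant speed (m/s)
    return (x + v * math.cos(psi) * dt, y + v * math.sin(psi) * dt, psi)

def reward(state, opponent):
    # Shaped reward favoring pointing at, and closing on, the opponent's position.
    dx, dy = opponent[0] - state[0], opponent[1] - state[1]
    rng = math.hypot(dx, dy)
    bearing = abs((math.atan2(dy, dx) - state[2] + math.pi) % (2 * math.pi) - math.pi)
    return -bearing - 0.1 * rng

def value_estimate(state, opponent):
    # Stand-in for the learned, feature-based approximate value function.
    return reward(state, opponent)

def rollout_policy(state, opponent, horizon=3, samples=5):
    # Score each maneuver by sampling short rollouts and adding the approximate
    # value at the rollout's end state; return the best-scoring maneuver.
    best_action, best_score = None, -float("inf")
    for action in ACTIONS:
        score = 0.0
        for _ in range(samples):
            s = step(state, action)
            for _ in range(horizon - 1):
                s = step(s, random.choice(ACTIONS))   # random continuation policy
            score += reward(s, opponent) + value_estimate(s, opponent)
        score /= samples
        if score > best_score:
            best_action, best_score = action, score
    return best_action

# Example: pick the next maneuver from a given engagement geometry
# (opponent given as an (x, y) position; its heading is ignored in this sketch).
print(rollout_policy(state=(0.0, 0.0, 0.0), opponent=(5.0, 2.0)))

Scoring each one-step maneuver by a sampled rollout plus an approximate terminal value is what lets an online controller respond quickly while still reflecting a longer planning horizon.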
Date issued
2010-09
URI
http://hdl.handle.net/1721.1/67298
Department
Massachusetts Institute of Technology. Aerospace Controls Laboratory; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Journal
Journal of Guidance, Control, and Dynamics
Publisher
American Institute of Aeronautics and Astronautics
Citation
McGrew, James S. et al. “Air-Combat Strategy Using Approximate Dynamic Programming.” Journal of Guidance, Control, and Dynamics 33 (2010): 1641-1654.
Version: Author's final manuscript
ISSN
0731-5090
1533-3884

Collections
  • MIT Open Access Articles
