Online Intelligent Motion Video Guidance for Unmanned Air System Ground Target Surveillance (Conference Paper)

abstract

  • © 2019, American Institute of Aeronautics and Astronautics, Inc., AIAA. All rights reserved. The need for an intelligent Small Unmanned Air System to assist in the surveillance and tracking of ground targets of interest has spurred the development of unconventional solutions. Tracking ground targets with a non-gimbaled, fixed camera is challenging since the vehicle itself must be steered to keep targets in view. A Reinforcement Learning agent for the autonomous tracking of ground targets was developed to learn a control policy that keeps a target within the camera image frame without user intervention. The agent uses the Q-Learning algorithm and functions as an outer-loop controller that commands bank angle increments to the autopilot. The static Q-Learning agent, pre-trained in a simulation environment, was flight tested and shown to successfully track randomly maneuvering ground targets in unstructured and unmodeled environments. This paper details an improved training approach that uses real flight data to improve upon the simulation-trained controller, the new system architecture, and the challenges of transitioning from pure simulation to learning in real time during operation, i.e., online. Performance of both algorithms is evaluated in terms of ground target tracking time. Based upon the flight test results presented in the paper, the online approach is shown to be effective for tracking ground targets and is judged to be a candidate for an Unmanned Air System ground target tracking system.
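
  The abstract describes a Q-Learning agent acting as an outer-loop controller that commands bank angle increments to the autopilot to keep a ground target in the camera frame. As a rough illustration of that control loop only, the sketch below shows a generic tabular one-step Q-Learning update with a discretized target-bearing state and a small set of candidate bank-angle increments; all names, the state/action discretization, the reward, and the hyperparameters are assumptions chosen for illustration and are not taken from the paper.

    # Minimal sketch of a tabular Q-Learning outer-loop bank-angle controller.
    # Assumptions (not from the paper): the state is the discretized relative
    # bearing of the target in the image, actions are a handful of bank-angle
    # increments, and the reward signals whether the target stayed in frame.
    import random
    from collections import defaultdict

    BANK_INCREMENTS_DEG = [-10.0, -5.0, 0.0, 5.0, 10.0]   # hypothetical action set
    N_STATES = 12                                          # hypothetical bearing bins
    ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1                 # illustrative hyperparameters

    q_table = defaultdict(lambda: [0.0] * len(BANK_INCREMENTS_DEG))

    def bearing_to_state(bearing_deg):
        """Map a relative bearing in [-180, 180) deg to a discrete state index."""
        return int((bearing_deg + 180.0) % 360.0 // (360.0 / N_STATES))

    def choose_action(state):
        """Epsilon-greedy selection of a bank-angle increment index."""
        if random.random() < EPSILON:
            return random.randrange(len(BANK_INCREMENTS_DEG))
        return max(range(len(BANK_INCREMENTS_DEG)), key=lambda a: q_table[state][a])

    def update(state, action, reward, next_state):
        """Standard one-step Q-Learning temporal-difference update."""
        td_target = reward + GAMMA * max(q_table[next_state])
        q_table[state][action] += ALPHA * (td_target - q_table[state][action])

    # One online step: observe the target bearing from the image, command a
    # bank-angle increment to the autopilot, then learn from the outcome.
    state = bearing_to_state(35.0)
    action = choose_action(state)
    commanded_increment_deg = BANK_INCREMENTS_DEG[action]
    reward = 1.0                       # hypothetical: target remained in frame
    next_state = bearing_to_state(20.0)
    update(state, action, reward, next_state)

  In an online setting such as the one the paper describes, this update would run during flight on each new observation rather than only in simulation.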

name of conference

  • AIAA Scitech 2019 Forum

published proceedings

  • AIAA Scitech 2019 Forum

author list (cited authors)

  • Valasek, J., Lehman, H., & Goecks, V. G.

citation count

  • 1

complete list of authors

  • Valasek, John; Lehman, Hannah; Goecks, Vinicius G.

publication date

  • January 2019