REAL-TIME MAPPING AND LOCALIZATION UNDER DYNAMIC LIGHTING FOR SMALL-BODY LANDINGS (Conference Paper)

abstract

  • Copyright © 2015 California Institute of Technology. Small-body landing missions present difficult challenges to Guidance, Navigation, and Control (GNC) systems. A typical mission profile is divided into a mapping phase and a terminal navigation phase. In the mapping phase, analysts on the ground process spacecraft sensor data to generate a geometric and visual model of the body. This model is used to select a landing site. During the terminal navigation phase, the spacecraft must then autonomously navigate itself to the landing site using the map. There are two major hurdles here. The first is that the visual appearance of the map changes as the sun direction changes in both the body frame and the sensor frame. The second is that smaller landing hazards become visible only as the spacecraft nears the surface and spatial resolution improves; they are therefore absent from the map made at greater standoff range. This paper presents a method to clear both of these hurdles. An algorithm is developed to sequentially estimate the full geometry and texture of the local terrain about the landing site. With this information and an estimate of the sensor-to-inertial and body-to-inertial poses available, the terrain is efficiently rendered under the actual lighting conditions and the estimated relative sensor pose. The rendered images are then compared to sensor images to update the pose estimate. Details of the map parameterization, rendering algorithm, pose estimation method, and filtering are presented. Laboratory experiments in a simulated scene with ground-truth data are used to validate the algorithm.
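The render-and-compare idea in the abstract can be illustrated with a minimal sketch: render the mapped terrain at candidate sensor poses, score each rendering against the sensor image, and keep the best-matching pose as the update. The sketch below is an assumption-laden toy, not the paper's method: the "terrain" is a flat 2-D texture, the "pose" is an integer image offset, "rendering" is a crop (the paper's lighting-aware renderer is replaced by this stand-in), and the comparison is a brute-force sum-of-squared-differences search. The function names `render` and `pose_update` are hypothetical.

```python
import numpy as np

def render(terrain, pose):
    # Toy stand-in for the paper's lighting-aware renderer: "render" the
    # terrain as a 32x32 view taken at an integer 2-D sensor offset (pose).
    r, c = pose
    return terrain[r:r + 32, c:c + 32]

def pose_update(terrain, sensor_image, pose_guess, radius=3):
    # Render-and-compare pose update: search poses near the guess and keep
    # the one whose rendered image best matches the sensor image under a
    # sum-of-squared-differences (SSD) photometric error.
    best_pose, best_err = pose_guess, np.inf
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            cand = (pose_guess[0] + dr, pose_guess[1] + dc)
            err = np.sum((render(terrain, cand) - sensor_image) ** 2)
            if err < best_err:
                best_pose, best_err = cand, err
    return best_pose

rng = np.random.default_rng(0)
terrain = rng.random((64, 64))             # mapped terrain texture (synthetic)
true_pose = (12, 20)
sensor_image = render(terrain, true_pose)  # noiseless simulated sensor frame
print(pose_update(terrain, sensor_image, pose_guess=(10, 18)))  # → (12, 20)
```

In the paper's setting the search over discrete offsets would be replaced by an optimization over the full 6-DOF relative pose, and the SSD score by a comparison robust to the rendered-versus-real appearance gap; the loop structure, however, is the same.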

author list (cited authors)

  • Conway, D., & Junkins, J. L.

publication date

  • January 2015