Vision Based Collaborative Localization for Multirotor Vehicles (Conference Paper)

abstract

  • © 2016 IEEE. We present a framework for vision-based localization of two or more multirotor aerial vehicles relative to each other. This collaborative localization technique is built upon a relative pose estimation strategy between two or more cameras that can estimate accurate metric poses between each other even through fast motion and continually changing environments. Through synchronized feature detection and tracking with a robust outlier rejection process, classical multiple-view geometry concepts are used to obtain scale-ambiguous relative poses, which are then refined through reconstruction and pose optimization to yield a metric estimate. We also present the implementation details of this technique, followed by results evaluating the accuracy of the pose estimates in both simulated and real experiments. Test cases include keeping one camera stationary while the other is mounted on a quadrotor that is flown through various types of trajectories. We also perform a quantitative comparison with a GPS/IMU localization technique to demonstrate the accuracy of our method.
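
  The core geometric step described in the abstract, obtaining a scale-ambiguous relative pose between two synchronized camera views via feature matching, RANSAC outlier rejection, and essential matrix decomposition, can be sketched roughly as below. This is a minimal, generic illustration using OpenCV and is not the authors' implementation: the function name estimate_relative_pose, the choice of ORB features, the RANSAC parameters, and the calibration matrix K are assumptions, and the later reconstruction and pose optimization that recovers metric scale is not shown.

  ```python
  import cv2
  import numpy as np

  def estimate_relative_pose(img_a, img_b, K):
      """Sketch of scale-ambiguous relative pose estimation between two views.

      Detect and match features, reject outliers with RANSAC while fitting the
      essential matrix, then decompose it into a rotation R and a unit-norm
      translation t. Names and parameters are illustrative, not from the paper.
      """
      # Feature detection and description (ORB chosen here for illustration)
      orb = cv2.ORB_create(nfeatures=2000)
      kp_a, des_a = orb.detectAndCompute(img_a, None)
      kp_b, des_b = orb.detectAndCompute(img_b, None)

      # Brute-force Hamming matching with cross-checking for ORB descriptors
      matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
      matches = matcher.match(des_a, des_b)

      pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
      pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

      # RANSAC-based essential matrix fit rejects outlier correspondences
      E, inlier_mask = cv2.findEssentialMat(
          pts_a, pts_b, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)

      # Recover R and t; t is only known up to scale at this stage
      _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inlier_mask)
      return R, t, inlier_mask
  ```

  In a collaborative setting of the kind the paper describes, the returned translation direction would still need a metric scale, which the abstract attributes to a subsequent reconstruction and pose optimization stage.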

name of conference

  • 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

published proceedings

  • 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016)

author list (cited authors)

  • Vemprala, S., & Saripalli, S.

citation count

  • 8

complete list of authors

  • Vemprala, Sai; Saripalli, Srikanth

publication date

  • October 2016