Robustness to Lighting Variations: An RGB-D Indoor Visual Odometry Using Line Segments (Conference Paper)


  • © 2015 IEEE. Large lighting variation challenges all visual odometry methods, even those using RGB-D cameras. Here we propose a line segment-based RGB-D indoor odometry algorithm that is robust to lighting variation. Line segments are abundant indoors and less sensitive to lighting change than point features. However, depth data are often noisy, corrupted, or even missing for line segments, which frequently lie on object boundaries where significant depth discontinuities occur. Our algorithm samples depth data along each line segment and uses a random sample consensus (RANSAC) approach to identify correct depths and estimate the corresponding 3D line segment. We analyze 3D line segment uncertainties and estimate camera motion by minimizing the Mahalanobis distance. In experiments we compare our method with two state-of-the-art methods, a keypoint-based approach and a dense visual odometry algorithm, under both constant and varying lighting. Our method demonstrates superior robustness to lighting change, outperforming the competing methods on 6 of 8 long indoor sequences under varying lighting, and it also achieves improved accuracy under constant lighting when tested on public data.
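The abstract's core idea of rejecting corrupted depth readings along a line segment can be sketched with a small RANSAC routine. The following is an illustrative Python sketch, not the authors' implementation: all names, the sample count, and the 2 cm inlier tolerance are assumptions, and the demo data is synthetic, mimicking depth samples along a segment with two spurious readings at a depth discontinuity.

```python
import numpy as np

def ransac_line3d(points, iters=200, tol=0.02, seed=0):
    """Fit a 3D line to depth-derived points with RANSAC.
    Returns (point on line, unit direction, boolean inlier mask)."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = (points[0], np.array([1.0, 0.0, 0.0]))
    for _ in range(iters):
        # hypothesize a line from two random samples
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        # perpendicular distance of every sample to the candidate line
        diff = points - p
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (p, d)
    return best_model[0], best_model[1], best_inliers

# Hypothetical demo: 20 points sampled along a 3D segment (metres),
# with two samples corrupted by wrong depth at an object boundary.
t = np.linspace(0.0, 1.0, 20)
start = np.array([0.1, -0.2, 1.0])
direction = np.array([0.5, 0.0, 0.5])
pts = start + t[:, None] * direction
pts[3, 2] += 1.5   # spurious far depth reading
pts[11, 2] -= 0.6  # spurious near depth reading
_, _, mask = ransac_line3d(pts)
print("inliers:", int(mask.sum()), "of", len(pts))  # → inliers: 18 of 20
```

The consensus set excludes the two corrupted samples, so the fitted 3D line is determined only by depths consistent with the segment's surface, which is the behavior the abstract describes.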

name of conference

  • 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

author list (cited authors)

  • Lu, Y., & Song, D.

citation count

  • 22

complete list of authors

  • Lu, Yan||Song, Dezhen

publication date

  • January 2015