A Real-Time Human Action Recognition System Using Depth and Inertial Sensor Fusion Academic Article uri icon

abstract

  • This paper presents a human action recognition system that runs in real time and simultaneously uses a depth camera and an inertial sensor, based on a previously developed sensor fusion method. Computationally efficient depth image features and inertial signal features are fed into two computationally efficient collaborative representation classifiers, and a decision-level fusion is then performed. The developed real-time system is evaluated on a publicly available multimodal human action recognition dataset covering a comprehensive set of human actions. The overall classification rate of the developed real-time system is shown to be >97%, which is at least 9% higher than when either sensing modality is used individually. Results from both offline and real-time experiments demonstrate the effectiveness of the system and its real-time throughput.
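The abstract mentions decision-level fusion of the two modality classifiers' outputs. As a minimal sketch of what such a step can look like, the snippet below merges per-class scores with a weighted sum rule; the function name, the weight parameter `w`, and the sum rule itself are illustrative assumptions, since the record does not specify the paper's exact fusion rule.

```python
import numpy as np

def fuse_decisions(p_depth, p_inertial, w=0.5):
    """Decision-level fusion of two modality classifiers (illustrative sketch).

    p_depth, p_inertial: 1-D arrays of per-class scores from the
    depth-feature and inertial-feature classifiers, respectively.
    w: hypothetical weight on the depth modality; the paper's actual
    fusion rule is not given in this record.
    Returns the index of the predicted action class.
    """
    p_depth = np.asarray(p_depth, dtype=float)
    p_inertial = np.asarray(p_inertial, dtype=float)
    # Weighted sum rule: a common decision-level fusion scheme.
    fused = w * p_depth + (1.0 - w) * p_inertial
    return int(np.argmax(fused))
```

For example, with `w=0.5`, scores `[0.2, 0.5, 0.3]` from the depth classifier and `[0.1, 0.3, 0.6]` from the inertial classifier fuse to `[0.15, 0.4, 0.45]`, so class 2 is predicted even though neither classifier alone ranked it first by a wide margin.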

published proceedings

  • IEEE SENSORS JOURNAL

altmetric score

  • 3

author list (cited authors)

  • Chen, C., Jafari, R., & Kehtarnavaz, N.

citation count

  • 123

complete list of authors

  • Chen, Chen||Jafari, Roozbeh||Kehtarnavaz, Nasser

publication date

  • February 2016