Orientation Independent Activity/Gesture Recognition Using Wearable Motion Sensors Academic Article uri icon

abstract

  • Activity/gesture recognition using wearable motion sensors, also known as inertial measurement units (IMUs), provides important context for many ubiquitous sensing applications. The design of activity/gesture recognition algorithms typically requires information about the placement and orientation of the IMUs on the body, and the signal processing is often designed for a known sensor orientation and placement. In practice, however, sensors may be worn or held differently, and the algorithms may then not perform as well as expected. In this paper, we present an orientation-independent activity/gesture recognition approach built on a novel feature set that functions irrespective of how the sensors are oriented. We also propose a template refinement technique that determines the best representative segment of each gesture, thereby improving recognition accuracy. We evaluated our approach in the context of two applications: 1) recognition of activities of daily living and 2) hand gesture recognition. The experimental results show that our approach achieves average accuracies of 98.2% and 95.6% for subject-dependent testing of activities of daily living and gestures, respectively.
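The abstract does not spell out the feature set, but a minimal sketch can illustrate the core idea of an orientation-independent feature: the per-sample vector norm of an IMU reading is unchanged by any rotation of the sensor frame. The function and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def rotation_invariant_features(samples):
    """Per-sample vector magnitudes of 3-axis IMU readings.

    Rotating the sensor frame multiplies each reading by a rotation
    matrix, which preserves its Euclidean norm, so these features are
    independent of how the sensor is oriented on the body.
    """
    samples = np.asarray(samples, dtype=float)  # shape (N, 3)
    return np.linalg.norm(samples, axis=1)      # shape (N,)

# Demo: the same motion observed in two differently oriented frames.
rng = np.random.default_rng(0)
acc = rng.normal(size=(50, 3))                  # synthetic accelerometer data
theta = np.pi / 3                               # rotate frame 60 deg about z
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
acc_rotated = acc @ R.T                         # same motion, rotated frame

f_a = rotation_invariant_features(acc)
f_b = rotation_invariant_features(acc_rotated)
assert np.allclose(f_a, f_b)                    # features match across orientations
```

A full pipeline in the spirit of the paper would extract such invariant features over sliding windows and feed them to a classifier; the norm alone discards directional information, which is why richer invariant feature sets are needed in practice.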

published proceedings

  • IEEE INTERNET OF THINGS JOURNAL

author list (cited authors)

  • Wu, J., & Jafari, R.

citation count

  • 42

complete list of authors

  • Wu, Jian||Jafari, Roozbeh

publication date

  • April 2019