Distributed recognition of human actions using wearable motion sensor networks
Academic Article
Overview
abstract
We propose a distributed recognition framework to classify continuous human actions using a low-bandwidth wearable motion sensor network, called the distributed sparsity classifier (DSC). The algorithm classifies human actions using a set of training motion sequences as prior examples. It is also capable of rejecting outlying actions that are not in the training categories. The classification operates in a distributed fashion on individual sensor nodes and a base station computer. We model the distribution of multiple action classes as a mixture subspace model, with one subspace for each action class. Given a new test sample, we seek the sparsest linear representation of the sample w.r.t. all training examples. We show that the dominant coefficients in the representation correspond only to the action class of the test sample, and hence its membership is encoded in the sparse representation. Fast linear solvers are provided to compute such a representation via l1-minimization. To validate the accuracy of the framework, a public wearable action recognition database, called the Wearable Action Recognition Database (WARD), is constructed. The database comprises 20 human subjects performing 13 action categories. Using up to five motion sensors in the WARD database, DSC achieves state-of-the-art performance. We further show that the recognition precision degrades only gracefully when smaller subsets of active sensors are used. This validates the robustness of the distributed recognition framework on an unreliable wireless network, and demonstrates the ability of DSC to conserve sensor communication energy while preserving accurate global classification. © 2009 IOS Press and the authors. All rights reserved.
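For illustration only, the core classification idea described above (finding a sparse linear representation of a test sample over the training examples and deciding class membership from the dominant coefficients) can be sketched roughly as follows. This is not the authors' DSC implementation: the feature extraction, the distributed sensor-node processing, the fast l1 solvers, and the outlier-rejection step are omitted, scikit-learn's Lasso is used as a stand-in l1-minimization solver, and all names and parameters are hypothetical.

    # Minimal sketch of sparse-representation classification, assuming
    # precomputed feature vectors; not the paper's DSC pipeline.
    import numpy as np
    from sklearn.linear_model import Lasso

    def src_classify(A, labels, y, alpha=0.01):
        """Classify a test feature vector y against training matrix A.

        A      : (d, n) matrix whose columns are training motion features
        labels : (n,) array of class indices for the columns of A
        y      : (d,) test feature vector
        Returns the predicted class index.
        """
        # Seek a sparse coefficient vector x such that y ~= A @ x.
        # (Lasso solves an l1-regularized least-squares proxy of
        # the l1-minimization used in the paper.)
        solver = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        solver.fit(A, y)
        x = solver.coef_

        # Assign y to the class whose coefficients best reconstruct it,
        # i.e. the class carrying the dominant coefficients.
        residuals = {}
        for c in np.unique(labels):
            x_c = np.where(labels == c, x, 0.0)          # keep class-c coefficients
            residuals[c] = np.linalg.norm(y - A @ x_c)   # reconstruction error
        return min(residuals, key=residuals.get)

    # Toy usage with random data standing in for wearable-sensor features.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 60))                    # 60 training samples
    labels = np.repeat(np.arange(3), 20)                 # 3 action classes
    y = A[:, 5] + 0.05 * rng.standard_normal(50)         # noisy copy of a class-0 sample
    print(src_classify(A, labels, y))                    # expected: 0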