Wearable sensor technologies are a key component in the design of applications for human activity recognition in areas such as healthcare, sports, and safety. We present an iterative learning framework, implemented as a data-driven architecture, to classify human locomotion activities (e.g., walking, standing, lying, and sitting) extracted from the Opportunity dataset. Data collected by twelve 3D acceleration sensors and seven inertial measurement units are de-noised with a wavelet filter prior to the extraction of features such as roll, pitch, yaw, the signal magnitude vector, and the principal components (2D PCA). These features are combined pairwise in order to select the best candidates for building the training dataset, through an iterative process based on the Euclidean distance between each class member and the centroid of the corresponding cluster. The resulting dataset is used to identify the learning parameters of a multi-class SVM that yield the lowest prediction error. The methodology presented in this paper achieved a classification accuracy above 86%, exceeding the values reported in other studies while using far fewer training samples and proving more robust to variations in the quality of the input data.
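The core selection step described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the function names, the 2D feature pairs, and the `keep_ratio` cutoff are assumptions (the abstract does not state the iteration's stopping criterion), while the roll, pitch, and signal-magnitude-vector formulas are the standard accelerometer definitions.

```python
import math

def roll_pitch(ax, ay, az):
    # Standard gravity-based orientation estimates from one 3D accelerometer sample.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def signal_magnitude_vector(ax, ay, az):
    # SMV: overall acceleration intensity of one sample.
    return math.sqrt(ax * ax + ay * ay + az * az)

def centroid(points):
    # Component-wise mean of a list of equal-length feature vectors.
    dim = len(points[0])
    return [sum(p[i] for p in points) / len(points) for i in range(dim)]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_training_samples(samples_by_class, keep_ratio=0.5):
    # For each activity class, rank samples by distance to the class centroid
    # and keep only the closest fraction; `keep_ratio` is a hypothetical knob,
    # standing in for the paper's (unstated) iterative stopping criterion.
    selected = {}
    for label, points in samples_by_class.items():
        c = centroid(points)
        ranked = sorted(points, key=lambda p: euclidean(p, c))
        n_keep = max(1, int(len(ranked) * keep_ratio))
        selected[label] = ranked[:n_keep]
    return selected
```

On a pairwise feature set such as `{"walk": [(roll, smv), ...], "stand": [...]}`, repeated application of `select_training_samples` shrinks each class toward its densest region, discarding outlying samples before the SVM's learning parameters are tuned on the reduced set.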
Iterative Learning for Human Activity Recognition from Wearable Sensor Data
Published: 14 November 2016 by MDPI in the 3rd International Electronic Conference on Sensors and Applications, session Smart Systems and Structures
Keywords: inertial measurement units; 3D acceleration sensors; wavelet filters; data-driven architecture; iterative learning; multi-class classification