Full body activity recognition using inertial signals

Published: 26 November 2024 by MDPI
in 11th International Electronic Conference on Sensors and Applications
session Wearable Sensors and Healthcare Applications
https://doi.org/10.3390/ecsa-11-20511

Abstract:
This paper describes the development of a Human Activity Recognition (HAR) system based on deep learning for classifying full-body activities from inertial signals. The HAR system is divided into several modules: a preprocessing module that extracts relevant features from the inertial signals window by window, a machine learning module that classifies each window, and a postprocessing module that integrates the information across several windows. For the preprocessing module, several transformations are implemented and evaluated. For the ML module, several algorithms are evaluated, including several deep learning architectures. This evaluation was carried out on the HARTH dataset, a public dataset containing recordings from 22 participants wearing two 3-axial Axivity AX3 accelerometers for 2 hours in a free-living setting. Sixteen different activities were recorded and annotated accordingly. This paper describes the fine-tuning process of several machine learning algorithms and analyses their performance with different sets of activities. The best results show an accuracy of 90% and 93% for 12 and 9 activities, respectively. These results have been compared to those reported in previous works. To the authors' knowledge, these analyses provide the best state-of-the-art results on this public dataset. Additionally, this paper includes several analyses of the confusion between the different activities and of the contribution of each accelerometer to the global performance.

Keywords: Human Activity Recognition; Wearable Sensors; Classifier Module; Inertial Signals; Deep Learning; Repetitive Movements; Gestures; Postures.
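The window-by-window preprocessing described in the abstract can be illustrated with a minimal sketch. The window length, step size, and sampling layout below are assumptions for illustration only; the paper does not specify them here. Two 3-axial accelerometers are assumed to yield a six-channel signal.

```python
import numpy as np

def sliding_windows(signal: np.ndarray, window_size: int, step: int) -> np.ndarray:
    """Segment a (num_samples, num_channels) inertial signal into
    overlapping windows of shape (num_windows, window_size, num_channels).

    Each window would then be fed to the feature-extraction and
    classification stages of a HAR pipeline.
    """
    starts = range(0, signal.shape[0] - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

# Illustrative values: 1000 samples, 6 channels (two 3-axial accelerometers),
# 100-sample windows with 50% overlap.
signal = np.zeros((1000, 6))
windows = sliding_windows(signal, window_size=100, step=50)
print(windows.shape)  # (19, 100, 6)
```

Each resulting window would be classified independently by the ML module; the postprocessing stage could then smooth the per-window predictions across consecutive windows (e.g., by majority vote) to produce the final activity labels.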