A Fitness Monitoring System based on Fusion of Visual and Sensorial Information

Conference Proceedings (fully refereed)
21/06/2017
Papakostas, M., Giannakopoulos, T. & Karkaletsis, V.
We present a method that recognizes exercising activities performed by a single human in a real home environment. To this end, we combine sensorial information from a smartphone accelerometer with visual information from a simple web camera. Low-level features inspired by the audio analysis domain are used to represent the accelerometer data, while simple frame-wise features are used for the visual channel. Extensive experiments show that the fusion approach achieves 95% overall performance when user calibration is adopted, a 4% boost over the best individual modality, namely the accelerometer data.
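To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch (not the authors' code) of audio-style short-term feature extraction applied to a 3-axis accelerometer stream, combined with per-frame visual features by simple concatenation. The sampling rate, window and step sizes, the specific features (energy, zero-crossing rate, spread), and the early-fusion scheme are all assumptions made for illustration.

```python
# Sketch only: audio-inspired short-term features over an accelerometer
# stream, plus frame-wise early fusion with visual features.
import numpy as np

def short_term_features(acc, fs=100, win_sec=1.0, step_sec=0.5):
    """acc: (N, 3) accelerometer samples; returns one feature row per window."""
    mag = np.linalg.norm(acc, axis=1)            # magnitude of the 3-axis signal
    win, step = int(win_sec * fs), int(step_sec * fs)
    feats = []
    for start in range(0, len(mag) - win + 1, step):
        frame = mag[start:start + win]
        centred = frame - frame.mean()
        energy = float(np.mean(centred ** 2))    # short-term energy (assumed feature)
        # zero-crossing rate of the mean-removed frame (assumed feature)
        zcr = float(np.mean(np.abs(np.diff(np.sign(centred))) > 0))
        feats.append([energy, zcr, frame.std(), frame.max() - frame.min()])
    return np.array(feats)

def fuse(acc_feats, vis_feats):
    """Early fusion by frame-wise concatenation (one plausible fusion scheme)."""
    n = min(len(acc_feats), len(vis_feats))
    return np.hstack([acc_feats[:n], vis_feats[:n]])
```

The fused feature rows could then be fed to any standard classifier; the abstract does not specify the classifier or fusion strategy actually used.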
Software and Knowledge Engineering Laboratory (SKEL)
Conference Short Name: 
PETRA 2017
Conference Full Name: 
10th International Conference on PErvasive Technologies Related to Assistive Environments
Conference Country: 
Greece
Conference City: 
Island of Rhodes
Conference Date(s): 
Mon, 26/06/2017 - Thu, 29/06/2017
Conference Level: 
International
Publisher: 
ACM
Page Start: 
280
Page End: 
285
