Hybrid deep learning framework for human activity recognition

Document Type: Research Paper

Authors

1 Research Scholar, Dept. of ISE, Dr. Ambedkar Institute of Technology, Bengaluru, India

2 Dept. of CSE, Shri Krishna Institute of Technology, Bengaluru, India

Abstract

Human activity recognition aims to identify the actions of individuals from a set of observations of the subjects and their environmental conditions. Over the last two decades, research on Human Activity Recognition (HAR) has attracted the attention of several computer science communities because of its ability to support a wide range of applications and its connections to fields of study such as human-computer interaction, healthcare, monitoring, entertainment and education. Many existing methods, including deep learning approaches, have been developed to recognize different human activities, but they fail to detect sudden changes from one activity to another. This paper presents a deep learning method that recognizes specific activities and identifies the change from one activity to another, targeting healthcare applications. In this method, a deep convolutional neural network (CNN) is built to extract features from the data collected by the sensors. A Gated Recurrent Unit (GRU) then captures the long-term dependencies between successive actions, which improves the recognition rate of the HAR system. Combining the CNN and GRU yields a wearable-sensor model that can identify activity transitions and accurately recognize the activities themselves. Experiments were conducted on the open-source University of California, Irvine (UCI) HAR dataset, which comprises six activities: lying, standing, sitting, walking downstairs, walking upstairs and walking. The CNN-based model achieves a detection accuracy of 95.99%, whereas the CNN-GRU model achieves a detection accuracy of 96.79%, which is better than most existing HAR methods.
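To make the described pipeline concrete, the following is a minimal TensorFlow/Keras sketch of a CNN-GRU classifier of the kind the abstract outlines: 1-D convolutions extract features from windows of inertial sensor data, and a GRU layer models the temporal dependencies between them before a softmax layer predicts one of the six activities. The input shape assumes the standard UCI HAR raw-signal format (128-sample windows with 9 inertial channels); the layer sizes and hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal CNN-GRU sketch for HAR on UCI HAR-style windows.
# Assumptions (not taken from the paper): 128-sample windows with 9
# inertial channels, and illustrative layer sizes; the authors' exact
# architecture and training setup may differ.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_gru(n_timesteps=128, n_channels=9, n_classes=6):
    model = models.Sequential([
        layers.Input(shape=(n_timesteps, n_channels)),
        # 1-D convolutions extract local motion features from raw signals.
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # The GRU captures longer-range temporal dependencies across the
        # window, which helps in detecting transitions between activities.
        layers.GRU(128),
        layers.Dropout(0.5),
        # Softmax over the six UCI HAR activity classes.
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_gru()
model.summary()
```

In a sketch like this, the convolutional front end plays the role of the feature extractor and the GRU replaces hand-crafted temporal features; training would use integer-encoded activity labels against the sparse categorical cross-entropy loss.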

Keywords

Human Activity Recognition; Deep Learning; Convolutional Neural Network (CNN); Gated Recurrent Unit (GRU); Wearable Sensors
Volume 13, Issue 1
March 2022
Pages 1225-1237
  • Receive Date: 19 June 2021
  • Revise Date: 15 August 2021
  • Accept Date: 02 September 2021