Emotion recognition using deep neural networks and dynamic features of EEG signal

Document Type : Research Paper

Authors

1 Department of Computer Engineering, Kerman Branch, Islamic Azad University, Kerman, Iran

2 Department of Computer Engineering, Bardsir Branch, Islamic Azad University, Bardsir, Iran

Abstract

Emotions play an important role in human life, and the design of systems capable of recognizing different emotions has therefore attracted considerable attention. To design such a system, the DEAP database was used, which contains physiological signals (EEG, ECG, EMG, and respiration rate) recorded from 32 male and female participants during 120 minutes of music-video stimulation. Each one-minute segment was labelled by an expert based on a review of the participant's facial video. Four emotion classes were considered: anger, happiness, sadness, and satisfaction.

In the first stage of the implementation, the EEG signals from the DEAP database, sampled at 512 Hz, are denoised and filtered. Two filtering stages are applied: one in the frequency domain and one in the time-frequency domain. To remove noise from unknown sources, time-frequency filtering (wavelet transform) with the default coefficients of MATLAB software is used. Next, the alpha, beta, and gamma sub-bands of the EEG signal are extracted from the detail and approximation coefficients of a Daubechies wavelet decomposition with four or eight levels, and the signal is reconstructed from the desired coefficients. Linear and dynamic features are then extracted from the alpha, beta, and gamma sub-bands.

The extracted features are first applied as input to common classification methods such as the decision tree, nearest neighbour, and support vector machine, and then as input to a convolutional deep learning network. In a further experiment, the raw EEG signals themselves are used as input to the deep classification structure. The goal is to compare the results of deep learning networks with those of the other classification methods for emotion recognition.
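The sub-band extraction step described above can be sketched as follows. This is a minimal illustration only, using PyWavelets as a stand-in for the MATLAB wavelet toolbox the paper actually used; the wavelet order (db4), the four-level decomposition, and which coefficient levels are retained for a given sub-band are assumptions — the correct mapping of detail levels to alpha/beta/gamma depends on the 512 Hz sampling rate and the chosen decomposition depth.

```python
# Hypothetical sketch: extract an EEG sub-band by zeroing unwanted
# wavelet coefficients and reconstructing the signal (PyWavelets).
import numpy as np
import pywt

FS = 512  # DEAP raw EEG sampling rate (Hz)

def extract_subband(signal, wavelet="db4", level=4, keep_levels=(1, 2)):
    """Decompose the signal, zero all coefficient arrays except the
    requested ones, and reconstruct the corresponding sub-band.

    coeffs[0] is the approximation cA_level; coeffs[1:] are the detail
    coefficients cD_level ... cD_1 (coarse to fine)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    filtered = [c if i in keep_levels else np.zeros_like(c)
                for i, c in enumerate(coeffs)]
    return pywt.waverec(filtered, wavelet)

# One minute of synthetic single-channel EEG, just to exercise the code.
eeg = np.random.randn(FS * 60)
subband = extract_subband(eeg)
```

In MATLAB the same idea corresponds to `wavedec`/`wrcoef`; retaining different detail levels yields the alpha, beta, or gamma reconstruction.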
According to the obtained results, the support vector machine achieved the highest classification accuracy, identifying the four emotional states with 94.1% accuracy, while the proposed convolutional neural network identified them with 80% accuracy. The performance of the deep learning network would improve if more features were used; in addition, the deep learning method has significant advantages over the simple classification methods because it is resistant to noise and processes the data automatically.
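The classical classification step can be sketched as below. This is a hypothetical illustration with synthetic data, not the paper's experiment: the feature matrix, the number of features, the RBF kernel, and the cross-validation scheme are all assumptions standing in for the linear and dynamic features and the SVM configuration used by the authors.

```python
# Hypothetical sketch: feed per-segment feature vectors to an SVM
# (the best-performing classifier in the paper) and estimate accuracy
# with cross-validation. All data here is synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))    # 200 segments x 12 assumed features
y = rng.integers(0, 4, size=200)  # anger, happiness, sadness, satisfaction

# Standardizing before an RBF-kernel SVM is standard practice.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```

With random labels the accuracy hovers around chance (0.25); on real EEG features the same pipeline is what the reported 94.1% figure would be measured against.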

Keywords

Volume 15, Issue 7
July 2024
Pages 299-307
  • Receive Date: 02 April 2023
  • Revise Date: 22 May 2023
  • Accept Date: 06 June 2023