Journal of Applied Nonlinear Dynamics


On Emotion Recognition Through Dynamic Directed Network and Machine Learning Based on EEG

Journal of Applied Nonlinear Dynamics 13(4) (2024) 805-821 | DOI: 10.5890/JAND.2024.12.013

Zhiyi Jing, Yeyin Xu, Qiang Fan, Ying Wu

School of Aerospace Engineering, Xi'an Jiaotong University, Xi'an, 710049, PR China


Abstract

Information flow in a brain functional network strongly affects the causality between brain regions and the generation of emotion. Investigating such causal relationships under different emotional states is key to revealing the mechanisms of emotion generation. In this research, dynamic directed networks are constructed with the transfer entropy method based on the DEAP electroencephalogram (EEG) emotion dataset, and the information exchange within the brain network is captured on short time scales. The functional separation and integration ability of the brain, the information flow, and the robustness of the network in different emotional states are analyzed. The results show that in the high arousal-high valence and low arousal-high valence states, the information separation and integration abilities became stronger and the network robustness increased. In the same emotional state, the information flow of the brain regions in all directions varied synchronously. Several machine learning methods, combined with the characteristics of the dynamic network, are adopted to recognize the emotions of the subjects. Among the classifiers compared, the one built with a support vector machine achieved the highest accuracy. With an accuracy of 96.9% in subject-dependent two-class classification, a high-precision classifier is achieved, providing an effective method for future investigations of emotion classification and recognition.
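
The pipeline summarized above (directed networks built from pairwise transfer entropy, whose weights feed an SVM classifier) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: it uses synthetic noise in place of the DEAP recordings, a simple histogram estimator of transfer entropy TE(X -> Y) = H(Y_{t+1} | Y_t) - H(Y_{t+1} | Y_t, X_t) with history length 1, and scikit-learn's SVC; the channel count, bin count, window length, and classifier settings are placeholder choices.

# Illustrative sketch only: a minimal transfer-entropy + SVM pipeline in the spirit
# of the method described above. Synthetic noise stands in for the DEAP recordings;
# channel count, bin count, window length, and SVM settings are placeholder choices,
# not the authors' parameters.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of TE(x -> y) with history length 1."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))   # discretize source signal
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))   # discretize target signal
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    def joint_entropy(*cols):
        # Shannon entropy of the joint distribution of the given discrete columns.
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE(x -> y) = H(y_next | y_now) - H(y_next | y_now, x_now), written via joint entropies.
    te = (joint_entropy(y_next, y_now) - joint_entropy(y_now)
          - joint_entropy(y_next, y_now, x_now) + joint_entropy(y_now, x_now))
    return max(te, 0.0)   # clip small negative values caused by estimation bias

def directed_network(window):
    """Pairwise transfer entropy over a (channels x samples) window."""
    n = window.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = transfer_entropy(window[i], window[j])
    return A

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 8, 512           # toy sizes, far smaller than DEAP
features, labels = [], []
for trial in range(n_trials):
    eeg = rng.standard_normal((n_channels, n_samples))  # stand-in for one EEG window
    features.append(directed_network(eeg).flatten())    # directed-network weights as features
    labels.append(trial % 2)                            # toy high/low valence label
scores = cross_val_score(SVC(kernel="rbf"), np.array(features), np.array(labels), cv=5)
print("toy cross-validated accuracy:", scores.mean())

On actual DEAP trials, one such matrix would be computed per short sliding window to obtain the dynamic directed networks, and the SVM would be compared against other classifiers on the resulting network features.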

Acknowledgments

This study is funded by the National Natural Science Foundation of China (Grant Nos. 12132012 and 11972275).
