Open Access Journal

ISSN : 2394-2320 (Online)

International Journal of Engineering Research in Computer Science and Engineering (IJERCSE)

Monthly Journal for Computer Science and Engineering


Emotion Recognition Using Physiological Signals and Different Datasets

Authors: Emily Joy¹, Willson Joseph², Dr. M. Rajeswari³, Dr. R. Sunder⁴

Date of Publication: 20th May 2021

Abstract: Emotion recognition now plays an important role in everyday life, because emotions carry great value in human life: from a person's emotions we can assess both their health and their mental stability. Eight basic emotions are commonly considered: fear, disgust, anger, happiness, sadness, acceptance, expectancy, and surprise. Further emotions include interest, guilt, shame, and neglect. To recognize these emotions, face recognition and speech recognition methods are mainly used, but these attend chiefly to facial expressions, speech, and gestures, and such visible signs of emotion cannot capture what people actually feel. To obtain true emotions, we use emotion recognition based on physiological signals. In this paper, we use photoplethysmography (PPG) and galvanic skin response (GSR) signals as the data for recognizing and classifying emotions. We apply different classification methods to obtain the best accuracy, recognizing the emotions sad, happy, and neutral. Using a modified random forest method, we achieve a high classification accuracy of 97%.
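The classification step described in the abstract can be sketched as follows. This is an illustrative example only, not the paper's implementation: the feature names and the synthetic data are invented for demonstration, and a standard scikit-learn random forest stands in for the paper's modified random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for features extracted from PPG/GSR recordings,
# e.g. [mean heart rate, heart-rate variability, mean GSR level,
# GSR peak count] per trial. Real features would come from the signals.
n_samples = 300
X = rng.normal(size=(n_samples, 4))
labels = np.array(["sad", "happy", "neutral"])  # classes from the abstract
y = labels[rng.integers(0, 3, size=n_samples)]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Plain random forest as a placeholder for the modified variant.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)
print("test accuracy:", accuracy_score(y_test, pred))
```

On genuinely discriminative PPG/GSR features the same pipeline would report meaningful accuracy; on this random data it only demonstrates the mechanics of the fit/predict workflow.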

