
Emotion-sensitive analysis of learners’ cognitive state using deep learning

S. Aruna, Swarna Kuchibhotla

Abstract


The assessment of a student’s state of mind has traditionally been a difficult task. Advances in deep learning have given researchers new opportunities to do so. Most state-of-mind methods focus principally on attention, failing to account for the significance of human emotions. Emotions play an important role in computer vision, and a great deal of research has been conducted using them. Our objective is to propose an emotion-sensitive analysis of learners’ cognitive state, focusing specifically on students’ attention levels. The analysis is carried out non-intrusively by detecting both head pose and emotions. To achieve this, we employ a multi-task learning approach based on convolutional neural networks (CNNs) that simultaneously identify facial expressions, locate facial landmarks, and estimate head pose, all in real time. The estimated head pose and face alignment are further used by the instructor to measure the learner’s attention span. Experimental results show that the technique can determine students’ emotions with an accuracy of 94%.
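The abstract does not disclose the network architecture or training details, so the following is only a minimal illustrative sketch of the multi-task CNN idea it describes: a shared convolutional backbone feeding three heads for emotion classification, facial-landmark regression, and head-pose regression, trained jointly. All names, layer sizes, and the 7-emotion and 68-landmark assumptions are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

class MultiTaskFaceNet(nn.Module):
    """Shared CNN backbone with three task-specific heads (illustrative only)."""
    def __init__(self, num_emotions=7, num_landmarks=68):
        super().__init__()
        # Shared feature extractor; assumes 64x64 grayscale face crops.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.emotion_head = nn.Linear(128, num_emotions)        # expression logits
        self.landmark_head = nn.Linear(128, num_landmarks * 2)  # (x, y) per landmark
        self.pose_head = nn.Linear(128, 3)                      # yaw, pitch, roll

    def forward(self, x):
        feats = self.backbone(x)
        return self.emotion_head(feats), self.landmark_head(feats), self.pose_head(feats)

# Joint training step on dummy data: classification loss for emotions,
# regression losses for landmarks and head pose, summed into one objective.
model = MultiTaskFaceNet()
x = torch.randn(8, 1, 64, 64)  # a dummy batch of face crops
emo_logits, landmarks, pose = model(x)
loss = (nn.functional.cross_entropy(emo_logits, torch.randint(0, 7, (8,)))
        + nn.functional.mse_loss(landmarks, torch.zeros_like(landmarks))
        + nn.functional.mse_loss(pose, torch.zeros_like(pose)))
loss.backward()

In a real system, the estimated yaw/pitch/roll and landmark positions would feed the attention-span measure the abstract mentions, and the three losses would likely be weighted per task; neither detail is specified in the paper.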


Keywords


landmark location; cognitive state; head pose estimation






DOI: https://doi.org/10.32629/jai.v7i2.790



Copyright (c) 2023 S. Aruna, Swarna Kuchibhotla

License URL: https://creativecommons.org/licenses/by-nc/4.0/