Document Type
Original Study
Keywords
Computer Engineering
Abstract
Emotion recognition through body movement offers a compelling alternative to traditional facial and vocal analysis. This study introduces a deep learning framework that uses upper-body movement features to classify emotional states with Convolutional Neural Networks (CNNs). The system is trained and evaluated on the Body Language Dataset (BoLD), targeting seven primary emotions: happiness, sadness, anger, fear, surprise, joy, and disgust. The proposed model achieves a classification accuracy of 95.72%, significantly outperforming existing methods in body-based emotion recognition. This result highlights the potential of body posture as a reliable, standalone modality for emotion detection, especially in contexts where facial information is ambiguous or unavailable. The presented work contributes to the advancement of non-intrusive, vision-based affective computing systems, with promising applications in human-computer interaction, behavioral analysis, security, and assistive technologies. Future research may explore multimodal integration, real-time deployment, and cross-cultural adaptability to further enhance the robustness and versatility of body-driven emotion recognition systems.
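The pipeline the abstract describes (a CNN mapping upper-body pose imagery to one of seven emotion classes) can be illustrated with a minimal sketch. This is not the authors' architecture or trained model: the kernel count, input size, and all weights below are random placeholders chosen only to show the conv → ReLU → pooling → softmax shape of such a classifier.

```python
import numpy as np

# Labels taken from the paper's abstract; all weights below are random
# placeholders, not the authors' trained parameters.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "joy", "disgust"]

def conv2d(image, kernel):
    """Naive valid-mode 2D cross-correlation (the 'conv' in CNNs)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def classify(pose_collage, kernels, W, b):
    """Conv -> ReLU -> global average pooling -> linear layer -> softmax."""
    feats = np.array([np.maximum(0.0, conv2d(pose_collage, k)).mean()
                      for k in kernels])
    return softmax(W @ feats + b)

rng = np.random.default_rng(0)
collage = rng.random((16, 16))                      # stand-in for a rendered pose collage
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
W = rng.standard_normal((len(EMOTIONS), len(kernels)))
b = rng.standard_normal(len(EMOTIONS))

probs = classify(collage, kernels, W, b)            # one probability per emotion
print(EMOTIONS[int(np.argmax(probs))])
```

In a real system the random kernels and linear weights would be learned by backpropagation on BoLD samples, and the input would be a pose collage rendered from detected upper-body keypoints rather than noise.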
How to Cite This Article
Waleed, Gheed Tawfeeq and Hameed, Shaymaa (2025) "Emotion Recognition from Upper-Body Movements with a Pose Collage CNN Architecture," Iraqi Journal of Computers, Communications, Control and Systems Engineering: Vol. 25: Iss. 2, Article 5.
DOI: 10.33103/uot.ijccce.25.2.5
Available at: https://ijccce.researchcommons.org/journal/vol25/iss2/5