Document Type

Original Study

Keywords

Computer Engineering

Abstract

Emotion recognition through body movement presents a compelling alternative to traditional facial and vocal analysis. This study introduces a deep learning framework that uses upper-body movement features to classify emotional states with Convolutional Neural Networks (CNNs). The system is trained and evaluated on the Body Language Dataset (BoLD), targeting seven primary emotions: happiness, sadness, anger, fear, surprise, joy, and disgust. The proposed model achieves a classification accuracy of 95.72%, significantly outperforming existing methods in body-based emotion recognition. This result highlights the potential of body posture as a reliable, standalone modality for emotion detection, especially in contexts where facial information is ambiguous or unavailable. The presented work contributes to the advancement of non-intrusive, vision-based affective computing systems, with promising applications in human-computer interaction, behavioral analysis, security, and assistive technologies. Future research may explore multimodal integration, real-time deployment, and cross-cultural adaptability to further enhance the robustness and versatility of body-driven emotion recognition systems.
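The pipeline the abstract describes (upper-body movement features fed through a CNN into a seven-way emotion classifier) can be sketched as follows. This is not the authors' implementation: the joint count, kernel size, filter count, and layer layout are illustrative assumptions, and a toy 1-D convolution over a pose-feature sequence stands in for the full network.

```python
# Minimal sketch (not the paper's code): a toy 1-D convolutional classifier
# over a sequence of upper-body joint coordinates, in plain NumPy, to show
# the shape of the pipeline. All sizes below are illustrative assumptions.
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "joy", "disgust"]

def conv1d_relu(x, w, b):
    """Valid 1-D convolution + ReLU: x (T, C_in), w (K, C_in, C_out), b (C_out)."""
    K, _, C_out = w.shape
    T_out = x.shape[0] - K + 1
    out = np.zeros((T_out, C_out))
    for t in range(T_out):
        # Contract the kernel window over time (K) and input channels (C_in).
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(pose_seq, params):
    """pose_seq: (T, C) sequence of flattened upper-body joint coordinates."""
    h = conv1d_relu(pose_seq, params["w1"], params["b1"])
    h = h.mean(axis=0)                      # global average pooling over time
    logits = h @ params["w2"] + params["b2"]
    return softmax(logits)                  # probability per emotion class

rng = np.random.default_rng(0)
T, C, K, F = 32, 18, 5, 16                  # frames, coords, kernel, filters (assumed)
params = {
    "w1": rng.normal(scale=0.1, size=(K, C, F)),
    "b1": np.zeros(F),
    "w2": rng.normal(scale=0.1, size=(F, len(EMOTIONS))),
    "b2": np.zeros(len(EMOTIONS)),
}
probs = classify(rng.normal(size=(T, C)), params)
print(EMOTIONS[int(np.argmax(probs))])
```

In a trained system the random weights above would be learned from BoLD clips, and the pose sequence would come from a skeleton-estimation front end rather than random numbers.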
