Our paper “Body Gestures and Spoken Sentences: A Novel Approach for Revealing User’s Emotions” is now available online through the IEEE Xplore Digital Library!
This is part of our ongoing joint work on Multimodal Emotion Recognition using Body Gestures (see the project page) along with our friends and colleagues at ICAR-CNR, Agnese Augello and Giovanni Pilato.
Many prior works have shown that the speech and gesture channels are internally coordinated to convey communicative intentions, forming a unified meaning together with the verbal part of an utterance. Consequently, a correlation should exist between the gestures a user performs and what they feel, think, and say while moving their body: if a gesture and a sentence occur simultaneously, they can be considered mutually correlated to some degree. Assuming this correlation exists, one may ask whether the emotion recognized from a sentence spoken while performing a gesture (via a textual emotion recognition algorithm) correlates with the emotion recognized from that gesture alone.
Starting from this assumption, our paper describes a system for recognizing emotions from body gestures. To this end, RGB-D videos, audio, and skeletal joint sequences (all gathered from a Kinect-like device) are used for gesture recognition, and emotions are then recognized from the associated spoken sentences.
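The core premise, pairing each gesture with the sentence spoken at the same time so that the sentence's recognized emotion can serve as a label for the gesture, can be sketched as a simple temporal-alignment step. The following is a minimal illustration, not the paper's actual pipeline: the `Gesture` and `Sentence` types, the timestamps, and the emotion labels are all hypothetical, and the textual emotion labels are assumed to come from some upstream recognizer.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Gesture:
    start: float  # segment start time, in seconds
    end: float    # segment end time, in seconds

@dataclass
class Sentence:
    start: float
    end: float
    emotion: str  # label produced by a textual emotion recognizer (assumed given)

def overlap(a_start: float, a_end: float, b_start: float, b_end: float) -> float:
    """Length of the temporal overlap between two intervals, in seconds."""
    return max(0.0, min(a_end, b_end) - max(a_start, b_start))

def label_gesture(gesture: Gesture, sentences: list[Sentence]) -> Optional[str]:
    """Assign to a gesture the emotion of the sentence it overlaps with most.

    Returns None when no sentence overlaps the gesture at all.
    """
    best, best_ov = None, 0.0
    for s in sentences:
        ov = overlap(gesture.start, gesture.end, s.start, s.end)
        if ov > best_ov:
            best, best_ov = s.emotion, ov
    return best

# Hypothetical example: a gesture spanning 2.0-4.5 s overlaps the "joy"
# sentence (2.2-5.0 s) far more than the "neutral" one (0.0-1.8 s).
sentences = [Sentence(0.0, 1.8, "neutral"), Sentence(2.2, 5.0, "joy")]
print(label_gesture(Gesture(2.0, 4.5), sentences))  # -> joy
```

Choosing the maximally overlapping sentence is just one plausible alignment rule; other criteria (e.g. any non-zero overlap, or a minimum-overlap threshold) would work equally well for building gesture-emotion pairs.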