Our joint work on Multimodal Emotion Recognition with our friends at ICAR-CNR has reached a new milestone: our paper “Exploiting Correlation between Body Gestures and Spoken Sentences for Real-time Emotion Recognition” has been accepted for publication and presentation at CHItaly 2017, which will be held in Cagliari, Italy, in September.
In this work we exploit the correlation between a person’s affective state and their simultaneous bodily expression through speech and gestures. We propose a system for real-time emotion recognition from gestures. It builds a trusted dataset of association pairs (motion data → emotion pattern), also drawing on textual information; this dataset then serves as ground truth for a further step, in which emotion patterns are extracted from new, unclassified gestures. Experimental results demonstrate good recognition accuracy and the real-time capabilities of the proposed system.
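To give a flavour of the second step, here is a minimal, purely illustrative sketch: a ground-truth dataset of (motion features → emotion) pairs labels a new gesture by nearest-neighbour matching. The feature vectors, labels, and distance-based matching below are invented for illustration and are not the actual method or data from the paper.

```python
import math

# Hypothetical toy ground-truth dataset of association pairs
# (motion-feature vector -> emotion pattern); the real system
# derives such pairs from recorded gestures and spoken sentences.
ground_truth = [
    ((0.9, 0.8), "joy"),
    ((0.1, 0.2), "sadness"),
    ((0.8, 0.1), "anger"),
]

def classify_gesture(features, dataset=ground_truth):
    """Label an unclassified gesture with the emotion of its
    nearest neighbour (Euclidean distance) in the trusted dataset."""
    _, label = min(dataset, key=lambda pair: math.dist(pair[0], features))
    return label

print(classify_gesture((0.85, 0.75)))  # -> "joy"
```

A real pipeline would replace the toy vectors with features extracted from motion-capture data and a trained classifier, but the structure — a trusted labelled dataset used as ground truth for classifying new gestures — is the same.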
Check out the Publications section of our website for further details.