Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters. / Navarretta, Costanza.

In: Knowledge-Based Systems, Vol. 71, 2014, p. 34-40.

Research output: Contribution to journal › Journal article › Research › peer-review

Harvard

Navarretta, C 2014, 'Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters', Knowledge-Based Systems, vol. 71, pp. 34-40. https://doi.org/10.1016/j.knosys.2014.04.034

APA

Navarretta, C. (2014). Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters. Knowledge-Based Systems, 71, 34-40. https://doi.org/10.1016/j.knosys.2014.04.034

Vancouver

Navarretta C. Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters. Knowledge-Based Systems. 2014;71:34-40. https://doi.org/10.1016/j.knosys.2014.04.034

Author

Navarretta, Costanza. / Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters. In: Knowledge-Based Systems. 2014 ; Vol. 71. pp. 34-40.

Bibtex

@article{589cfa339bb94f4f892a49c734e08b8c,
title = "Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters",
abstract = "This paper deals with the automatic identification of emotions from the manual annotations of the shape and functions of facial expressions in a Danish corpus of video recorded naturally occurring first encounters. More specifically, a support vector classified is trained on the corpus annotations to identify emotions in facial expressions. In the classification experiments, we test to what extent emotions expressed in naturally-occurring conversations can be identified automatically by a classifier trained on the manual annotations of the shape of facial expressions and co-occurring speech tokens. We also investigate the relation between emotions and the communicative functions of facial expressions. Both emotion labels and their values in a three dimensional space are identified. The three dimensions are Pleasure, Arousal and Dominance. The results of our experiments indicate that the classifiers perform well in identifying emotions from the coarse-grained descriptions of facial expressions and co-occurring speech. The communicative functions of facial expressions also contribute to emotion identification. The results are promising because the emotion label list comprises fine grained emotions and affective states in naturally occurring conversations, while the the shape features of facial expressions are very coarse grained. The classification results also assess that the annotation scheme combining a discrete and a dimensional description, and the manual annotations produced according to it are reliable and can be used to model and test emotional behaviours in emotional cognitive infocommunicative systems.",
author = "Costanza Navarretta",
year = "2014",
doi = "10.1016/j.knosys.2014.04.034",
language = "English",
volume = "71",
pages = "34--40",
journal = "Knowledge-Based Systems",
issn = "0950-7051",
publisher = "Elsevier",

}

RIS

TY - JOUR

T1 - Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters

AU - Navarretta, Costanza

PY - 2014

Y1 - 2014

N2 - This paper deals with the automatic identification of emotions from the manual annotations of the shape and functions of facial expressions in a Danish corpus of video-recorded, naturally occurring first encounters. More specifically, a support vector classifier is trained on the corpus annotations to identify emotions in facial expressions. In the classification experiments, we test to what extent emotions expressed in naturally occurring conversations can be identified automatically by a classifier trained on the manual annotations of the shape of facial expressions and co-occurring speech tokens. We also investigate the relation between emotions and the communicative functions of facial expressions. Both emotion labels and their values in a three-dimensional space are identified. The three dimensions are Pleasure, Arousal and Dominance. The results of our experiments indicate that the classifiers perform well in identifying emotions from the coarse-grained descriptions of facial expressions and co-occurring speech. The communicative functions of facial expressions also contribute to emotion identification. The results are promising because the emotion label list comprises fine-grained emotions and affective states in naturally occurring conversations, while the shape features of facial expressions are very coarse-grained. The classification results also indicate that the annotation scheme, which combines a discrete and a dimensional description, and the manual annotations produced according to it are reliable and can be used to model and test emotional behaviours in emotional cognitive infocommunicative systems.

AB - This paper deals with the automatic identification of emotions from the manual annotations of the shape and functions of facial expressions in a Danish corpus of video-recorded, naturally occurring first encounters. More specifically, a support vector classifier is trained on the corpus annotations to identify emotions in facial expressions. In the classification experiments, we test to what extent emotions expressed in naturally occurring conversations can be identified automatically by a classifier trained on the manual annotations of the shape of facial expressions and co-occurring speech tokens. We also investigate the relation between emotions and the communicative functions of facial expressions. Both emotion labels and their values in a three-dimensional space are identified. The three dimensions are Pleasure, Arousal and Dominance. The results of our experiments indicate that the classifiers perform well in identifying emotions from the coarse-grained descriptions of facial expressions and co-occurring speech. The communicative functions of facial expressions also contribute to emotion identification. The results are promising because the emotion label list comprises fine-grained emotions and affective states in naturally occurring conversations, while the shape features of facial expressions are very coarse-grained. The classification results also indicate that the annotation scheme, which combines a discrete and a dimensional description, and the manual annotations produced according to it are reliable and can be used to model and test emotional behaviours in emotional cognitive infocommunicative systems.

U2 - 10.1016/j.knosys.2014.04.034

DO - 10.1016/j.knosys.2014.04.034

M3 - Journal article

VL - 71

SP - 34

EP - 40

JO - Knowledge-Based Systems

JF - Knowledge-Based Systems

SN - 0950-7051

ER -

ID: 111097864
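
The abstract above describes training a support vector classifier on manual annotations of facial-expression shape features and co-occurring speech tokens to predict emotion labels, alongside values on the Pleasure, Arousal and Dominance dimensions. The sketch below is a minimal, hypothetical illustration of such a setup in Python with scikit-learn; the feature names, emotion labels and data are invented for illustration and do not reproduce the corpus, annotation scheme or experiments of the paper.

# Illustrative sketch only (not from the paper): a support vector classifier
# trained on categorical annotation features, in the spirit of the setup
# described in the abstract. Features, labels and data are hypothetical.
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Hypothetical annotation records: coarse-grained facial-expression shape
# attributes plus a co-occurring speech token.
records = [
    {"eyebrows": "raised",  "mouth": "smile",  "head": "nod",   "token": "ja"},
    {"eyebrows": "frown",   "mouth": "open",   "head": "shake", "token": "nej"},
    {"eyebrows": "raised",  "mouth": "smile",  "head": "tilt",  "token": "okay"},
    {"eyebrows": "neutral", "mouth": "closed", "head": "nod",   "token": "mm"},
]
labels = ["amused", "surprised", "friendly", "interested"]  # hypothetical emotion labels

# One-hot encode the categorical annotations and train a linear-kernel SVM.
model = make_pipeline(DictVectorizer(sparse=True), SVC(kernel="linear"))
model.fit(records, labels)

# Predict the emotion label for a new, unseen annotation record.
print(model.predict([{"eyebrows": "raised", "mouth": "smile",
                      "head": "nod", "token": "ja"}]))

Predicting the continuous Pleasure, Arousal and Dominance values would correspond to a regression variant of the same setup, for example one support vector regressor per dimension.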