Dialogue Acts and Emotions in Multimodal Dyadic Conversations
Publication: Contribution to book/anthology/report › Article in proceedings › Research › peer-reviewed
Standard
Dialogue Acts and Emotions in Multimodal Dyadic Conversations. / Navarretta, Costanza.
12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021. IEEE, 2021. pp. 429-434.
RIS
TY - GEN
T1 - Dialogue Acts and Emotions in Multimodal Dyadic Conversations
AU - Navarretta, Costanza
PY - 2021
Y1 - 2021
N2 - This paper addresses the relation between dialogue acts and emotions in a Danish multimodal annotated corpus of first encounters. Dialogue acts are semantic generalizations of the communicative functions of speech and gestures. Certain emotion types were found to be strongly related to feedback in a previous study, and we therefore wanted to investigate the relation between emotions and dialogue acts further. Our analysis of the most frequently occurring dialogue acts and the co-occurring emotions in the corpus confirms that there is a strong relation between some dialogue act types and specific emotion types, and that the relation is not limited to feedback functions. Two speech segment representations and emotion labels are used as features in machine learning experiments in which various classifiers were trained to identify the 15 most frequent dialogue acts in the data. The results of the experiments show that using the two speech segment representations as training data gives state-of-the-art results for dialogue act classification that relies exclusively on speech segment information. Adding information about emotions improves classification when the classifiers are logistic regression and a multilayer perceptron.
AB - This paper addresses the relation between dialogue acts and emotions in a Danish multimodal annotated corpus of first encounters. Dialogue acts are semantic generalizations of the communicative functions of speech and gestures. Certain emotion types were found to be strongly related to feedback in a previous study, and we therefore wanted to investigate the relation between emotions and dialogue acts further. Our analysis of the most frequently occurring dialogue acts and the co-occurring emotions in the corpus confirms that there is a strong relation between some dialogue act types and specific emotion types, and that the relation is not limited to feedback functions. Two speech segment representations and emotion labels are used as features in machine learning experiments in which various classifiers were trained to identify the 15 most frequent dialogue acts in the data. The results of the experiments show that using the two speech segment representations as training data gives state-of-the-art results for dialogue act classification that relies exclusively on speech segment information. Adding information about emotions improves classification when the classifiers are logistic regression and a multilayer perceptron.
M3 - Article in proceedings
SN - 978-1-6654-2495-0
SP - 429
EP - 434
BT - 12th IEEE International Conference on Cognitive Infocommunications
PB - IEEE
ER -
ID: 289023502