Dialogue Acts and Emotions in Multimodal Dyadic Conversations

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Dialogue Acts and Emotions in Multimodal Dyadic Conversations. / Navarretta, Costanza.

12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021. IEEE, 2021. p. 429-434.


Harvard

Navarretta, C 2021, Dialogue Acts and Emotions in Multimodal Dyadic Conversations. in 12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021. IEEE, pp. 429-434.

APA

Navarretta, C. (2021). Dialogue Acts and Emotions in Multimodal Dyadic Conversations. In 12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021 (pp. 429-434). IEEE.

Vancouver

Navarretta C. Dialogue Acts and Emotions in Multimodal Dyadic Conversations. In 12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021. IEEE. 2021. p. 429-434.

Author

Navarretta, Costanza. / Dialogue Acts and Emotions in Multimodal Dyadic Conversations. 12th IEEE International Conference on Cognitive Infocommunications: CogInfoCom 2021. IEEE, 2021. pp. 429-434

Bibtex

@inproceedings{119d5007be80448d8cc65c3bf7922bac,
title = "Dialogue Acts and Emotions in Multimodal Dyadic Conversations",
abstract = "This paper addresses the relation between dialogue acts and emotions in a Danish multimodal annotated corpus of first encounters. Dialogue acts are semantic generalizations of the communicative functions of speech and gestures. Certain emotion types were found to be strongly related to feedback in a previous study, and therefore we wanted to investigate the relation between emotions and dialogue acts further. Our analysis of the most frequently occurring dialogue acts and the co-occurring emotions in the corpus confirms that there is a strong relation between some dialogue act types and specific emotion types, and that the relation is not limited to feedback functions. Two speech segment representations and emotion labels are used as features in machine learning experiments in which various classifiers were trained to identify the 15 most frequent dialogue acts in the data. The results of the experiments show that using the two speech segment representations as training data gives state-of-the-art results for dialogue act classification that relies exclusively on speech segment information. Adding information about emotions improves classification when the classifiers are Logistic Regression and a multilayer perceptron.",
author = "Costanza Navarretta",
year = "2021",
language = "English",
isbn = "978-1-6654-2495-0",
pages = "429--434",
booktitle = "12th IEEE International Conference on Cognitive Infocommunications",
publisher = "IEEE",
}

RIS

TY - GEN

T1 - Dialogue Acts and Emotions in Multimodal Dyadic Conversations

AU - Navarretta, Costanza

PY - 2021

Y1 - 2021

N2 - This paper addresses the relation between dialogue acts and emotions in a Danish multimodal annotated corpus of first encounters. Dialogue acts are semantic generalizations of the communicative functions of speech and gestures. Certain emotion types were found to be strongly related to feedback in a previous study, and therefore we wanted to investigate the relation between emotions and dialogue acts further. Our analysis of the most frequently occurring dialogue acts and the co-occurring emotions in the corpus confirms that there is a strong relation between some dialogue act types and specific emotion types, and that the relation is not limited to feedback functions. Two speech segment representations and emotion labels are used as features in machine learning experiments in which various classifiers were trained to identify the 15 most frequent dialogue acts in the data. The results of the experiments show that using the two speech segment representations as training data gives state-of-the-art results for dialogue act classification that relies exclusively on speech segment information. Adding information about emotions improves classification when the classifiers are Logistic Regression and a multilayer perceptron.

AB - This paper addresses the relation between dialogue acts and emotions in a Danish multimodal annotated corpus of first encounters. Dialogue acts are semantic generalizations of the communicative functions of speech and gestures. Certain emotion types were found to be strongly related to feedback in a previous study, and therefore we wanted to investigate the relation between emotions and dialogue acts further. Our analysis of the most frequently occurring dialogue acts and the co-occurring emotions in the corpus confirms that there is a strong relation between some dialogue act types and specific emotion types, and that the relation is not limited to feedback functions. Two speech segment representations and emotion labels are used as features in machine learning experiments in which various classifiers were trained to identify the 15 most frequent dialogue acts in the data. The results of the experiments show that using the two speech segment representations as training data gives state-of-the-art results for dialogue act classification that relies exclusively on speech segment information. Adding information about emotions improves classification when the classifiers are Logistic Regression and a multilayer perceptron.

M3 - Article in proceedings

SN - 978-1-6654-2495-0

SP - 429

EP - 434

BT - 12th IEEE International Conference on Cognitive Infocommunications

PB - IEEE

ER -

ID: 289023502