Big Data and Multimodal Communication: A Perspective View

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Big Data and Multimodal Communication: A Perspective View. / Navarretta, Costanza; Oemig, Lucretia.

In: Intelligent Systems Reference Library, Vol. 159, 2019, pp. 167-184.


Harvard

Navarretta, C & Oemig, L 2019, 'Big Data and Multimodal Communication: A Perspective View', Intelligent Systems Reference Library, vol. 159, pp. 167-184. https://doi.org/10.1007/978-3-030-15939-9_9

APA

Navarretta, C., & Oemig, L. (2019). Big Data and Multimodal Communication: A Perspective View. Intelligent Systems Reference Library, 159, 167-184. https://doi.org/10.1007/978-3-030-15939-9_9

Vancouver

Navarretta C, Oemig L. Big Data and Multimodal Communication: A Perspective View. Intelligent Systems Reference Library. 2019;159:167-184. https://doi.org/10.1007/978-3-030-15939-9_9

Author

Navarretta, Costanza ; Oemig, Lucretia. / Big Data and Multimodal Communication: A Perspective View. In: Intelligent Systems Reference Library. 2019 ; Vol. 159. pp. 167-184.

Bibtex

@article{bc51b177747e4dbb91568dc2cc691aca,
title = "Big Data and Multimodal Communication: A Perspective View",
abstract = "Humans communicate face-to-face through at least two modalities, the auditive modality, speech, and the visual modality, gestures, which comprise e.g. gaze movements, facial expressions, head movements, and hand gestures. The relation between speech and gesture is complex and partly depends on factors such as the culture, the communicative situation, the interlocutors and their relation. Investigating these factors in real data is vital for studying multimodal communication and building models for implementing natural multimodal communicative interfaces able to interact naturally with individuals of different age, culture, and needs. In this paper, we discuss to what extent big data “in the wild”, which are growing explosively on the internet, are useful for this purpose also in light of legal aspects about the use of personal data, comprising multimodal data downloaded from social media.",
author = "Costanza Navarretta and Lucretia Oemig",
note = "Anna Esposito, Antonietta M. Esposito & Lakhmi C. Jain (eds.): Innovations in Big Data Mining and Embedded Knowledge. ISBN 978-3-030-15938-2; ISBN 978-3-030-15939-9.",
year = "2019",
doi = "10.1007/978-3-030-15939-9_9",
language = "English",
volume = "159",
pages = "167--184",
journal = "Intelligent Systems Reference Library",
issn = "1868-4394",
publisher = "Springer Science+Business Media",
}

RIS

TY - JOUR

T1 - Big Data and Multimodal Communication: A Perspective View

AU - Navarretta, Costanza

AU - Oemig, Lucretia

N1 - Anna Esposito, Antonietta M. Esposito & Lakhmi C. Jain (eds.): Innovations in Big Data Mining and Embedded Knowledge. ISBN 978-3-030-15938-2; ISBN 978-3-030-15939-9.

PY - 2019

Y1 - 2019

N2 - Humans communicate face-to-face through at least two modalities, the auditive modality, speech, and the visual modality, gestures, which comprise e.g. gaze movements, facial expressions, head movements, and hand gestures. The relation between speech and gesture is complex and partly depends on factors such as the culture, the communicative situation, the interlocutors and their relation. Investigating these factors in real data is vital for studying multimodal communication and building models for implementing natural multimodal communicative interfaces able to interact naturally with individuals of different age, culture, and needs. In this paper, we discuss to what extent big data “in the wild”, which are growing explosively on the internet, are useful for this purpose also in light of legal aspects about the use of personal data, comprising multimodal data downloaded from social media.

AB - Humans communicate face-to-face through at least two modalities, the auditive modality, speech, and the visual modality, gestures, which comprise e.g. gaze movements, facial expressions, head movements, and hand gestures. The relation between speech and gesture is complex and partly depends on factors such as the culture, the communicative situation, the interlocutors and their relation. Investigating these factors in real data is vital for studying multimodal communication and building models for implementing natural multimodal communicative interfaces able to interact naturally with individuals of different age, culture, and needs. In this paper, we discuss to what extent big data “in the wild”, which are growing explosively on the internet, are useful for this purpose also in light of legal aspects about the use of personal data, comprising multimodal data downloaded from social media.

U2 - 10.1007/978-3-030-15939-9_9

DO - 10.1007/978-3-030-15939-9_9

M3 - Journal article

VL - 159

SP - 167

EP - 184

JO - Intelligent Systems Reference Library

JF - Intelligent Systems Reference Library

SN - 1868-4394

ER -

ID: 223923919