TRR 318 - Monitoring the understanding of explanations (Subproject A02)

Overview

When something is being explained, the explainee signals their understanding – or lack thereof – to the explainer through verbal expressions and non-verbal means of communication such as gestures and facial expressions. By nodding, for example, the explainee can signal that they have understood. A nod, however, can also be meant as a request to continue with the explanation; which meaning is intended has to be determined from the context of the conversation. In Project A02, linguists and computational linguists are investigating how people (and later, artificial agents) recognize whether the person they are explaining something to understands. To this end, the research team is examining 80 dialogues in which one person explains a social game to another, looking for communicative feedback signals that indicate varying degrees of comprehension during the process of understanding. The findings from these analyses will be incorporated into an intelligent system that can detect feedback signals such as head nods and interpret them in terms of the signaled level of understanding.
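
The core challenge described above – that the same signal, such as a nod, can carry different meanings depending on dialogue context – can be made concrete with a small sketch. The following Python example is a hypothetical, rule-based toy: the names (DialogueContext, interpret_nod) and the two context features are illustrative assumptions, not the project's actual system, which derives such interpretations from the annotated explanation dialogues.

```python
# Hypothetical sketch: disambiguating a head nod by dialogue context.
# All names and features here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class Interpretation(Enum):
    UNDERSTOOD = auto()  # explainee signals comprehension
    CONTINUE = auto()    # explainee asks the explainer to go on


@dataclass
class DialogueContext:
    explainer_mid_utterance: bool    # nod occurs while the explainer is still speaking
    after_comprehension_check: bool  # explainer just asked something like "Got it?"


def interpret_nod(ctx: DialogueContext) -> Interpretation:
    """Disambiguate a head nod using the surrounding dialogue context."""
    if ctx.after_comprehension_check:
        # A nod directly after an explicit check reads as "yes, understood".
        return Interpretation.UNDERSTOOD
    if ctx.explainer_mid_utterance:
        # A nod while the explainer is mid-utterance typically works as
        # backchannel feedback: "keep going".
        return Interpretation.CONTINUE
    return Interpretation.UNDERSTOOD


if __name__ == "__main__":
    mid_talk = DialogueContext(explainer_mid_utterance=True, after_comprehension_check=False)
    post_check = DialogueContext(explainer_mid_utterance=False, after_comprehension_check=True)
    print(interpret_nod(mid_talk))    # Interpretation.CONTINUE
    print(interpret_nod(post_check))  # Interpretation.UNDERSTOOD
```

In practice, such hard-coded rules would be replaced by models grounded in the analyses of the 80 annotated dialogues.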

Key Facts

Grant Number: 438445824
Project type: Research
Project duration: 07/2021 – 06/2025
Funded by: DFG
Website: Homepage

More Information

Principal Investigators

Dr. Angela Grimminger

Germanistische und Allgemeine Sprachwissenschaft

Hendrik Buschmeier

Universität Bielefeld

Petra Wagner

Universität Bielefeld

Project Team

Stefan Lazarov, M.A.

Transregional Collaborative Research Centre 318

Olcay Türk

Universität Bielefeld

Yu Wang

Universität Bielefeld

Cooperating Institutions

Universität Bielefeld

Publications

How much does nonverbal communication conform to entropy rate constancy?: A case study on listener gaze in interaction
Y. Wang, Y. Xu, G. Skantze, H. Buschmeier, in: Findings of the Association for Computational Linguistics: ACL 2024, Bangkok, Thailand, 2024, pp. 3533–3545.
Turn-taking dynamics across different phases of explanatory dialogues
P. Wagner, M. Włodarczak, H. Buschmeier, O. Türk, E. Gilmartin, in: Proceedings of the 28th Workshop on the Semantics and Pragmatics of Dialogue, Trento, Italy, 2024, pp. 6–14.
Conversational feedback in scripted versus spontaneous dialogues: A comparative analysis
I. Pilán, L. Prévot, H. Buschmeier, P. Lison, in: Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue, Kyoto, Japan, 2024, pp. 440–457.
Towards a Computational Architecture for Co-Constructive Explainable Systems
M. Booshehri, H. Buschmeier, P. Cimiano, S. Kopp, J. Kornowicz, O. Lammert, M. Matarese, D. Mindlin, A.S. Robrecht, A.-L. Vollmer, P. Wagner, B. Wrede, in: Proceedings of the 2024 Workshop on Explainability Engineering, ACM, 2024, pp. 20–25.
Automatic reconstruction of dialogue participants’ coordinating gaze behavior from multiple camera perspectives
A.N. Riechmann, H. Buschmeier, in: Book of Abstracts of the 2nd International Multimodal Communication Symposium, Frankfurt am Main, Germany, 2024, pp. 38–39.