TRR 318 - Contextualized and online parametrization of attention in human–robot explanatory dialog (Subproject A05)

Overview

In Project A05, researchers from linguistics, psychology, and computer science are investigating attention in human–robot explanatory dialog. They are addressing questions such as: Where do humans focus their attention when a robot explains a task to them? How can robots direct their counterpart’s attention to achieve the goal of the task? And what influence does such attention guidance have on overall understanding? To investigate these questions, the researchers are assessing attention during interactive human–robot explanatory dialogs. They are also examining the conditions under which certain linguistic formulations, such as “do X and not Y”, influence the nature of explanation and understanding. Based on these results, the researchers will draw conclusions about how robots can understand and direct the attention of their human counterparts and, in turn, generate interpretable explanations.

Key Facts

Grant Number: 438445824
Project type: Research
Project duration: 07/2021 - 06/2025
Funded by: DFG
Website: Homepage

More Information

Principal Investigators

Prof. Dr. Ingrid Scharlau

Kognitive Psychologie und Psychologiedidaktik

Prof. Dr. Katharina Rohlfing

Key research area Transformation and Education

Britta Wrede

Universität Bielefeld

Project Team

Ngoc Chi Banh, M.Sc.

Kognitive Psychologie und Psychologiedidaktik

Amit Singh, M.Sc.

Transregional Collaborative Research Centre 318

André Groß

Universität Bielefeld

Cooperating Institutions

Universität Bielefeld

Publications

Coupling of Task and Partner Model: Investigating the Intra-Individual Variability in Gaze during Human–Robot Explanatory Dialogue
A. Singh, K.J. Rohlfing, in: Proceedings of the 26th ACM International Conference on Multimodal Interaction (ICMI 2024), 2024.
RISE: an open-source architecture for interdisciplinary and reproducible human–robot interaction research
A. Groß, C. Schütze, M. Brandt, B. Wrede, B. Richter, Frontiers in Robotics and AI 10 (2023).
EEG Correlates of Distractions and Hesitations in Human–Robot Interaction: A LabLinking Pilot Study
B. Richter, F. Putze, G. Ivucic, M. Brandt, C. Schütze, R. Reisenhofer, B. Wrede, T. Schultz, Multimodal Technologies and Interaction 7 (2023).
First steps towards real-time assessment of attentional weights and capacity according to TVA
N.C. Banh, I. Scharlau, in: S. Merz, C. Frings, B. Leuchtenberg, B. Moeller, S. Mueller, R. Neumann, B. Pastötter, L. Pingen, G. Schui (Eds.), Abstracts of the 65th TeaP, ZPID (Leibniz Institute for Psychology), 2023.