The effect of robot gaze on processing robot utterances

Abstract

Gaze during situated language production and comprehension is tightly coupled with the unfolding speech stream, providing interlocutors with relevant on-line information about both what a speaker intends to say and what a listener has understood. This paper investigates whether people exploit such gaze in the same way when listening to a robot make statements about the shared visual environment. On the basis of two eye-tracking experiments employing different tasks, we report evidence (a) that people's visual attention is influenced on-line by both the robot's gaze and its speech, (b) that congruent gaze (to mentioned objects) facilitates comprehension, and (c) that robot gaze does indeed influence what listeners think the robot intended. These findings support the view that spoken interaction with artificial agents such as robots benefits when those agents exhibit cognitively derived, real-time, speech-mediated attention behaviour.
