An Acoustic Analysis of Child-Child and Child-Robot Interactions for Understanding Engagement during Speech-Controlled Computer Games (Conference Paper)

abstract

  • Copyright 2016 ISCA. Engagement is an essential factor in successful game design and effective human-computer interaction. We analyze the prosodic patterns of child-child and child-robot pairs playing a language-based computer game. Acoustic features include speech loudness and fundamental frequency. We use a linear mixed-effects model to capture the coordination of acoustic patterns between interactors as well as its relation to annotated engagement levels. Our results indicate that the considered acoustic features are related to engagement levels in both the child-child and child-robot interactions. They further suggest a significant association between the interactors' prosodic patterns in the child-child scenario, moderated by the co-occurring engagement level. This acoustic coordination is not present in the child-robot interaction, since the robot's behavior was not automatically adjusted to the child. These findings are discussed in relation to automatic robot adaptation and provide a foundation for promoting engagement and enhancing rapport during the considered game-based interactions.
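The abstract's core analysis, regressing one interactor's prosodic feature on the partner's with a per-pair random effect, can be sketched as below. This is a minimal illustration using `statsmodels`, with synthetic data; the column names, units, effect sizes, and grouping structure are assumptions for the sketch, not the authors' actual features or pipeline.

```python
# Hypothetical sketch of a linear mixed-effects model of prosodic
# coordination: does a child's loudness track the partner's loudness,
# allowing a random intercept for each interacting pair?
# All data here are synthetic; values are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_pairs, turns = 12, 30
rows = []
for pair in range(n_pairs):
    pair_offset = rng.normal(0, 2)           # pair-specific random intercept
    partner = rng.normal(60, 5, size=turns)  # partner loudness (synthetic dB)
    # Simulate coordination: child loudness partly follows the partner's.
    child = 20 + pair_offset + 0.6 * partner + rng.normal(0, 1.5, size=turns)
    for p, c in zip(partner, child):
        rows.append({"pair": pair, "partner_loudness": p, "child_loudness": c})
df = pd.DataFrame(rows)

# Fixed effect: partner loudness; random intercept: grouping by pair.
model = smf.mixedlm("child_loudness ~ partner_loudness", df, groups=df["pair"])
result = model.fit()
slope = result.params["partner_loudness"]
print(f"estimated coordination slope: {slope:.2f}")
```

A significant positive slope would be read as acoustic coordination between interactors; the paper additionally relates such patterns to annotated engagement levels, which could be modeled as a further fixed effect or interaction term in the same framework.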

name of conference

  • Interspeech 2016

published proceedings

  • 17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), Vols. 1-5

author list (cited authors)

  • Chaspari, T., & Lehman, J. F.

citation count

  • 6

complete list of authors

  • Chaspari, Theodora||Lehman, Jill Fain

publication date

  • January 2016