
Data-driven model of nonverbal behavior for socially assistive human-robot interactions

Henny Admoni and Brian Scassellati
Conference Paper, Proceedings of the 16th International Conference on Multimodal Interaction (ICMI '14), pp. 196–199, November 2014

Abstract

Socially assistive robotics (SAR) aims to develop robots that help people through interactions that are inherently social, such as tutoring and coaching. For these interactions to be effective, socially assistive robots must be able to recognize and use nonverbal social cues like eye gaze and gesture. In this paper, we present a preliminary model for nonverbal robot behavior in a tutoring application. Using empirical data from teachers and students in human-human tutoring interactions, the model can be both predictive (recognizing the context of new nonverbal behaviors) and generative (creating new robot nonverbal behaviors based on a desired context) using the same underlying data representation.
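The abstract describes a single underlying data representation that supports both predictive queries (behavior → context) and generative queries (context → behavior). The sketch below is only an illustration of that general idea, not the authors' actual model: the context labels, behavior tuples, and counting scheme are hypothetical stand-ins for the kind of annotated human-human tutoring data the paper draws on.

```python
# Illustrative sketch only: a toy joint model over (context, behavior) pairs
# that supports both prediction and generation from one data representation.
# Context names and nonverbal cues below are hypothetical, not from the paper.
import random
from collections import Counter, defaultdict


class NonverbalBehaviorModel:
    """Stores observed (context, behavior) pairs from tutoring interactions."""

    def __init__(self):
        # Joint counts: context -> Counter over observed behaviors.
        self.counts = defaultdict(Counter)

    def observe(self, context, behavior):
        """Record one annotated observation, e.g. from human-human tutoring data."""
        self.counts[context][behavior] += 1

    def predict_context(self, behavior):
        """Predictive use: return the context most often paired with this behavior."""
        scores = {c: cnt[behavior] for c, cnt in self.counts.items()}
        return max(scores, key=scores.get)

    def generate_behavior(self, context):
        """Generative use: sample a behavior in proportion to how often
        it was observed in the desired context."""
        behaviors, weights = zip(*self.counts[context].items())
        return random.choices(behaviors, weights=weights, k=1)[0]


# Hypothetical annotations for illustration only.
model = NonverbalBehaviorModel()
model.observe("giving_hint", ("gaze_at_student", "point_at_puzzle"))
model.observe("giving_hint", ("gaze_at_puzzle", "point_at_puzzle"))
model.observe("checking_understanding", ("gaze_at_student", "no_gesture"))

print(model.predict_context(("gaze_at_student", "no_gesture")))  # predictive query
print(model.generate_behavior("giving_hint"))                    # generative query
```

Both queries read from the same table of counts, which is the property the abstract highlights: one empirical data representation serving recognition and behavior generation alike.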

BibTeX

@conference{Admoni-2014-113239,
author = {Henny Admoni and Brian Scassellati},
title = {Data-driven model of nonverbal behavior for socially assistive human-robot interactions},
booktitle = {Proceedings of the 16th International Conference on Multimodal Interaction (ICMI '14)},
year = {2014},
month = {November},
pages = {196--199},
}