
Nonverbal Robot Feedback for Human Teachers

Sandy H. Huang, Isabella Huang, Ravi Pandya, and Anca D. Dragan
Conference Paper, Proceedings of the Conference on Robot Learning (CoRL), pp. 1038-1051, October 2019

Abstract

Robots can learn preferences from human demonstrations, but their success depends on how informative these demonstrations are. Being informative is unfortunately very challenging, because during teaching, people typically get no transparency into what the robot already knows or has learned so far. In contrast, human students naturally provide a wealth of nonverbal feedback that reveals their level of understanding and engagement. In this work, we study how a robot can similarly provide feedback that is minimally disruptive, yet gives human teachers a better mental model of the robot learner, and thus enables them to teach more effectively. Our idea is that at any point, the robot can indicate what it thinks the correct next action is, shedding light on its current estimate of the human’s preferences. We analyze how useful this feedback is, both in theory and with two user studies—one with a virtual character that tests the feedback itself, and one with a PR2 robot that uses gaze as the feedback mechanism. We find that feedback can be useful for improving both the quality of teaching and teachers’ understanding of the robot’s capability.
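Illustrative sketch

The following is a minimal sketch (not the authors' code) of the feedback idea described in the abstract: the robot maintains a Bayesian estimate over candidate preference hypotheses, updates it from each demonstrated action, and at any point can report the action it currently believes is correct; that predicted next action is what a nonverbal channel such as gaze would communicate to the teacher. The grid-world setup, the softmax demonstration likelihood, and all names below are illustrative assumptions, not details taken from the paper.

import numpy as np

ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right


def features(state, action, goals):
    """Negative Manhattan distance to each candidate goal after taking `action`."""
    nxt = (state[0] + action[0], state[1] + action[1])
    return np.array([-abs(nxt[0] - g[0]) - abs(nxt[1] - g[1]) for g in goals])


class FeedbackLearner:
    def __init__(self, goals, beta=2.0):
        self.goals = goals                      # candidate preference hypotheses
        self.belief = np.ones(len(goals)) / len(goals)
        self.beta = beta                        # assumed teacher "rationality"

    def _action_probs(self, state, goal_idx):
        # Softmax over one-step feature values under a single hypothesis.
        vals = np.array([features(state, a, self.goals)[goal_idx] for a in ACTIONS])
        exp = np.exp(self.beta * (vals - vals.max()))
        return exp / exp.sum()

    def observe(self, state, action):
        """Bayesian update of the belief from one demonstrated action."""
        a_idx = ACTIONS.index(action)
        likelihood = np.array(
            [self._action_probs(state, k)[a_idx] for k in range(len(self.goals))]
        )
        self.belief = self.belief * likelihood
        self.belief /= self.belief.sum()

    def predicted_next_action(self, state):
        """The action the robot currently thinks is correct -- the feedback signal."""
        expected = np.zeros(len(ACTIONS))
        for k, p in enumerate(self.belief):
            expected += p * np.array(
                [features(state, a, self.goals)[k] for a in ACTIONS]
            )
        return ACTIONS[int(np.argmax(expected))]


if __name__ == "__main__":
    learner = FeedbackLearner(goals=[(0, 4), (4, 0)])
    state = (2, 2)
    print("before teaching:", learner.predicted_next_action(state))
    learner.observe(state, (0, 1))   # teacher steps toward the goal at (0, 4)
    print("after one step: ", learner.predicted_next_action(state))
    print("belief over goals:", learner.belief.round(2))

In this toy setting, the teacher can compare the robot's announced next action against the one they would demonstrate; a mismatch signals that the current belief is wrong and that further, more disambiguating demonstrations are needed.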
Cite this Paper

BibTeX

@conference{Huang-2019-126738,
author = {Sandy H. Huang and Isabella Huang and Ravi Pandya and Anca D. Dragan},
title = {Nonverbal Robot Feedback for Human Teachers},
booktitle = {Proceedings of the Conference on Robot Learning (CoRL)},
year = {2019},
month = {October},
pages = {1038--1051},
}