Probabilistic Movement Primitives for Coordination of Multiple Human-Robot Collaborative Tasks

Guilherme Maeda, Gerhard Neumann, Marco Ewerton, Rudolf Lioutikov, Oliver Kroemer and Jan Peters
Journal Article, Autonomous Robots (AuRo), January, 2017


Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.


This paper proposes an interaction learning method for collaborative and assistive robots based on movement primitives. The method supports both action recognition and human-robot movement coordination. It uses imitation learning to construct a mixture model of human-robot interaction primitives. This probabilistic model allows the assistive trajectory of the robot to be inferred from observations of the human. The method scales with the number of tasks and can learn nonlinear correlations between the trajectories that describe the human-robot interaction. We evaluated the method experimentally with a lightweight robot arm in a variety of assistive scenarios, including the coordinated handover of a bottle to a human and the collaborative assembly of a toolbox. Potential applications of the method include personal caregiver robots, control of intelligent prosthetic devices, and robot coworkers in factories.
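The two inference steps the abstract describes, recognizing which task the human is performing and conditioning the robot's trajectory on the human's observed movement, can be illustrated with a toy sketch. The code below is not the authors' implementation: it assumes each mixture component is a Gaussian over stacked human and robot trajectory weights (here with made-up dimensions, means, and a fixed cross-correlation), recognizes the task from component responsibilities, and applies standard Gaussian conditioning to infer the robot weights.

```python
import numpy as np

D = 4  # illustrative number of trajectory-weight dimensions per agent (assumption)

def make_component(h_mean, r_mean, corr=0.9):
    """One mixture component: a Gaussian over stacked [human; robot] weights."""
    mu = np.concatenate([np.full(D, h_mean), np.full(D, r_mean)])
    Sigma = np.block([[np.eye(D), corr * np.eye(D)],
                      [corr * np.eye(D), np.eye(D)]])  # human-robot coupling
    return mu, Sigma

# Two hypothetical tasks with different human/robot weight means
components = [make_component(0.0, 0.0), make_component(3.0, -3.0)]
priors = np.array([0.5, 0.5])

def gauss_logpdf(x, mu, Sigma):
    d = x - mu
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (d @ np.linalg.solve(Sigma, d) + logdet + x.size * np.log(2 * np.pi))

def recognize_and_condition(w_h):
    """Task responsibilities and the expected robot weights for each task."""
    log_resp, robot_means = [], []
    for (mu, Sigma), p in zip(components, priors):
        mu_h, mu_r = mu[:D], mu[D:]
        S_hh, S_rh = Sigma[:D, :D], Sigma[D:, :D]
        # Action recognition: marginal likelihood of the human observation
        log_resp.append(np.log(p) + gauss_logpdf(w_h, mu_h, S_hh))
        # Coordination: Gaussian conditioning E[w_r | w_h]
        robot_means.append(mu_r + S_rh @ np.linalg.solve(S_hh, w_h - mu_h))
    log_resp = np.array(log_resp)
    resp = np.exp(log_resp - log_resp.max())
    return resp / resp.sum(), robot_means

# A human observation close to the second task's mean
resp, robot_means = recognize_and_condition(np.full(D, 3.1))
print(resp.argmax())        # index of the most likely task
print(robot_means[resp.argmax()])  # inferred robot trajectory weights
```

In the paper the weights parameterize basis-function trajectories (probabilistic movement primitives), so the conditioned robot weights would be decoded back into an assistive trajectory; that decoding step is omitted here.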

@article{maeda2017probabilistic,
author = {Guilherme Maeda and Gerhard Neumann and Marco Ewerton and Rudolf Lioutikov and Oliver Kroemer and Jan Peters},
title = {Probabilistic Movement Primitives for Coordination of Multiple Human-Robot Collaborative Tasks},
journal = {Autonomous Robots (AuRo)},
year = {2017},
month = {January},
}