In-Hand Object Pose Tracking via Contact Feedback and GPU-Accelerated Robotic Simulation

Jacky Liang, Ankur Handa, Karl Van Wyk, Viktor Makoviychuk, Oliver Kroemer, and Dieter Fox
Conference Paper, Proceedings of the International Conference on Robotics and Automation (ICRA), pp. 6203-6209, May 2020

Abstract

Tracking the pose of an object while it is being held and manipulated by a robot hand is difficult for vision-based methods due to significant occlusions. Prior works have explored using contact feedback and particle filters to localize in-hand objects. However, they have mostly focused on the static-grasp setting rather than objects in motion, as the latter requires modeling complex contact dynamics. In this work, we propose using GPU-accelerated parallel robot simulations and derivative-free, sample-based optimizers to track in-hand object poses with contact feedback during manipulation. We use physics simulation as the forward model for robot-object interactions, and the algorithm jointly optimizes the state and the parameters of the simulations so that they better match those of the real world. Our method runs in real time (30 Hz) on a single GPU, and it achieves an average point cloud distance error of 6 mm in simulation experiments and 13 mm in real-world ones.
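To make the abstract's optimization loop concrete, below is a minimal Python sketch of a derivative-free, sample-based tracker of the general kind described: a population of pose and simulation-parameter hypotheses is stepped forward in parallel, scored against contact feedback, and resampled around the best-scoring hypotheses. This is not the authors' implementation; the GPU-parallel physics simulation is replaced by a toy stand-in, and names such as simulate_batch and contact_cost are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def simulate_batch(poses, params, controls):
    """Stand-in for stepping many parallel simulations.

    In the real system, each row of `poses` (object pose hypothesis) and `params`
    (simulation parameters such as friction or mass) would be advanced by a
    GPU-accelerated physics simulator under the same robot `controls`.
    Here we simply apply the control as a rigid translation plus noise.
    """
    return poses + controls + 0.001 * rng.standard_normal(poses.shape)

def contact_cost(predicted, observed):
    """Mismatch between simulated and observed contact feedback (lower is better)."""
    return np.linalg.norm(predicted - observed, axis=-1)

def track_step(poses, params, controls, observed, elite_frac=0.2, noise=0.002):
    """One update of a cross-entropy-style, derivative-free optimizer that jointly
    refines pose hypotheses and simulation parameters."""
    poses = simulate_batch(poses, params, controls)
    # In the real system, predicted contacts come from the simulator's contact
    # sensors; in this toy version, the pose itself serves as a proxy signal.
    costs = contact_cost(poses, observed)
    elite = np.argsort(costs)[: max(1, int(elite_frac * len(poses)))]
    mean_pose, mean_params = poses[elite].mean(0), params[elite].mean(0)
    # Resample hypotheses around the elite mean, jointly over state and parameters.
    poses = mean_pose + noise * rng.standard_normal(poses.shape)
    params = mean_params + noise * rng.standard_normal(params.shape)
    return poses, params, mean_pose

# Toy usage: track a translating "object" from noisy contact-like observations.
n = 64
poses = 0.01 * rng.standard_normal((n, 3))
params = np.full((n, 2), 0.5)               # e.g. friction and mass scale
true_pose = np.zeros(3)
for t in range(30):
    control = np.array([0.005, 0.0, 0.0])   # commanded in-hand motion
    true_pose = true_pose + control
    observation = true_pose + 0.001 * rng.standard_normal(3)
    poses, params, estimate = track_step(poses, params, np.tile(control, (n, 1)), observation)
print("final error:", np.linalg.norm(estimate - true_pose))

The design mirrors the paper's high-level structure: simulation is the forward model, the optimizer needs no gradients, and state and simulation parameters are updated together so the forward model stays consistent with observations.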

BibTeX

@conference{Liang-2020-119459,
author = {Jacky Liang and Ankur Handa and Karl Van Wyk and Viktor Makoviychuk and Oliver Kroemer and Dieter Fox},
title = {In-Hand Object Pose Tracking via Contact Feedback and GPU-Accelerated Robotic Simulation},
booktitle = {Proceedings of the International Conference on Robotics and Automation (ICRA)},
year = {2020},
month = {May},
pages = {6203--6209},
keywords = {Pose Tracking, Contact Feedback, Sim2Real},
}