PatchGraph: In-hand Tactile Tracking with Learned Surface Normals

Conference Paper, Proceedings of the International Conference on Robotics and Automation (ICRA), pp. 2164-2170, May 2022

Abstract

We address the problem of tracking 3D object poses from touch during in-hand manipulation. Specifically, we look at tracking small objects using vision-based tactile sensors that provide high-dimensional tactile image measurements at the point of contact. While prior work has relied on a priori information about the object being localized, we remove this requirement. Our key insight is that an object is composed of several local surface patches, each informative enough to achieve reliable object tracking. Moreover, we can recover the geometry of this local patch online by extracting the local surface normal information embedded in each tactile image. We propose a novel two-stage approach. First, we learn a mapping from tactile images to surface normals using an image translation network. Second, we use these surface normals within a factor graph to both reconstruct a local patch map and use it to infer 3D object poses. We demonstrate reliable object tracking over 100 contact sequences across unique shapes, with four objects in simulation and two objects in the real world.
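The abstract does not spell out the factor-graph formulation, so as a rough illustration only: one standard way surface normals constrain object pose is a point-to-plane alignment step, where predicted normals define local tangent planes on the patch map and a small rigid motion is solved for by linearized least squares. The sketch below (in NumPy; all names and the test setup are our own, not the authors' code) shows one such Gauss-Newton step under a small-angle assumption.

```python
import numpy as np

def point_to_plane_step(src, dst, normals):
    """One Gauss-Newton step of point-to-plane alignment.

    src:     (N,3) measured contact points
    dst:     (N,3) corresponding patch-map points
    normals: (N,3) unit surface normals at dst (e.g. predicted from
             tactile images by a learned model)

    Returns a small-angle rotation R and translation t moving src
    toward the patch surface. Illustrative only; the paper optimizes
    such constraints jointly inside a factor graph.
    """
    # Residual: signed distance of each src point to the plane at dst.
    r = np.einsum('ij,ij->i', src - dst, normals)        # (N,)
    # Jacobian w.r.t. the 6-DoF update [rotation w, translation t].
    J = np.hstack([np.cross(src, normals), normals])     # (N,6)
    # Min-norm least-squares solve of J x = -r.
    x, *_ = np.linalg.lstsq(J, -r, rcond=None)
    w, t = x[:3], x[3:]
    # Small-angle rotation from the axis-angle vector w.
    W = np.array([[0., -w[2], w[1]],
                  [w[2], 0., -w[0]],
                  [-w[1], w[0], 0.]])
    return np.eye(3) + W, t

# Usage: recover a small offset of points above a planar patch (z = 0).
rng = np.random.default_rng(0)
dst = rng.normal(size=(50, 3))
dst[:, 2] = 0.0                                   # patch points on z = 0
normals = np.tile([0.0, 0.0, 1.0], (50, 1))       # plane normals
src = dst + np.array([0.0, 0.0, 0.05])            # measurements offset in z
R, t = point_to_plane_step(src, dst, normals)
```

The solve should return an identity rotation and a translation of about -0.05 along z, pulling the measured points back onto the patch; in the full system, many such contact constraints accumulate across a sequence to reconstruct the patch map and track pose.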

BibTeX

@conference{Sodhi-2022-134119,
author = {Paloma Sodhi and Michael Kaess and Mustafa Mukadam and Stuart Anderson},
title = {PatchGraph: In-hand Tactile Tracking with Learned Surface Normals},
booktitle = {Proceedings of the International Conference on Robotics and Automation (ICRA)},
year = {2022},
month = {May},
pages = {2164--2170},
}