Incremental shape and pose estimation from planar pushing using contact implicit surfaces

Sudharshan Suresh, Joshua G. Mangelson, and Michael Kaess
Workshop Paper, ICRA '20 Workshop on Closing the Perception-Action Loop with Vision and Tactile Sensing (ViTac '20), May 2020

Abstract

Robots need accurate, online estimates of shape and pose while manipulating unknown objects. While vision- and depth-based tracking are well studied [3], they are hindered by self-occlusion, cluttered workspaces, and poor visibility. Interestingly, even when blindfolded, humans can infer object properties from local tactile information. Online tactile inference is hard due to the intrusive nature of touch sensing and initially unknown object models. Recently, Yu et al. [1] formulated this as a batch-SLAM problem, relying on frictional pushing mechanics. However, their method is not built for online tracking and uses a piecewise-linear, discrete shape representation. Incremental, graph-based approaches were later considered, but they assume a known object model and incorporate vision [2, 4]. A Gaussian process implicit surface (GPIS) shape representation can fuse uncertain measurements in a probabilistic fashion and, unlike [1], is non-parametric. Dragiev et al. [5] use a GPIS for tactile exploration of fixed-pose 3-D objects. To our knowledge, no method uses this representation with online pose estimation for manipulation tasks. In this work, we combine a GPIS shape representation with sparse nonlinear incremental optimization to localize and infer the shape of planar objects. We demonstrate results with simulated tactile exploration of different objects.
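To make the GPIS idea concrete, below is a minimal 2-D sketch, not the paper's implementation: a Gaussian process is fit to signed-distance observations (zero at simulated contact points, with interior and exterior anchor points fixing the sign), and the shape estimate is read off as the zero level set of the posterior mean. The scikit-learn usage, kernel choice, and all variable names are illustrative assumptions.

# A minimal 2-D GPIS sketch (not the paper's implementation): fit a GP to
# signed-distance observations from simulated contacts and query the zero
# level set. Library, kernel, and names are assumptions for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Contact points sampled on a unit circle (stand-in for tactile measurements).
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
surface_pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Training set: surface points have signed distance 0; one interior point
# (negative) and a ring of exterior points (positive) anchor the sign.
X = np.vstack([surface_pts, [[0.0, 0.0]], 2.0 * surface_pts])
y = np.concatenate([np.zeros(len(surface_pts)), [-1.0],
                    np.ones(len(surface_pts))])

# alpha models measurement noise on the signed-distance observations.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
gp.fit(X, y)

# Query the implicit surface on a grid; the shape estimate is the zero
# level set of the posterior mean, with the posterior std as uncertainty.
gx, gy = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
mean, std = gp.predict(grid, return_std=True)
near_surface = grid[np.abs(mean) < 0.05]   # approximate contour samples
print(f"{len(near_surface)} grid points near the estimated surface")

The non-parametric character shows up directly: the estimate tightens wherever new contact measurements arrive, and the posterior standard deviation quantifies shape uncertainty in untouched regions. For the pose side, the following toy pose-graph sketch illustrates sparse nonlinear incremental optimization using GTSAM's iSAM2; the abstract does not name a specific solver, so the library choice and the between-factor stand-ins for frictional pushing constraints are assumptions.

# A minimal incremental pose-graph sketch with GTSAM's iSAM2 (an assumption:
# the abstract does not specify the solver; this only illustrates updating a
# sparse nonlinear estimate as pushing measurements arrive).
import numpy as np
import gtsam

isam = gtsam.ISAM2()
graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()

noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.02]))

# Anchor the first object pose, then chain relative-motion factors
# (hypothetical stand-ins for the pushing-mechanics constraints).
values.insert(0, gtsam.Pose2(0.0, 0.0, 0.0))
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0.0, 0.0, 0.0), noise))

for k in range(1, 5):
    delta = gtsam.Pose2(0.1, 0.0, 0.05)                      # measured motion
    graph.add(gtsam.BetweenFactorPose2(k - 1, k, delta, noise))
    values.insert(k, gtsam.Pose2(0.1 * k, 0.0, 0.05 * k))    # initial guess
    isam.update(graph, values)                               # incremental solve
    graph, values = gtsam.NonlinearFactorGraph(), gtsam.Values()

print(isam.calculateEstimate().atPose2(4))

Each update passes only the new factors and initial guesses, which is what keeps the per-step optimization sparse and cheap enough for online use.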

BibTeX

@workshop{Suresh-2020-126776,
  author    = {Sudharshan Suresh and Joshua G. Mangelson and Michael Kaess},
  title     = {Incremental shape and pose estimation from planar pushing using contact implicit surfaces},
  booktitle = {Proceedings of ICRA '20 Workshop on Closing the Perception-Action Loop with Vision and Tactile Sensing (ViTac '20)},
  year      = {2020},
  month     = {May},
}