Robot Learning for Manipulation of Granular Materials Using Vision and Sound

Samuel Clarke
Master's Thesis, Tech. Report CMU-RI-TR-19-68, June 2019

Abstract

Granular materials are ubiquitous in household and industrial manipulation tasks, but their dynamics are difficult to model analytically or through simulation. During manipulation, they provide rich multimodal sensory feedback. We present a robotic system we constructed for investigating manipulation of granular materials.

We then present two data-driven, learning-based frameworks for controlling the scooping and pouring of granular materials. In our first set of experiments, we focus on the task of scooping granular materials and propose to learn a model of the relevant granular material dynamics through a data-driven approach based on neural networks. In our second set of experiments, we demonstrate a novel framework for using audio feedback during manipulation of granular materials.


@mastersthesis{Clarke-2019-117117,
author = {Samuel Clarke},
title = {Robot Learning for Manipulation of Granular Materials Using Vision and Sound},
year = {2019},
month = {June},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-19-68},
keywords = {machine learning, audio, vibrations, granular materials, robotics, manipulation, vision, sound},
}