Robot Learning for Manipulation of Granular Materials Using Vision and Sound

Master's Thesis, Tech. Report CMU-RI-TR-19-68, Robotics Institute, Carnegie Mellon University, June 2019

Abstract

Granular materials are ubiquitous in household and industrial manipulation tasks, but their dynamics are difficult to model analytically or through simulation. During manipulation, however, they provide rich multimodal sensory feedback. We present a robotic system we constructed to investigate the manipulation of granular materials.

We present two data-driven, learning-based frameworks for controlling the scooping and pouring of granular materials. In our first set of experiments, we focus on the task of scooping granular materials and propose to learn a model of the relevant granular material dynamics with neural networks trained on interaction data. In our second set of experiments, we demonstrate a novel framework for using audio feedback during manipulation of granular materials.

BibTeX

@mastersthesis{Clarke-2019-117117,
author = {Samuel Clarke},
title = {Robot Learning for Manipulation of Granular Materials Using Vision and Sound},
year = {2019},
month = {June},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-19-68},
keywords = {machine learning, audio, vibrations, granular materials, robotics, manipulation, vision, sound},
}