
Leveraging Multimodal Haptic Sensory Data for Robust Cutting

Kevin Zhang, Mohit Sharma, Manuela Veloso, and Oliver Kroemer
Conference Paper, Proceedings of IEEE-RAS International Conference on Humanoid Robots (Humanoids '19), pp. 409-416, October 2019

Abstract

Cutting is a common form of manipulation when working with divisible objects such as food, rope, or clay. Cooking in particular relies heavily on cutting to divide food items into desired shapes. However, cutting food is a challenging task because food items exhibit a wide range of material properties. Due to this variability, the same cutting motions cannot be used for all food items. Sensations from contact events, e.g., when placing the knife on the food item, will also vary depending on the material properties, and the robot will need to adapt accordingly.

In this paper, we propose using vibrations and force-torque feedback from the interactions to adapt the slicing motions and monitor for contact events. The robot learns neural networks for performing each of these tasks and generalizing across different material properties. By adapting and monitoring the skill executions, the robot is able to reliably cut through more than 20 different types of food items and even detect whether certain food items are fresh or old.
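The page does not include code, but the core idea of fusing the two haptic modalities can be made concrete with a small sketch. Below is a minimal, hypothetical PyTorch model (not the authors' implementation) that combines a vibration feature vector with a 6-axis force-torque reading to classify contact events. All names and dimensions (ContactEventNet, the 257-bin spectrum, the three event classes) are illustrative assumptions, not details from the paper.

    # Minimal sketch, assuming a PyTorch setup. One branch encodes
    # vibration features (e.g., a magnitude spectrum from a contact
    # microphone), another encodes a 6-axis force-torque wrench, and
    # the fused embedding is classified into hypothetical contact-event
    # labels such as {in air, contact, cutting}.
    import torch
    import torch.nn as nn

    class ContactEventNet(nn.Module):
        def __init__(self, vib_dim=257, ft_dim=6, n_events=3):
            super().__init__()
            # Vibration branch: hypothetical 257-bin spectrum input.
            self.vib_branch = nn.Sequential(
                nn.Linear(vib_dim, 64), nn.ReLU(),
                nn.Linear(64, 32), nn.ReLU(),
            )
            # Force-torque branch: (Fx, Fy, Fz, Tx, Ty, Tz).
            self.ft_branch = nn.Sequential(
                nn.Linear(ft_dim, 16), nn.ReLU(),
            )
            # Fusion and classification head over event classes.
            self.head = nn.Sequential(
                nn.Linear(32 + 16, 32), nn.ReLU(),
                nn.Linear(32, n_events),
            )

        def forward(self, vib, ft):
            z = torch.cat([self.vib_branch(vib), self.ft_branch(ft)], dim=-1)
            return self.head(z)  # logits over contact-event classes

    # Usage with random stand-in data (real inputs would come from the
    # robot's microphone and wrist force-torque sensor).
    model = ContactEventNet()
    vib = torch.randn(8, 257)   # batch of vibration feature vectors
    ft = torch.randn(8, 6)      # batch of force-torque readings
    logits = model(vib, ft)
    print(logits.shape)         # torch.Size([8, 3])

A late-fusion design like this keeps the two modalities in separate encoders so each branch can be sized to its input; the paper's actual architectures and training setup may differ.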

BibTeX

@conference{Zhang-2019-118480,
  author    = {Kevin Zhang and Mohit Sharma and Manuela Veloso and Oliver Kroemer},
  title     = {Leveraging Multimodal Haptic Sensory Data for Robust Cutting},
  booktitle = {Proceedings of IEEE-RAS International Conference on Humanoid Robots (Humanoids '19)},
  year      = {2019},
  month     = {October},
  pages     = {409--416},
}