
Towards Safe Reinforcement Learning in the Real World

Master's Thesis, Tech. Report, CMU-RI-TR-19-56, Robotics Institute, Carnegie Mellon University, July, 2019

Abstract

Controlling mobile robots at high speed on slippery, rough terrain is difficult. One approach to designing controllers for complex, non-uniform dynamics in unstructured environments is to use model-free, learning-based methods. However, these methods often lack the notion of safety needed to deploy the resulting controllers without danger, and hence have rarely been tested successfully on real-world tasks. In this work, we present methods and techniques that allow model-free, learning-based methods to learn low-level controllers for mobile robot navigation while minimizing violations of user-defined safety constraints. We show these learned controllers working robustly both in simulation and in the real world, on a 1:10-scale RC car as well as on a full-size vehicle, the MRZR.
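To make the constraint-handling idea concrete, the sketch below shows one common way to fold user-defined safety constraints into model-free RL: a Lagrangian relaxation, where each violation incurs a cost and a multiplier is adapted by dual ascent. This is a minimal illustrative assumption, not necessarily the method used in the thesis; the function names, cost budget, and learning rate are hypothetical.

    # Hedged sketch: Lagrangian-style handling of safety costs in model-free RL.
    # Illustrative only; not the thesis's actual algorithm.
    import numpy as np

    def penalized_return(rewards, costs, lam, gamma=0.99):
        """Discounted return of task reward minus lambda-weighted safety cost."""
        g = 0.0
        for r, c in zip(reversed(rewards), reversed(costs)):
            g = (r - lam * c) + gamma * g
        return g

    def dual_ascent_step(lam, episode_cost, cost_limit, lr=0.01):
        """Raise lambda when the episode's safety cost exceeds the budget,
        lower it otherwise; lambda stays non-negative."""
        return max(0.0, lam + lr * (episode_cost - cost_limit))

    # Toy usage with made-up episode data.
    rewards = [1.0, 0.5, 0.8]   # task reward per step
    costs   = [0.0, 1.0, 0.0]   # 1.0 marks a constraint violation
    lam = 0.5
    print("penalized return:", penalized_return(rewards, costs, lam))
    lam = dual_ascent_step(lam, episode_cost=sum(costs), cost_limit=0.2)
    print("updated lambda:", lam)

Under this formulation, the policy optimizer maximizes the penalized return while the multiplier update keeps expected safety cost near the user-specified budget.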

BibTeX

@mastersthesis{Ahn-2019-117213,
author = {Edward Ahn},
title = {Towards Safe Reinforcement Learning in the Real World},
year = {2019},
month = {July},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-19-56},
}