
Accurate and flexible simulation for dynamic, vision-centric robots

Jared Go, Brett Browning, and Manuela Veloso
Conference Paper, Proceedings of the 3rd International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS '04), Vol. 3, pp. 1388–1389, July 2004

Abstract

As robots become more complex by incorporating dynamic stability or greater mechanical degrees of freedom, the difficulty of developing control algorithms directly on the robot increases. This is especially true for large or expensive robots, where damage is costly, or where communication bandwidth limits in-depth debugging. One effective solution to this problem is a flexible, physically accurate simulation environment that allows experimentation with the physical composition and control systems of one or more robots in a controlled virtual setting. While many robot simulation environments are available today, we find that achieving accurate simulation of complex, vision-centric platforms such as the Segway RMP or Sony AIBO requires accurate modeling of latency and robust synchronization. Building on our previous work, we present an open-source simulation framework, UberSim, and demonstrate its ability to simulate vision-centric, balancing robots in a realistic fashion. The focus of this simulation environment is on accurate simulation with high-frequency control loops and flexible configuration of robot structure and parameters via a client-side definition language.
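The paper does not detail UberSim's latency model here, but the core idea the abstract describes, that a vision-centric simulator must delay observations rather than give the controller instantaneous state, can be sketched with a hypothetical `DelayedSensor` class. It models sensor latency as a fixed delay of N control steps: an observation produced at step k first becomes visible to the controller at step k + N.

```python
from collections import deque

class DelayedSensor:
    """Sketch of per-step sensor latency modeling in a simulation loop.

    An observation enqueued at step k is first returned at step
    k + delay_steps, mimicking the camera-capture and image-processing
    delay a real vision-centric robot experiences. (Illustrative only;
    not UberSim's actual implementation.)
    """

    def __init__(self, delay_steps):
        self.delay_steps = delay_steps  # latency expressed in control steps
        self._queue = deque()           # observations in flight
        self.latest = None              # most recent observation delivered

    def step(self, observation):
        """Advance one control step: enqueue the fresh observation and
        release the one whose latency has elapsed, if any."""
        self._queue.append(observation)
        if len(self._queue) > self.delay_steps:
            self.latest = self._queue.popleft()
        return self.latest

# At a 100 Hz control loop, a 30 ms vision pipeline is 3 steps of delay:
sensor = DelayedSensor(delay_steps=3)
seen = [sensor.step(k) for k in range(6)]
# The controller sees nothing for the first 3 steps, then stale data:
# seen == [None, None, None, 0, 1, 2]
```

Keeping the delay in whole control steps keeps the simulation deterministic and reproducible, which matters when the controller under test runs a high-frequency loop synchronized to the simulator clock.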

BibTeX

@conference{Go-2004-16924,
author = {Jared Go and Brett Browning and Manuela Veloso},
title = {Accurate and flexible simulation for dynamic, vision-centric robots},
booktitle = {Proceedings of the 3rd International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS '04)},
year = {2004},
month = {July},
volume = {3},
pages = {1388--1389},
}