Accurate and Flexible Simulation for Dynamic, Vision-Centric Robots
Journal contribution posted on 01.01.2004, 00:00, authored by Jared Go, Brett Browning, Manuela M. Veloso
As robots become more complex by incorporating dynamic stability or greater mechanical degrees of freedom, the difficulty of developing control algorithms directly on the robot increases. This is especially true for large or expensive robots, where damage is costly, or where communication bandwidth limits in-depth debugging. One effective solution to this problem is a flexible, physically accurate simulation environment that allows experimentation with the physical composition and control systems of one or more robots in a controlled virtual setting. While many robot simulation environments are available today, we find that accurately simulating complex, vision-centric platforms such as the Segway RMP or the Sony AIBO requires accurate modeling of latency and robust synchronization. Building on our previous work, we present an open-source simulation framework, ÜberSim, and demonstrate its ability to simulate vision-centric, balancing robots in a realistic fashion. This simulation environment focuses on accurate simulation with high-frequency control loops and on flexible configuration of robot structure and parameters via a client-side definition language.