Perceptual Interpretation for Autonomous Navigation through Dynamic Imitation Learning
David Silver, J. Andrew Bagnell, Anthony Stentz
DOI: 10.1184/R1/6557435.v1
https://kilthub.cmu.edu/articles/journal_contribution/Perceptual_Interpretation_for_Autonomous_Navigation_through_Dynamic_Imitation_Learning/6557435

<p>Achieving high-performance autonomous navigation is a central goal of field robotics. Efficient navigation by a mobile robot depends not only on the individual performance of its perception and planning systems, but also on how well those systems are coupled. When the perception problem is clearly defined, as in well-structured environments, this coupling (in the form of a cost function) is also well defined. However, as environments become less structured and more difficult to interpret, more complex cost functions are required, increasing the difficulty of their design. Recently, a class of machine learning techniques has been developed that relies on expert demonstration to learn a function mapping perceptual data to costs. These algorithms choose the cost function such that the robot’s planned behavior mimics an expert’s demonstration as closely as possible. In this work, we extend these methods to address the challenge of dynamic and incomplete online perceptual data, as well as noisy and imperfect expert demonstration. We validate our approach on a large-scale outdoor robot with hundreds of kilometers of autonomous navigation through complex natural terrains.</p>

2009-08-01
Robotics
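The abstract's core idea (choosing a cost function so that the robot's planned path matches an expert's demonstrated path) can be sketched as a maximum-margin-planning-style subgradient update over a linear cost model. The grid world, per-cell features, expert path, and learning rate below are illustrative assumptions for exposition, not the paper's actual perception system or training data.

```python
# Minimal sketch: learn costmap weights from an expert demonstration.
# All problem data here (grid, features, expert path) are assumed
# placeholders, not taken from the paper.
import heapq
import numpy as np

rng = np.random.default_rng(0)
H, W, F = 8, 8, 3
features = rng.random((H, W, F))   # per-cell perceptual feature vectors

def cell_costs(w):
    # Linear cost model: cost(cell) = w . features(cell), kept positive.
    return np.maximum(features @ w, 1e-3)

def plan(costs, start=(0, 0), goal=(7, 7)):
    # Dijkstra over the 4-connected grid; cost is paid on cell entry.
    dist, prev, pq = {start: 0.0}, {}, [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist[u]:
            continue
        r, c = u
        for v in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if 0 <= v[0] < H and 0 <= v[1] < W:
                nd = d + costs[v]
                if nd < dist.get(v, np.inf):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
    path, u = [goal], goal
    while u != start:
        u = prev[u]
        path.append(u)
    return path[::-1]

def path_features(path):
    # Sum of feature vectors along a path.
    return sum(features[r, c] for r, c in path)

# Assumed expert demonstration: top edge, then right edge.
expert = [(0, c) for c in range(W)] + [(r, W - 1) for r in range(1, H)]

w = np.ones(F)  # initial cost weights
for step in range(50):
    planned = plan(cell_costs(w))
    # Subgradient step: raise the cost of features the planner used,
    # lower the cost of features the expert used.
    w += 0.05 * (path_features(planned) - path_features(expert))
    w = np.maximum(w, 1e-3)  # keep all cell costs nonnegative

planned = plan(cell_costs(w))
print("planned/expert overlap:",
      len(set(planned) & set(expert)) / len(expert))
```

When the planner's path and the expert's path use the same features, the update vanishes, which is the sense in which the learned cost function makes planned behavior mimic the demonstration.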