Perceptual Interpretation for Autonomous Navigation through Dynamic Imitation Learning

Achieving high-performance autonomous navigation is a central goal of field robotics. Efficient navigation by a mobile robot depends not only on the individual performance of its perception and planning systems, but also on how well these systems are coupled. When the perception problem is clearly defined, as in well-structured environments, this coupling (in the form of a cost function) is also well defined. However, as environments become less structured and more difficult to interpret, more complex cost functions are required, and designing them becomes correspondingly harder. Recently, a class of machine learning techniques has been developed that relies on expert demonstration to learn a function mapping perceptual data to costs. These algorithms choose the cost function such that the robot's planned behavior mimics an expert's demonstration as closely as possible. In this work, we extend these methods to address the challenges of dynamic and incomplete online perceptual data, as well as noisy and imperfect expert demonstration. We validate our approach on a large-scale outdoor robot over hundreds of kilometers of autonomous navigation through complex natural terrain.
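The core idea of choosing a cost function so that the planner reproduces an expert's demonstrated path can be sketched with a maximum-margin-planning-style subgradient update: if the planner's current best path uses different terrain features than the expert's path, the feature weights are adjusted until the expert's path becomes the cheapest one. The toy grid, features, weights, and learning rate below are illustrative assumptions, not the system described in the abstract.

```python
import heapq

def dijkstra(cost, start, goal):
    """Min-cost 4-connected grid path; cost is charged on entering a cell."""
    rows, cols = len(cost), len(cost[0])
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist[u]:
            continue
        r, c = u
        for v in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= v[0] < rows and 0 <= v[1] < cols:
                nd = d + cost[v[0]][v[1]]
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

def feature_counts(path, feats):
    """Sum per-cell features over the cells a path enters (start excluded)."""
    phi = [0.0, 0.0]
    for r, c in path[1:]:
        phi[0] += feats[r][c][0]
        phi[1] += feats[r][c][1]
    return phi

# Hypothetical 2x4 world: feature 0 = vegetation indicator, feature 1 = unit distance.
feats = [[(0, 1), (1, 1), (1, 1), (0, 1)],
         [(0, 1), (0, 1), (0, 1), (0, 1)]]
start, goal = (0, 0), (0, 3)
# Expert demonstration detours around the vegetation in the top row.
expert = [(0, 0), (1, 0), (1, 1), (1, 2), (1, 3), (0, 3)]

w = [0.1, 1.0]  # initial weights: vegetation is nearly free, so the planner cuts through it
for _ in range(10):
    cost = [[max(1e-3, w[0] * f[0] + w[1] * f[1]) for f in row] for row in feats]
    planned = dijkstra(cost, start, goal)
    phi_e = feature_counts(expert, feats)
    phi_p = feature_counts(planned, feats)
    # Subgradient step: raise the weight of features the expert avoids relative
    # to the planner, lower the weight of features the expert uses more.
    w = [max(1e-3, wi - 0.1 * (pe - pp)) for wi, pe, pp in zip(w, phi_e, phi_p)]

cost = [[max(1e-3, w[0] * f[0] + w[1] * f[1]) for f in row] for row in feats]
print(dijkstra(cost, start, goal) == expert)  # the learned costs reproduce the expert path
```

After a few updates the vegetation weight overtakes the distance weight and the planner takes the same detour the expert did; in the full problem the linear feature map is replaced by a learned function over rich perceptual features, and the demonstrations are noisy, which is the gap this work addresses.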
