From Situations to Actions: Motion Behavior Learning by Self-Organization

We show that the self-organization principle, implementable in artificial neural networks, is highly useful in connection with autonomous robots. Equipped with a self-organizing controller, a mobile robot can learn sensor-action behaviors that are difficult to realize otherwise. We further show that sensory information from several sources can be combined so that the resulting representation is directly applicable to higher-level operations such as navigation and obstacle avoidance. The performance of the approach is demonstrated in two examples: one in which the sensor information guides the robot around a corner, and another in which the robot must navigate between two points while avoiding obstacles.
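To make the idea of a self-organizing sensor-action controller concrete, the following is a minimal sketch of one common realization: a one-dimensional self-organizing map trained on joint sensor-action vectors, from which an action is later recalled by matching on the sensor part only. All names, the map size, and the decay schedules are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def train_som(samples, n_units=10, n_epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D self-organizing map on joint sensor-action vectors.

    Each training sample concatenates a sensor reading with the action
    taken in that situation, so a trained unit stores a prototype
    situation-action pair. (Illustrative sketch; parameters are assumptions.)
    """
    rng = np.random.default_rng(seed)
    dim = samples.shape[1]
    weights = rng.standard_normal((n_units, dim))
    n_steps = n_epochs * len(samples)
    t = 0
    for _ in range(n_epochs):
        for x in samples:
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)              # linearly decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighborhood radius
            # Best-matching unit: closest prototype to the full input vector.
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Neighborhood function on the 1-D map grid.
            d = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            t += 1
    return weights

def recall_action(weights, sensor, sensor_dim):
    """Match on the sensor components only and read out the stored action."""
    bmu = np.argmin(np.linalg.norm(weights[:, :sensor_dim] - sensor, axis=1))
    return weights[bmu, sensor_dim:]
```

In use, the robot would log (sensor, action) pairs while being guided through a situation such as a corner turn, train the map on those pairs, and then drive autonomously by calling `recall_action` on fresh sensor readings.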