Noise maps for acoustically sensitive navigation

Robotic applications increasingly equip robots with microphones to enrich the sensory information available to them. In most of these applications, however, the auditory task is very low-level: the robot merely processes data and reports auditory events to higher-level navigation routines. If the robot, and therefore the microphone, ends up in a poor acoustic location, the signal from that sensor will remain noisy and potentially useless for accomplishing the required task. There are at least two ways to address this problem. The first, the traditional signal-processing approach, is to apply larger and more complex filters. The alternative is to move the robot to a location that provides better audition. This work follows the second approach by introducing noise maps as a tool for acoustically sensitive navigation. A noise map is a guide to noise in the environment, pinpointing locations that would most likely interfere with auditory sensing. In the acoustic sense, a traditional noise map is a graphical display of the average sound pressure level at each location; an area of high sound pressure level corresponds to high ambient noise that could interfere with an auditory application. Such maps can be created by hand, or by letting the robot first explore the environment. Converted into a potential field, a noise map then becomes a useful tool for reducing interference from ambient noise. Preliminary results on the creation and use of noise maps with a real robot are presented.
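The two steps the abstract describes can be sketched concretely: accumulate sound pressure level (SPL) readings into a per-cell average to form the noise map, then treat that map as a repulsive potential whose negated gradient pushes the robot toward quieter cells. The grid representation, the finite-difference gradient, and all function names below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def build_noise_map(samples, shape):
    """Average sound pressure level (dB) per grid cell.

    samples: iterable of (row, col, spl_db) readings taken while the
    robot explores; cells visited more than once are averaged.
    (Hypothetical representation -- the paper does not specify a grid.)
    """
    total = np.zeros(shape)
    count = np.zeros(shape)
    for r, c, spl in samples:
        total[r, c] += spl
        count[r, c] += 1
    noise = np.zeros(shape)
    visited = count > 0
    noise[visited] = total[visited] / count[visited]
    return noise

def repulsive_gradient(noise_map, pos):
    """Negated finite-difference gradient of the noise field at pos,
    so the resulting vector points away from loud regions and toward
    quieter cells (the noise map acting as a repulsive potential)."""
    gy, gx = np.gradient(noise_map)
    r, c = pos
    return -gy[r, c], -gx[r, c]
```

In a full navigation stack this repulsive vector would be summed with the usual attractive goal vector of a potential-field planner, biasing the path around noisy regions rather than forbidding them outright.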
