A SLAM-Based Semantic Indoor Navigation System for Visually Impaired Users

This paper proposes a novel assistive navigation system based on simultaneous localization and mapping (SLAM) and semantic path planning to help visually impaired users navigate indoor environments. The system integrates multiple wearable sensors and feedback devices: an RGB-D sensor and an inertial measurement unit (IMU) worn at the waist, a head-mounted camera, a microphone, and an earphone/speaker. We develop a visual odometry algorithm based on RGB-D data to estimate the user's position and orientation, and refine the orientation estimate using the IMU. The head-mounted camera recognizes door numbers, and the RGB-D sensor detects major landmarks such as corridor corners. By matching the detected landmarks against the corresponding features on a digitized floor map, the system localizes the user and provides verbal instructions to guide the user to the desired destination. The software modules of our system are implemented in the Robot Operating System (ROS). The prototype of the proposed assistive navigation system is evaluated by blindfolded sighted persons. The field tests confirm the feasibility of the proposed algorithms and the system prototype.
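The abstract states that orientation from RGB-D visual odometry is refined with the IMU but does not specify the fusion method. A common choice for this kind of refinement is a complementary filter that trusts the gyroscope over short horizons and the visual-odometry heading over long horizons; the sketch below illustrates that idea for the yaw angle only. The function name, the blending constant `alpha`, and the single-axis formulation are illustrative assumptions, not the paper's actual algorithm.

```python
import math

def fuse_yaw(vo_yaw, gyro_rate, prev_yaw, dt, alpha=0.98):
    """Complementary-filter yaw fusion (illustrative sketch).

    vo_yaw    -- heading estimate from visual odometry (rad)
    gyro_rate -- IMU angular rate about the vertical axis (rad/s)
    prev_yaw  -- fused heading from the previous step (rad)
    dt        -- time step (s)
    alpha     -- blend factor: weight on the gyro-integrated heading
    """
    # Short-term: integrate the gyro rate from the previous fused heading.
    gyro_yaw = prev_yaw + gyro_rate * dt
    # Long-term: pull toward the visual-odometry heading to bound drift.
    fused = alpha * gyro_yaw + (1.0 - alpha) * vo_yaw
    # Wrap the result to [-pi, pi).
    return math.atan2(math.sin(fused), math.cos(fused))
```

In practice such a filter would run once per IMU sample, with `vo_yaw` updated at the (slower) visual-odometry rate; a full system would fuse all three orientation axes, e.g. with an extended Kalman filter.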
