AI-Powered Smart Urban Navigation and Safety Alert System for Visually Impaired Pedestrians
* 1 , * 2 , 2 , 3 , 2
1 Department of Artificial Intelligence and Data Science, P R Pote Patil College of Engineering and Management, Amravati 444604, Maharashtra, India
2 Department of Computer Science and Engineering, P R Pote Patil College of Engineering and Management, Amravati 444604, Maharashtra, India
3 Department of Civil Engineering, P R Pote Patil College of Engineering and Management, Amravati 444604, Maharashtra, India
Academic Editor: Lucia Billeci

Abstract:

Navigating urban environments remains a significant challenge for visually impaired individuals due to dynamic obstacles, unclear signage, and inconsistent auditory cues. This research presents an AI-powered smart urban assistive system designed to provide real-time navigation guidance and safety alerts to visually impaired pedestrians. The system integrates computer vision, sensor fusion, GPS-based routing, and audio feedback mechanisms to create a wearable and responsive mobility aid.
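
As a minimal illustrative sketch, the pipeline described above could be wired together roughly as follows. All names here (VisionDetector, UltrasonicSensor, fuse, speak, navigation_step) are hypothetical placeholders rather than the authors' implementation; a deployed system would back them with an on-device detection model, ranging hardware, a GPS routing module, and a text-to-speech engine.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g., "person", "pole", "vehicle"
    distance_m: float   # estimated distance to the obstacle
    confidence: float   # detector confidence in [0, 1]

class VisionDetector:
    """Placeholder for a camera-based obstacle detector (e.g., a CNN)."""
    def detect(self, frame) -> list[Detection]:
        # A real system would run an object-detection model on `frame` here.
        return []

class UltrasonicSensor:
    """Placeholder for a ranging sensor used to cross-check vision estimates."""
    def read_distance_m(self) -> float:
        return float("inf")

def fuse(detections: list[Detection], range_m: float) -> Detection | None:
    """Simple fusion rule: keep the nearest vision detection, clamp its distance to the sensor reading."""
    if not detections:
        return None
    nearest = min(detections, key=lambda d: d.distance_m)
    return Detection(nearest.label, min(nearest.distance_m, range_m), nearest.confidence)

def speak(message: str) -> None:
    """Stand-in for the audio feedback channel (a TTS engine in practice)."""
    print(f"[AUDIO] {message}")

def navigation_step(frame, detector, sensor, next_turn: str) -> None:
    """One loop iteration: detect, fuse, and announce guidance or a warning."""
    obstacle = fuse(detector.detect(frame), sensor.read_distance_m())
    if obstacle and obstacle.distance_m < 2.0 and obstacle.confidence > 0.5:
        speak(f"Caution: {obstacle.label} about {obstacle.distance_m:.0f} metres ahead.")
    else:
        speak(f"Path clear. {next_turn}")

if __name__ == "__main__":
    navigation_step(frame=None,
                    detector=VisionDetector(),
                    sensor=UltrasonicSensor(),
                    next_turn="Continue straight for 50 metres.")

The fusion rule shown (clamping the vision-estimated distance to the ultrasonic reading) is only one simple possibility; the actual system may weight the modalities differently.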

A prototype was deployed in both controlled campus settings and real-world urban scenarios involving 20 visually impaired users across various navigation tasks (e.g., crossing roads, avoiding obstacles, locating entrances). The proposed system achieved 93.2% obstacle-detection accuracy, 87.6% route adherence, and a 95% user-satisfaction rating for perceived safety, significantly outperforming traditional cane-based mobility methods.
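
For readers interested in how such headline figures are typically derived, the sketch below shows one plausible way to compute obstacle-detection accuracy and route adherence as per-trial ratios from logged outcomes. The log structure and the numbers in the example are assumptions for illustration only, not the study's evaluation code or data.

from dataclasses import dataclass

@dataclass
class Trial:
    obstacles_present: int     # obstacles encountered during the trial
    obstacles_detected: int    # obstacles the system correctly flagged
    waypoints_planned: int     # waypoints on the planned route
    waypoints_followed: int    # waypoints the user reached in the intended order

def obstacle_detection_accuracy(trials: list[Trial]) -> float:
    """Fraction of encountered obstacles that were correctly detected."""
    detected = sum(t.obstacles_detected for t in trials)
    present = sum(t.obstacles_present for t in trials)
    return detected / present if present else 0.0

def route_adherence(trials: list[Trial]) -> float:
    """Fraction of planned waypoints reached in the intended order."""
    followed = sum(t.waypoints_followed for t in trials)
    planned = sum(t.waypoints_planned for t in trials)
    return followed / planned if planned else 0.0

# Example with made-up numbers (not the study's data):
trials = [Trial(10, 9, 8, 7), Trial(12, 11, 10, 9)]
print(f"Obstacle detection: {obstacle_detection_accuracy(trials):.1%}")
print(f"Route adherence:    {route_adherence(trials):.1%}")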

In conclusion, the AI-driven smart assistive system demonstrates strong potential to enhance urban mobility for visually impaired pedestrians. It offers users greater independence, real-time situational awareness, and reduced anxiety when navigating complex environments. The modular, scalable design allows adaptation to varied urban infrastructures and user preferences. Future work will explore edge-AI integration for offline inference, voice-command interfaces, and context-aware navigation recommendations based on pedestrian density and time of day, contributing toward inclusive smart cities in line with Sustainable Development Goal 11 (Sustainable Cities and Communities).

Keywords: computer vision; sensor fusion; GPS-based routing; smart navigation system