Intelligent Robot Guidance in Fixed External Camera Network for Navigation in Crowded and Narrow Passages

Published: 15 November 2016 by MDPI in 3rd International Electronic Conference on Sensors and Applications, session Sensors Networks

Abstract:
Autonomous service robots are increasingly used for cleaning, delivery, patrolling, inspection, and similar tasks. These robots often navigate the same passages that people use to reach specific areas. To avoid collisions with obstacles, people, and other moving robots, they are equipped with visual sensors and laser- or sonar-based range sensors. However, these sensors have a limited range and are often mounted low (typically near the robot base), which limits the detection of distant obstacles. In addition, these sensors face forward, so the robot is often 'blind' to objects (e.g., people and other robots) moving behind it, which increases the chance of collision. We propose to use a network of external cameras fixed on the ceiling (e.g., surveillance cameras) to guide the robots by informing them about moving obstacles approaching from behind and from distant regions. This gives the robot a 'bird's-eye view' of the navigation space, enabling it to make decisions in real time to avoid obstacles efficiently. The camera sensor network can also notify the robots about moving obstacles around blind turns. We further propose a mutex-based resource sharing scheme in the camera sensor network that allows multiple robots to intelligently share narrow passages through which only one robot or person can pass at a given time. Experimental results in simulated and real scenarios show that the proposed method is effective for robot navigation in crowded and narrow passages.
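The abstract does not spell out the mutex-based sharing scheme, but its core idea can be sketched as a central arbiter in the camera network that grants a narrow passage to one robot at a time and queues the rest. The following is a minimal illustrative sketch, not the authors' implementation; the class and method names (`PassageArbiter`, `request`, `release`) are hypothetical.

```python
from collections import deque

class PassageArbiter:
    """Hypothetical sketch of a mutex-based narrow-passage sharing scheme:
    the camera network grants the passage to one robot at a time; other
    robots queue in FIFO order until the holder reports it has exited."""

    def __init__(self):
        self.holder = None       # robot currently granted the passage
        self.waiting = deque()   # robots queued for entry, FIFO

    def request(self, robot_id):
        """Robot asks to enter the passage; returns True if entry is granted now."""
        if self.holder is None:
            self.holder = robot_id
            return True
        if robot_id == self.holder:
            return True          # already holds the passage
        if robot_id not in self.waiting:
            self.waiting.append(robot_id)
        return False             # must wait; camera network will re-grant later

    def release(self, robot_id):
        """Robot reports it has exited; the next queued robot (if any) is granted entry."""
        if self.holder == robot_id:
            self.holder = self.waiting.popleft() if self.waiting else None
        return self.holder
```

In use, each robot would poll (or be notified by) the camera network before committing to a narrow passage, e.g. `arbiter.request("r1")` returns `True` only when the passage is free, and `arbiter.release("r1")` hands it to the next robot in the queue.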