By Kalwinder Kaur, May 21, 2012
Scientists at the University of Nevada, Reno have developed an indoor navigation system for blind people that combines motion planning with human-computer interaction. Called Navatar, the low-cost navigation system runs on an ordinary smartphone.
Conventional indoor navigation systems typically rely on radio-frequency tags installed in rooms and corridors, with handheld readers used to determine the user's location. Other systems depend on heavy, expensive sensors. Navatar, by contrast, uses low-cost sensors together with digital two-dimensional architectural maps to help users with visual impairments navigate a building. Based on the user's specific requirements, the system locates and tracks the user and provides turn-by-turn instructions to the destination.
The smartphone's sensors, which are used to count the steps the user takes, are currently prone to picking up false signals. To compensate, Navatar combines the ability of blind users to detect landmarks such as intersections, doors and elevators by touch with probabilistic algorithms that estimate the user's location. Directions are delivered through synthetic speech.
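The article does not disclose Navatar's actual implementation, but the approach it describes, noisy step counting corrected by user-confirmed landmarks, resembles a particle-filter-style localizer. The sketch below is purely illustrative under that assumption; all function names, parameters and values are hypothetical.

```python
# Hypothetical sketch of landmark-corrected step localization along a corridor.
# Not Navatar's code: an assumed particle-filter-style illustration only.
import random

def advance(particles, step_length=0.7, noise=0.2):
    """Move each particle forward by one noisy step (metres along the corridor)."""
    return [p + random.gauss(step_length, noise) for p in particles]

def confirm_landmark(particles, landmark_pos, tolerance=1.5):
    """Resample particles, favouring those near a landmark the user confirmed by touch."""
    weights = [1.0 if abs(p - landmark_pos) < tolerance else 0.05 for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """Location estimate: mean of the particle positions."""
    return sum(particles) / len(particles)

# Example: the user walks about 10 steps, then confirms a door known to be 7 m away.
particles = [0.0] * 200
for _ in range(10):
    particles = advance(particles)
particles = confirm_landmark(particles, landmark_pos=7.0)
print(f"Estimated position: {estimate(particles):.1f} m along the corridor")
```

In this kind of scheme, the touch-confirmed landmark sharply narrows the uncertainty that accumulates from imperfect step counting, which is consistent with how the researchers describe compensating for false sensor signals.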
The researchers are working to integrate the system with GPS-based outdoor navigation. The system may also benefit sighted users.
The researchers, Kostas Bekris and Eelke Folmer, received a PETA Proggy Award for Leadership in Ethical Science for their navigation system. They presented the system at the ACM SIGCHI Conference on Human Factors in Computing Systems and the IEEE International Conference on Robotics and Automation.