NavCog app helps the blind navigate surroundings

Posted on November 6, 2015, 8:48 AM by Contador Harrison

IBM Research and Carnegie Mellon University scientists have developed NavCog, an open platform meant to support the creation of smartphone apps that enable visually impaired users to better navigate their surroundings. The NavCog app is now available for free on the App Store. The app analyses signals from Bluetooth beacons located along walkways, together with data from smartphone sensors, to help users move without human assistance, whether inside campus buildings or outdoors. Researchers are also exploring additional capabilities for future versions of the app, such as detecting who is approaching and what mood they are in.

The researchers used the platform to create a pilot app that draws on existing sensors and cognitive technologies to inform visually impaired users on the Carnegie Mellon campus about their surroundings, either by vibrating their smartphones or by “whispering” into their ears through earbuds. The first set of cognitive assistance tools for app developers is now available via the cloud through IBM Bluemix. The open toolkit consists of a navigation app, a map editing tool and localisation algorithms that can help the blind identify in near real time where they are, which direction they are facing and additional information about the surrounding environment.
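The article doesn’t detail NavCog’s localisation algorithm, but beacon-based positioning of this kind typically estimates the distance to each beacon from its received signal strength (RSSI) and combines the estimates. The sketch below is a minimal, hypothetical illustration in Python, not NavCog’s actual code: it converts RSSI to distance with a log-distance path-loss model and takes a weighted centroid of the beacon positions. The beacon layout, TX power and path-loss exponent are all assumed values.

```python
import math

# Hypothetical beacons: (x, y) position in metres and calibrated
# TX power (RSSI measured at 1 m). Values are illustrative only.
BEACONS = {
    "hall-entrance": {"pos": (0.0, 0.0),  "tx_power": -59},
    "corridor-mid":  {"pos": (12.0, 0.0), "tx_power": -59},
    "stairwell":     {"pos": (6.0, 9.0),  "tx_power": -59},
}

PATH_LOSS_EXPONENT = 2.2  # ~2 in free space; higher indoors

def rssi_to_distance(rssi, tx_power, n=PATH_LOSS_EXPONENT):
    """Log-distance path-loss model: d = 10 ** ((tx_power - rssi) / (10 n))."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def estimate_position(readings):
    """Weighted centroid of beacon positions, weighting nearer beacons more."""
    wx = wy = total = 0.0
    for beacon_id, rssi in readings.items():
        beacon = BEACONS[beacon_id]
        d = rssi_to_distance(rssi, beacon["tx_power"])
        w = 1.0 / max(d, 0.1) ** 2   # nearer beacon -> larger weight
        x, y = beacon["pos"]
        wx += w * x
        wy += w * y
        total += w
    return (wx / total, wy / total)

# Example: the strong "hall-entrance" signal pulls the estimate toward it.
print(estimate_position({"hall-entrance": -62, "corridor-mid": -75, "stairwell": -80}))
```

Real deployments refine raw estimates like this with smoothing and map constraints, and NavCog additionally draws on the smartphone’s own sensors, as described above.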

According to the researchers, the computer vision navigation tool turns smartphone images of the surrounding environment into a 3-D space model to help improve localisation and navigation for the visually impaired. The combination of these technologies is known as “cognitive assistance,” an accessibility research field dedicated to helping the visually impaired regain information by augmenting missing or weakened abilities.

The researchers plan to add further localisation technologies, including sensor fusion, which integrates data from multiple environmental sensors to support more sophisticated cognitive functions, such as facial recognition in public places. They are also exploring the use of computer vision to characterise the activities of people in the vicinity, and ultrasonic technology to help identify locations more accurately. From localisation information to the understanding of objects, they said, they have been creating technologies to make the real-world environment more accessible for everyone.

“With Carnegie Mellon University’s long history of developing technologies for humans and robots that complement humans’ missing abilities to sense the surrounding world, this open platform will help expand the horizon for global collaboration and open up a new era of real-world accessibility for the visually impaired in the near future,” said a university source in the report.

Watch the video below for more.
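Sensor fusion, mentioned above, is a broad technique; one common, simple instance is fusing a gyroscope’s fast-but-drifting rotation rate with a compass’s noisy-but-absolute heading via a complementary filter, which is one way an app could track which direction a user is facing. The Python sketch below is purely illustrative and not NavCog’s implementation; the sample rate and blend factor are assumed.

```python
def fuse_heading(gyro_rates, compass_headings, dt=0.1, alpha=0.98):
    """
    Complementary filter: integrate the gyro rate for short-term accuracy,
    then nudge toward the compass heading to cancel long-term drift.
    gyro_rates:       angular rate in degrees/second, one value per sample
    compass_headings: absolute heading in degrees, one value per sample
    """
    heading = compass_headings[0]          # initialise from the compass
    for rate, compass in zip(gyro_rates, compass_headings):
        predicted = heading + rate * dt    # gyro integration (drifts slowly)
        # Blend, wrapping the compass correction into (-180, 180].
        error = (compass - predicted + 180) % 360 - 180
        heading = (predicted + (1 - alpha) * error) % 360
    return heading

# Example: a user turning right at ~10 deg/s with a noisy compass.
gyro = [10.1, 9.8, 10.3, 10.0]
compass = [0.0, 3.2, 1.5, 4.1]
print(round(fuse_heading(gyro, compass), 1))
```

The same blend-fast-and-slow idea generalises to the richer fusion the researchers describe, where many environmental sensors each contribute partial, imperfect information.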