Google has revealed an upcoming app that is specifically designed to help blind or visually impaired people become more independent by giving spoken notifications as they encounter objects, text and people around them.
The app is dubbed Lookout, and it was announced during the I/O developer conference earlier this week. Google's Lookout offers four modes: home, work & play, scan and experimental (the latter allowing users to test out features Google is still working on).
After the user selects a mode, Lookout delivers information relevant to that activity. For example, it can announce 'couch, 3 o'clock', meaning there is an object to be aware of to the user's right.
Much like Google's search engine, machine learning will make results more relevant as more people use the app. Google recommends that users wear their smartphone on a lanyard around their neck or carry it in a shirt pocket, with the camera pointing away from the body.
In the chosen mode, Lookout processes information from the surrounding environment, such as text from a recipe book or the location of a bathroom, an exit sign or a chair. The app then notifies the user with minimal interaction required, so people can stay engaged with their activity.