While Google I/O 2024 was largely dominated by AI announcements, particularly around Gemini, Global Accessibility Awareness Day brings additional AI-enhanced vision features to Google’s accessibility apps. These updates aim to improve communication for people with language or literacy challenges and to provide better navigation tools for low-vision users.
The Look to Speak app, introduced in 2020, is an eye-tracking application that lets users select words and phrases for the phone to speak aloud. Google’s recent update adds a range of emojis and symbols, and users can personalize which ones appear, making the app more versatile and inclusive, particularly for non-English speakers and others who don’t communicate through traditional text.
Google’s Lookout app, designed to help low-vision users identify objects, has also received significant updates. The new beta Find mode lets users locate items in a room by pointing their phone’s camera in their general direction. Users can select from categories such as “Seating & Tables,” “Doors & Windows,” and “Bathroom,” and the app guides them to the desired object by announcing its direction and distance.
Additionally, Lookout now features AI-generated descriptions of images taken within the app. This functionality is powered by a visual model developed by Google DeepMind, the same team behind Project Astra’s AI digital assistant.
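Lookout’s underlying model isn’t publicly exposed, but developers who want to prototype similar AI image descriptions can reach a comparable capability through Google’s public Gemini API. The Kotlin sketch below is illustrative only: the model name, prompt, and GEMINI_API_KEY placeholder are assumptions, not Lookout’s actual implementation.

```kotlin
import android.graphics.Bitmap
import com.google.ai.client.generativeai.GenerativeModel
import com.google.ai.client.generativeai.type.content

// Placeholder; supply a real key from Google AI Studio.
const val GEMINI_API_KEY = "your-api-key"

// Asks a Gemini vision model to describe a photo, similar in spirit to
// Lookout's AI-generated image descriptions (Lookout's own model and
// prompts are not public; this is a sketch against the public API).
suspend fun describeImage(photo: Bitmap): String? {
    val model = GenerativeModel(
        modelName = "gemini-1.5-flash", // assumed model choice
        apiKey = GEMINI_API_KEY,
    )
    val response = model.generateContent(
        content {
            image(photo)
            text("Describe this photo for a low-vision user in one or two sentences.")
        }
    )
    return response.text
}
```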
Google Maps is also expanding its accessibility features. Lens in Maps, which describes nearby places aloud in supported languages, now includes updated voice guidance and screen reader support. Wheelchair accessibility icons, previously available only on Android and iOS, are now shown on desktop as well. Users can filter reviews to find wheelchair-accessible locations, and accessibility information is now available for over 50 million places, thanks to contributions from Maps users. Businesses can also indicate whether they support Auracast for patrons who are deaf or hard of hearing.
Google’s Project Gameface, a hands-free control scheme that translates head movements and facial gestures into on-screen cursor actions, was initially available only on PCs. It is now being extended to Android developers for integration into mobile apps, broadening its accessibility reach.
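Gameface builds on MediaPipe’s Face Landmarker task, which reports per-gesture “blendshape” scores from the camera feed. A minimal Kotlin sketch of how an Android app might map one such gesture to an action follows; the blendshape choice, threshold, and performTap() action are illustrative assumptions, not Gameface’s actual code.

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Build a Face Landmarker that streams blendshape scores from live camera
// frames. "face_landmarker.task" is MediaPipe's downloadable model bundle.
fun createLandmarker(context: Context): FaceLandmarker {
    val options = FaceLandmarker.FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder().setModelAssetPath("face_landmarker.task").build()
        )
        .setOutputFaceBlendshapes(true)
        .setRunningMode(RunningMode.LIVE_STREAM)
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            onFaceResult(result)
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}

// Hypothetical gesture mapping: an open mouth above a tuned threshold
// triggers a tap. Gameface's real mappings are user-configurable.
fun onFaceResult(result: FaceLandmarkerResult) {
    val shapes = result.faceBlendshapes().orElse(null)?.firstOrNull() ?: return
    val jawOpen = shapes.firstOrNull { it.categoryName() == "jawOpen" }?.score() ?: 0f
    if (jawOpen > 0.6f) performTap()
}

fun performTap() { /* placeholder: dispatch the app's own action here */ }
```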
These updates reflect Google’s ongoing commitment to making technology more accessible and inclusive for all users. By integrating advanced AI capabilities into their existing apps, Google continues to enhance the daily lives of individuals with disabilities, making the digital world more navigable and interactive.
By Impact Lab