Google Lookout uses AI to describe surroundings for the visually impaired

Using similar underlying technology to Google Lens
Written by Campbell Kwan, Contributor

Google has launched its Lookout app, which uses artificial intelligence (AI) to help visually impaired users identify what is around them by pointing their phone at objects and receiving verbal feedback.

Lookout uses underlying technology similar to Google Lens, Google said in a blog post, to provide spoken feedback, earcons, or other continuous signals to the user. It also functions in the same way as Lens -- receiving information and providing feedback based on what is captured by the device's rear camera.

The app assists users in situations such as learning about a new space for the first time, reading text and documents, and completing daily routines such as cooking, cleaning, and shopping, Google said.

Lookout was first announced at last year's Google I/O developer conference. 

It is now available on the Google Play Store for all Pixel devices running Android 8.0 Oreo, and is currently only available in English.

To use Lookout optimally, Google recommended wearing the device on a lanyard around the neck or placing it in the front pocket of a shirt. Google noted, however, that Lookout is still a new technology and "will not always be 100 percent perfect".

"Lookout detects items in the scene and takes a best guess at what they are, reporting this to you," Google Accessibility Engineering product manager Patrick Clary said.

A screenshot of Lookout's modes, including "Explore," "Shopping," and "Quick read," alongside a second screenshot of Lookout detecting a dog in the camera frame.

(Image: Google)

The search engine giant has previously stressed the importance of designing for accessibility, and has released various accessibility apps of late to improve the user experience for people with disability.

It launched two new apps for Android last month, Live Transcribe and Sound Amplifier, which were designed to help the deaf and hard-of-hearing community.

Live Transcribe, as its name suggests, uses a phone's microphone to automatically transcribe real-world speech into captions in real time.

Sound Amplifier -- which was also announced at last year's Google I/O -- uses a phone and a set of headphones to filter, augment, and amplify sounds so that users can better hear conversations or announcements in noise-heavy environments.

Google also rolled out automated closed captioning to Google Slides for US English in October, a feature that Google said makes presentations more accessible to audiences that are deaf or hard of hearing.

Related Coverage

Google hinting at accessibility initiative for disabled?

Judging by today's Google logo (which spells "Google" in Braille), the company is either (a) celebrating Louis Braille's birthday (he was born on January 4, 1809), (b) about to engage in some new accessibility initiative, or (c) both. Anybody know?

Google Live Transcribe and Sound Amplifier to help deaf and hard-of-hearing community

The pair of apps will make audio more accessible, with Live Transcribe turning real-world speech into captions in real time, and Sound Amplifier distinguishing sounds using headphones.

Google brings real-time closed captioning to Slides

The feature will make presentations more accessible to audiences that are deaf or hard of hearing.

ACCAN launches Australia's first disability-friendly guide to telco products (TechRepublic)

Australia finally has a telecommunications products resource dedicated to people with disability.
