
Google's Live Transcribe can now spot laughter, clapping and dogs barking

The app can now identify sounds in the environment in addition to providing real-time transcription.
Written by Liam Tung, Contributing Writer

For Global Accessibility Awareness Day, Google has announced new features for its Live Transcribe app, which can now describe non-voice sounds in the environment, such as a dog barking, alongside its real-time captioning. 

Google launched Live Transcribe in February, offering hearing-impaired Android users captions of face-to-face conversations in real time. The feature can be enabled in Accessibility Settings on Android devices and can be enhanced by connecting an external microphone. 

Hearing aids can help, but depending on a person's specific hearing difficulties, they can also make certain background sounds, like a passing car, louder than the noises the wearer wants to hear, like a friend's voice. Plus, the devices are often incredibly expensive. 

Beyond live text transcription, the app is now able to add more context to the description of a sound environment through "sound events", Google announced on Thursday.  

These events are displayed at the bottom of the screen, much like tags. In the case of a dog barking, the word "dog" appears in a colored box; if someone nearby is whistling, that word appears in a separate box in a different color. 

The addition of sound events could be a big help in settings where hearing people rely on non-verbal sounds as social cues to join in on a moment that might last only seconds. Even for those with partial hearing, the moment might be over before they've realized it happened. 

Other events the app will flag include clapping, laughter, music, applause, and the sound of a speeding vehicle, with presumably more to come as the app is developed. 

Users can now also copy and save transcripts, which are stored locally for three days. This feature extends the app's usefulness beyond hearing impairment and could also be a nifty tool for journalists or students taking lecture notes. 

These new updates will be rolled out in June. 

The only problem I see with Live Transcribe is that the user has to read live captions from their phone during a conversation, when they would normally be looking at the face and mouth of the person they're speaking with. This wouldn't be as much of a problem if, say, the user could read the text from a head-mounted lens like Google Glass, whose $1,500 price looks pretty cheap compared with many hearing aids.

Image: Live Transcribe's sound events and copy-text features. (Source: Google)