Amazon on Monday announced a new feature for the Echo Show that should make life easier for blind and visually impaired users. With the Show and Tell feature, the smart display's camera can recognize household pantry items. Users simply hold an item up to the screen and ask, "Alexa, what am I holding?"
The feature is now available to Alexa customers in the US on first- and second-generation Echo Show devices.
"We heard that product identification can be a challenge and something customers wanted Alexa's help with," Sarah Caplener, head of Amazon's Alexa for Everyone team, said in a statement. "Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment."
Amazon's "Alexa for Everyone" team worked closely with blind Amazon employees to develop the feature, as well as the Vista Center for the Blind and Visually Impaired in Santa Cruz, California.
AI-driven features like Show and Tell can help Amazon and other tech companies become an integral part of the lives of the many people living with disabilities. As Amazon noted, the World Health Organization estimates there are 1.3 billion people in the world with some form of vision impairment, and that 15 percent of the world's population experiences some form of disability.
While Amazon's latest feature is geared toward the visually impaired, other efforts have taken Alexa into senior living communities to make life easier for the elderly.
Other technology companies are also designing with accessibility in mind. For instance, earlier this year Google rolled out a tool called Live Relay, which helps deaf people use the phone, even if they prefer not to speak.