In a previous piece, I discussed Amazon's difficulty in convincing its customers to buy new Alexa-powered smart speakers every year when the devices' core functionality and capabilities haven't changed -- and whether we truly "own" these devices in the first place.
I haven't upgraded my devices since I purchased the first Generation 1 Echos in March of 2015, when the device was available exclusively to Amazon Prime members and invited customers. I currently own three: one in my bedroom, one in my office, and a Generation 1 Echo Dot in my main bathroom. We also have a Sonos Beam in our living room, which has some, but not all, Alexa capabilities.
For the most part, these devices work just fine. Amazon continually rolls out firmware updates and regularly revises its cloud service and Alexa smartphone apps to introduce new functionality.
Speakers don't go bad; basic loudspeaker hardware can last decades. The embedded microprocessor hardware in the original Echo is solid-state, and the logic board is designed to function for a very long time, so short of an unexpected power surge, it would be highly unusual for the device to break or suffer a catastrophic malfunction. Assuming Amazon continues to allow its first-generation products to use the Alexa cloud service to stay "smart" rather than become abandonware, they can and should remain useful for years to come.
But as I write this, it is the day before Prime Day 2020, the celebration of Amazon mass consumerism, where many good deals, presumably on Amazon's own devices, can be had. And this year's Amazon Echo, Generation 4, has a new feature previous models have not had -- a Machine Learning (ML) chip, the AZ1 Neural Edge processor.
Now, there's no indication yet that the Gen 4 will be discounted on Prime Day -- the current deal is that if you buy two, you get $30 off the original price by using the ECHO2PK code at checkout. That's not a bad deal, especially if you are a new Echo customer. Still, I think Amazon needs to do better, especially on Prime Day, and especially for its early adopters hanging on with Generation 1 devices.
But what does this AZ1 Neural Edge chip in the Gen 4 do for me if I were to decommission my old Echos? Well, as Amazon has demonstrated, it allows part of the processing from the cloud service that powers Alexa to run directly on the device itself, improving query response time. I have not seen a real-life demonstration of this, so right now, I'm taking Amazon at its word that it means Alexa will be faster.
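The basic idea behind edge inference is simple: if a model small enough to run on the device can handle a request, you skip the network round-trip entirely. The sketch below is purely illustrative -- the intent names, the routing function, and the latency figure are all hypothetical, not Amazon's actual architecture or APIs:

```python
import time

# Hypothetical set of intents an on-device model could handle
# (illustrative only -- not Amazon's actual on-device capabilities).
ON_DEVICE_MODELS = {"wake_word", "set_timer", "volume_up"}

def cloud_round_trip(ms: float = 150.0) -> None:
    """Simulate the network latency of a round-trip to a cloud speech service."""
    time.sleep(ms / 1000.0)

def handle_request(intent: str) -> str:
    """Route a request: serve it locally if the on-device model covers it,
    otherwise fall back to the cloud service and pay the round-trip cost."""
    start = time.perf_counter()
    if intent in ON_DEVICE_MODELS:
        result = f"{intent}: handled on device"   # no network hop
    else:
        cloud_round_trip()                        # simulated network delay
        result = f"{intent}: handled in cloud"
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return f"{result} ({elapsed_ms:.0f} ms)"
```

Running `handle_request("set_timer")` returns in well under a millisecond in this toy model, while anything routed through `cloud_round_trip()` eats the full simulated delay -- which is the whole pitch for a chip like the AZ1.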
I can see how it might be useful in environments with limited connectivity, such as connecting Alexa to your car on a congested or spotty 4G network, as with the Echo Auto that is currently on sale for Prime Day. The Echo Auto doesn't have the Amazon AZ1; it uses an Intel digital signal processor (DSP) with an "Inference Engine," a deep learning chip of Intel's design created for edge network applications.
A promise of improved response time when my home broadband is degraded is not enough to convince me to spend $170 on two new Echos even with a discount code. $125, maybe. What I want to see is Amazon providing some actual innovation with the machine learning chip itself.
There are a lot of things you can do with machine learning and audio processing, and quite frankly, we have barely scratched the surface. Voice patterns in particular -- how emotion comes through in speech -- could be useful for detecting stress.
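To make that concrete, emotion- and stress-detection models typically start from low-level acoustic features of a speech frame, such as loudness and pitch-related measures. The snippet below hand-rolls two classic ones -- RMS energy and zero-crossing rate -- on synthetic signals; it is a textbook illustration of the kind of input such models consume, not anything Amazon has described about the AZ1:

```python
import math

def rms_energy(samples):
    """Root-mean-square energy of a frame: louder, tenser speech scores higher."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign -- a rough
    correlate of pitch and noisiness in a speech frame."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

# Two synthetic 100 ms "frames" at 16 kHz: a quiet low-pitched tone
# standing in for calm speech, and a louder, higher-pitched one for tense speech.
N, rate = 1600, 16000
calm  = [0.2 * math.sin(2 * math.pi * 110 * n / rate) for n in range(N)]
tense = [0.8 * math.sin(2 * math.pi * 440 * n / rate) for n in range(N)]
```

As expected, the "tense" frame shows both higher energy and a higher zero-crossing rate than the "calm" one; a real classifier would feed hundreds of such features, per frame, into a trained model.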
The company recently introduced Alexa Guard Plus, a premium service due to launch at the end of this year. Among its features is activity sound detection, which presumably covers sounds of distress and noises such as breaking glass. The company has not yet said whether that specific feature requires Echo devices with the AZ1 chip or whether it will work with legacy devices.
When applied to voice and audio processing, machine learning may have other applications we don't know about yet. Indeed, it might help Alexa gauge a request's urgency, or recognize when you are annoyed with her responses.
But so far, I am not seeing the "killer app" for Amazon's neural edge processing chip. Show me something cool, something I need, Alexa. Because I'm just not seeing that compelling reason to upgrade yet.
Are you upgrading your older Amazon Echo devices to Generation 4 with the AZ1 on the promise of machine learning apps that haven't materialized yet? Talk Back and Let Me Know.