
How smart are Alexa, Cortana, Google Assistant, and Siri in answering your questions?

You may not be as frustrated with your digital assistant as you used to be. A new study shows that these assistants are improving year on year.
Written by Eileen Brown, Contributor


You might have become tired of asking your digital assistant a whole range of questions to judge its accuracy -- but one company has taken that task to heart.

Framingham, Mass.-based Stone Temple has updated its 2017 study comparing popular digital assistants.

Read also: Google overtakes Amazon in smart speaker market in Q1 2018

It pitted Google Assistant (on both a Google Home and a smartphone) against Amazon Alexa, Microsoft's Cortana, and Apple's Siri in a head-to-head intelligence test.

To make the two studies comparable, it asked the same questions as it did in the 2017 study.

It asked 4,952 different questions of each of the five contestants and sorted the responses into several categories.

These categories included whether the assistant answered verbally and whether the answer came from a database (such as the Knowledge Graph).

It also noted whether an answer came from a third-party source ("According to Wikipedia ..."), how often the assistant did not understand the query, and how often the device tried to respond but simply got it wrong.

(Image: Stone Temple)

The 2018 study showed that Google Assistant maintained its No. 1 position on both platforms -- smartphone and Google Home -- attempting to answer the most queries and answering the most of them correctly.

Alexa showed the largest year-over-year improvement, attempting to answer 2.7 times as many queries as it did last year: the share of questions it attempted rose from 19.8 percent to 53.0 percent.

(Image: Stone Temple)

Microsoft's Cortana Invoke saw the second-largest increase in attempted answers, going from 53.9 percent to 64.5 percent, followed by Siri, which rose from 31.4 percent to 40.7 percent.

Although most devices sourced their answers from much the same places as in the 2017 study, Siri showed the largest shift: 23 percent of its results were drawn from third-party information, also known as featured snippets.

Every competing personal assistant has made significant progress since 2017 in closing the gap with Google's Assistant.

On accuracy, the only assistant to improve year over year was Cortana Invoke, which went from 86.0 percent to 92.1 percent of questions answered correctly.

But many of us are still reluctant to use voice commands in public places. Will voice technology truly replace our point-and-click habits? As we grow to trust the accuracy of our digital assistants, our use of them will grow too.

Read also: Google Assistant's big update: All the new AI tricks and features, explained

Previous and related coverage

So Cortana is smarter than Siri -- but Google Assistant is smartest (for now)

Virtual assistants are becoming ubiquitous in our work and home lives. Many of us own at least one personal assistant -- either Siri on iPhone or Cortana on Windows 10 -- but which one is better and more accurate at responding to our requests?

Google Home vs Amazon Alexa: Which one is smarter?

Stone Temple has been asking Google Home and Amazon Alexa a lot of questions to find out which voice-activated device gives the most correct answers.
