Loup Ventures tested four smart speakers by asking Alexa, Siri, Google Assistant, and Cortana 800 questions each. Apple's assistant shows a big improvement over last year's results.
![](/wp-content/uploads/archive/C6658F50-CC07-43B9-9B54-BBBB7EBAF69D.jpg)
Google Assistant answered 88% of the questions correctly, versus 75% for Siri, 73% for Alexa, and 63% for Cortana. Last year, Google Assistant answered 81% correctly, versus 52% for Siri, 64% for Alexa, and 56% for Cortana.
Loup Ventures asked each smart speaker the same 800 questions and graded each answer on two metrics: did the speaker understand what was said, and did it deliver a correct response? The question set, designed to comprehensively test a smart speaker's ability and utility, was broken into five categories. Testing was conducted on the second-generation Amazon Echo, Google Home Mini, Apple HomePod, and Harman Kardon Invoke.