Google: Search's ability to understand you just made its 'biggest leap in 5 years'

Google's new BERT models demand Cloud TPUs to serve search results.
Written by Liam Tung, Contributing Writer

Google says its newest algorithms for answering questions, known as BERT, have delivered the biggest improvement to its understanding of search queries in the past five years. The last comparable boost came from RankBrain in 2015.

BERT is short for Bidirectional Encoder Representations from Transformers, a machine-learning technique for pre-training natural-language-processing models.

BERT can guess a hidden word by looking at the words both before and after it in a sentence, which is what makes it 'bidirectional'. Google's BERT one-upped the 'unidirectional' OpenAI GPT, which only looked at the words before the one being guessed.
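
To make the masked-word idea concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased checkpoint. Both the library and the model name are assumptions chosen for illustration; this is not the production system Google runs in Search.

# A minimal sketch of BERT-style bidirectional masked-word prediction,
# using the open-source Hugging Face `transformers` library and the
# public `bert-base-uncased` checkpoint (illustrative only; not the
# model Google deploys in Search).
from transformers import pipeline

# The fill-mask pipeline asks BERT to predict the hidden [MASK] token
# from the words on BOTH sides of it -- the 'bidirectional' part.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Context after the blank ("a visa to enter the US") is available to
# the model, which a left-to-right 'unidirectional' model cannot use.
for prediction in unmasker("A traveler from Brazil [MASK] a visa to enter the US."):
    print(prediction["token_str"], round(prediction["score"], 3))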

The bidirectional aspect of BERT models offers more context about a word, which should help Google Search understand the intent behind search queries, especially where prepositions like 'to' and 'from' matter to meaning, according to Pandu Nayak, Google fellow and vice president of Search.  

Google open-sourced BERT last year and detailed the research on its AI blog at the time. According to Google, BERT models are complex enough that it has had to deploy its Cloud TPUs to deliver search results.

Besides search ranking, Google is also applying BERT models to featured snippets. Google says BERT will improve its understanding of about 10% of English-language search queries in the US, and it will gradually bring the new natural-language processing models to more languages and markets.

The new models should help Google Search better understand the 15% of queries it gets every day that its systems have never seen before. 

Nayak offered a few examples where Search previously missed the importance of 'for' and 'to' in delivering results.  

One example is the query "2019 brazil traveler to usa need a visa", where 'to' is key to understanding that it's a Brazilian traveling to the US. Before BERT, Google's algorithms returned results about Americans traveling to Brazil.

Another is: "can you get medicine for someone pharmacy". Previously, Search delivered general answers about prescriptions, but the BERT model understands that the query is about whether you can pick up a prescription for another person.

As for languages other than English, Google says it can take what BERT learns from one language and apply it to others.
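
As a rough illustration of that kind of cross-lingual transfer, the publicly released multilingual BERT checkpoint (bert-base-multilingual-cased, again via Hugging Face transformers, and again an assumption for illustration rather than Google's production setup) was pre-trained on text from more than 100 languages with one shared vocabulary, so a single model can fill in masked words in languages beyond English:

# Sketch of cross-lingual transfer with the public multilingual BERT
# checkpoint (illustrative assumption; not Google's Search setup).
from transformers import pipeline

# One model, pre-trained on 100+ languages with a shared vocabulary.
unmasker = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Ask the same model to fill a blank in French...
print(unmasker("Paris est la [MASK] de la France.")[0]["token_str"])
# ...and in Spanish, with no language-specific fine-tuning.
print(unmasker("Madrid es la [MASK] de España.")[0]["token_str"])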
