
Google Rolls Out the Latest Search Algorithm: BERT

Google recently made the largest change to its search system since the company introduced RankBrain nearly five years ago. The update, known as BERT, is designed to better understand what’s important in natural language queries. Google said it impacts 1 in 10 queries, yet many SEOs and many of the tracking tools did not notice major changes in the Google search results while the algorithm rolled out.

Last year, Google developed and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers (BERT). This technology enables anyone to train their own state-of-the-art question answering system. Put simply, BERT helps computers understand language a bit more like humans do.
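To make that concrete, here is a minimal sketch of a BERT-based question answering system. It assumes the open-source Hugging Face transformers library and a publicly available BERT checkpoint fine-tuned on the SQuAD dataset, rather than Google’s original TensorFlow release; the library and model choices are illustrative, not part of Google’s announcement.

```python
# A sketch of a BERT-based question answering system, assuming the open-source
# Hugging Face "transformers" library and a public BERT checkpoint fine-tuned
# on SQuAD (illustrative choices, not Google's production setup).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a neural network-based technique for natural language processing "
    "pre-training that Google open-sourced in 2018."
)

# The model reads the question and the passage together and returns the span
# of the passage most likely to answer the question, with a confidence score.
result = qa(question="Who open-sourced BERT?", context=context)
print(result["answer"], result["score"])
```

Because the model considers the question and the passage together, it can point to the exact span of text that answers the question rather than matching keywords.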

This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one by one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it, which is particularly useful for understanding the intent behind search queries.
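That bidirectional behaviour shows up clearly in masked-word prediction, the task BERT is pre-trained on. The short sketch below, again assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (an illustration, not Google’s ranking model), shows the prediction for a masked word changing with a word that comes after it.

```python
# A sketch of BERT's bidirectional context, assuming the Hugging Face
# "transformers" library and the public bert-base-uncased checkpoint
# (an illustration, not Google's ranking model).
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Only the word AFTER the mask differs between these two sentences, yet it
# changes what BERT predicts for the masked position. A strictly left-to-right
# model could not use that later word at all.
for sentence in [
    "The [MASK] barked at the mailman.",
    "The [MASK] meowed at the mailman.",
]:
    top = fill(sentence)[0]
    print(f"{sentence} -> {top['token_str']} ({top['score']:.2f})")
```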

According to Google’s recent blog post, the BERT update is really about understanding “longer, more conversational queries.” Most SEO tracking tools primarily track shorter queries, which means the update’s impact is less visible to those tools and is why SEOs didn’t notice much change in the rankings.

When it comes to ranking results, BERT will help Search better understand 1 in 10 searches in the U.S. in English (and Google will bring it to more languages and locales over time). Many SEOs are wondering how they can improve their sites for the new BERT rankings, but Google has already stated there is no real way to optimise for it. Danny Sullivan from Google has said “there’s nothing to optimise for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.”

Its function is simply to help Google better understand searchers’ intent when they search in natural language, particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning. The upside for SEOs and content creators is that they can be less concerned about “writing for the machines” and can instead keep focusing on writing great content for real people, which is what continues to get the best SEO results.

As when RankBrain was introduced, Google is looking at the quality of content and the reputation of the page or domain to provide good, reliable results to the searcher. A website can establish a good reputation through fresh content, depth of information and relevance to a search term. As with many signals in SEO, user experience is closely linked to the elements that help a website rank well for a chosen search term.

If you’d like to know more about BERT and its impact on your business’s SEO, please get in touch.