8 Nov 2019

Google BERT: All About Google's Latest Search Algorithm

Only a few days ago, Google announced an entirely new search algorithm. They call it BERT.
By stating that BERT is the “biggest change of the last five years” and “one that will impact one in ten searches,” Google made it clear that this is an update every webmaster should pay attention to and take seriously.
Search has changed a lot over the last few years. If you want to stay relevant and keep winning in your search space, you need to understand what is happening in the search world and act on it in your search engine optimization. BERT is one of those very important things to look at.
To put things in perspective, BERT will change the results that rank for one in every ten search queries. In fact, sites like The New York Times have already been affected, seeing a significant drop in their search rankings and traffic.

 

 

WHAT IS BERT?

BERT stands for Bidirectional Encoder Representations from Transformers. It is a neural network-based technique for natural language processing (NLP) pre-training, built to help Google Search understand language better in order to serve more relevant results. With this update, Google is making the biggest change to its search algorithm since the introduction of RankBrain five years ago.
BERT started rolling out for English-language queries at the end of October and will expand to other languages and locales over time, according to Google. Google also says BERT is already being used globally, in all languages, on featured snippets.

 

THE TECHNICALITY OF BERT

Yes, BERT is pretty technical on the back end, but that's nothing you should worry about. All of that technology exists to help Google Search better understand the nuances and context of words in searches and match those queries with more relevant results.
It is also a technology that anyone can use to train their own state-of-the-art question answering system. In short, BERT helps computers understand language more the way humans do.
According to Google, "this breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order." Thus, BERT models consider the full context of a word by looking at the words that come before and after it, which is particularly useful for understanding the intent behind search queries.
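The value of looking at the words before AND after a target word can be sketched with a toy word-sense example. This is purely an illustration of the idea, not Google's actual model: the sense inventory, clue words, and scoring are all made up for the demo.

```python
# Toy illustration of why bidirectional context matters for word sense.
# This is NOT BERT itself -- just a sketch of the idea that a model reading
# only left-to-right can miss clues that appear AFTER an ambiguous word.

# Hypothetical clue words for two senses of the ambiguous word "bank".
SENSE_CLUES = {
    "financial institution": {"money", "deposit", "loan", "account"},
    "river edge": {"river", "water", "fishing", "shore"},
}

def guess_sense(context_tokens):
    """Pick the sense whose clue words overlap the context the most."""
    context = {t.lower() for t in context_tokens}
    best_sense, best_overlap = None, -1
    for sense, clues in SENSE_CLUES.items():
        overlap = len(context & clues)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

sentence = "She sat on the bank and watched the river flow".split()
i = sentence.index("bank")

# Unidirectional reading: only the words BEFORE "bank" are available,
# and the giveaway word "river" has not been seen yet.
left_only = guess_sense(sentence[:i])

# Bidirectional reading: words before AND after "bank" are available,
# so "river" is found and the correct sense wins.
both_sides = guess_sense(sentence[:i] + sentence[i + 1:])

print(left_only)
print(both_sides)   # river edge
```

A real transformer does this with learned attention weights over every word pair rather than hand-written clue lists, but the payoff is the same: context on both sides of a word is used at once.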
But it’s not just advancements in software that make this possible: Google said it needed new hardware too. Some of the models Google can build with BERT are so complex that they push the limits of traditional hardware, so for the first time Google is using its latest Cloud TPUs to serve search results and get you relevant information more quickly.
 

WHAT IS IN IT FOR THE SEARCHER?

By applying BERT models to both ranking and featured snippets in Google Search, Google is able to do a much better job of helping searchers find useful information. "Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Google Search will be able to understand the context of the words in the query," says Google. This means searchers can now search in a way that feels natural to them.
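Why do small words like "to" matter so much? A minimal sketch makes the point: older keyword-style matching often dropped such words as stopwords and ignored word order, which can erase the intent of a query. The query strings below echo the "brazil traveler to usa" example from Google's launch announcement; the stopword list and matching logic are invented for illustration, not how Google Search is implemented.

```python
# Toy contrast: stopword-stripped keyword matching vs. keeping "small" words.
# Illustrative sketch only -- not Google's actual matching logic.

STOPWORDS = {"to", "for", "a", "the", "of", "in"}

def bag_of_keywords(query):
    """Old-style view of a query: unordered keywords, stopwords removed."""
    return {w for w in query.lower().split() if w not in STOPWORDS}

q1 = "brazil traveler to usa"
q2 = "usa traveler to brazil"

# With "to" stripped and order ignored, the two opposite queries collapse
# into the same keyword bag -- who is traveling where is lost.
print(bag_of_keywords(q1) == bag_of_keywords(q2))  # True

# Keeping every word and its position preserves the direction of travel,
# which is the kind of context BERT-style models take into account.
print(q1.split() == q2.split())  # False
```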

 

WHERE TO START AS A WEBMASTER?

The best place to start is Google's long-standing advice: keep producing quality, relevant content that meets your users' goals, answers their questions, and provides true value.
This advice never changes irrespective of the update Google rolls out to its algorithm. You can use the several freemium SEO tools we offer here at Small SEO Tools to create great content that'll be useful and relevant for your users.
And don't make the mistake of thinking that BERT is replacing RankBrain; RankBrain is still very much in use. BERT is simply a major additional method for understanding content and queries. When Google's systems think a query can be better understood with the help of BERT, that's what they'll use; if a query is better understood with the help of RankBrain, that will be used instead. In fact, a single query can use multiple methods, including both BERT and RankBrain, to understand the query.


 
