BERT has arrived, sir!
Google has made one of the biggest changes to its algorithm in years. It will reportedly affect at least every tenth search (read the official explanation on the Google blog). The change is called BERT, which stands for Bidirectional Encoder Representations from Transformers. I won't bother with a deeper technical explanation; only this:
This means that the BERT Google algorithm will now recognize the natural semantics of the language. It will understand individual words as part of the context. Wow!
How does the BERT Google algorithm work?
This technology is based on a "transformer" model: BERT will allow you to write your SEO copy in a much more natural tone. From now on, not only the keywords will be relevant, but also the words that come before and after them in a sentence. Google's BERT will thus take the whole context into account. This is very important for search intent, that is, the intention that lies behind the search. In English, it will take into account prepositions, conjunctions, and other small function words, such as "for" or "to," because their place in the sentence matters.
The application of BERT will affect both the ranking on the SERP itself and the results shown in featured snippets, making it easier for users to reach the information they are actually searching for.
As Google announces, this change will affect one in ten searches, and the whole system is quite human-centric: oriented towards the actual needs of the person searching, rather than the business ambitions of site and content owners.
BERT is human-centric: you will be able to phrase a conceptual query in the most natural way in your language and still get the results you searched for.
That's why Google expects BERT to affect every tenth search: a great deal of SEO copy on the web is still written with a focus on keywords rather than on the natural place of words as the key to the meaning of a sentence.
The keyword alone is no longer that important; it is the word in context that is becoming the key to the meaning.
Practically, what is BERT?
As Google explains it, take this search query:
"2019 brazil traveler to usa need a visa."
The word "to" and its connection to the rest of the sentence clearly indicate that the user is from Brazil and wants to travel to the USA. That nuance matters to Google's BERT: it understands the point.
Previously, Google would not understand the query exactly and could return results about an American traveling to Brazil. Now, however, Google is able to understand this nuance and knows that the tiny word "to" is actually very important here, so BERT provides a much more relevant result for this query.
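The difference is easy to illustrate with a toy sketch (this is not Google's actual system, just an analogy): a purely keyword-based view treats a query as an unordered bag of words, so it cannot tell "brazil traveler to usa" from "usa traveler to brazil", while an order-aware comparison preserves the role that little words like "to" play.

```python
def keyword_match(query_a: str, query_b: str) -> bool:
    """Compare queries as unordered bags of words (the old, keyword-only view)."""
    return set(query_a.lower().split()) == set(query_b.lower().split())


def context_match(query_a: str, query_b: str) -> bool:
    """Compare queries as ordered sequences, so "to" keeps its place and role."""
    return query_a.lower().split() == query_b.lower().split()


q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

# The keyword view cannot tell the two queries apart:
print(keyword_match(q1, q2))  # True
# An order-aware view sees they mean opposite trips:
print(context_match(q1, q2))  # False
```

BERT goes far beyond simple word order, of course: it weighs every word against its full left and right context. But the sketch captures why the same set of keywords can carry two opposite search intents.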
What does BERT mean for SEO and “smaller” languages?
BERT is an artificial intelligence (AI) algorithm to be applied to all the languages in the world, not just English, which is the most queried language. A powerful feature of this system is that it can learn by analogy, so what works for English will be applicable to all other languages as well.
It is to be expected that languages with irregular verbs, grammatical cases, gender, and number will now have a far simpler task when it comes to creating SEO content, as BERT will understand these grammatical variations of words. This will allow content to return to the spirit of the language, rather than forcing the language to please SEO and sacrificing the natural tone of the content.
BERT will adapt to the language. You will not need to adapt the language (content) to Google.
EDIT: Google announced in December that it has rolled out BERT to over 70 languages worldwide. Among them are Serbian, Slovenian, Croatian...
As for featured snippets (prominent excerpts of relevant content at the top of the first Google page), smaller languages will apparently still have to wait for that most valuable free marketing space on the SERP.
For now, the good news is that writing for the web will be more natural, and more in the spirit of the language, as it is searched.
This text was originally published at Kolegijum Blog.