We explain Google BERT in a nutshell

Can people's curiosity ever truly be satisfied? And, most importantly, can Google really answer any question it is asked? The answer is, essentially, no. Pandu Nayak confirmed this in a talk, noting that Google receives billions of queries every day, of which almost 15% are new, never seen before. Not a small percentage, if we stop to think about it. So what can be done?

An algorithm is needed that can predictively answer questions which cannot be anticipated. Please welcome Bidirectional Encoder Representations from Transformers, aka our friend Google BERT. Explained in simple terms, Google BERT is a neural network that processes natural human language with the goal, largely achieved, of better understanding the context of a query so as to provide more pertinent responses, with a margin of precision good enough to seem, frankly, a little unsettling.

Google BERT officially rolled out in Italy on December 9, 2019.

Neural network voodoo

It looks like black magic, but it's actually a complex system of mathematical and statistical models that, put together, really work. We are talking about neural networks: intelligent systems that learn from a database and from the information people enter. Systems that, in a nutshell, evolve on their own, fed by people, to satisfy their curiosity. The magical thing is that they really work, even though there's absolutely nothing "witchy" about them. A neural network learns from its surroundings and grows, reaching conclusions it sometimes doesn't have explicit data for, by relating the information it has at its disposal. If you want to know more about neural networks, ask an expert. This definition comes from a girl who, one day, came across a wonderful English-language article on neural networks and was awed by their incredible power. I'm just a novice approaching the subject, so I'll leave the technical explanation to the competent people.

Now, back to Google Bert.

Google BERT (and its much-talked-about update)

BERT is the new Google algorithm update, a fresh attempt by the search engine to match searches with the most relevant results. Natural Language Processing is the foundation of this update: it allows Google's algorithms to understand the words of the phrase entered in the search box and to give the phrase an overall meaning within its semantic context. A simply revolutionary novelty.

BERT examines the context and does not just analyze the words taken individually. This way, if you google "I caught a crab", you will not get results about the king crab of the Atlantic coasts of the American continent, but results relating to the fact that "you made a mistake", the idiomatic meaning. This example is very general and doesn't perfectly describe what Google BERT can actually do, but it gives you an idea of the concept behind it.
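To make the "context, not isolated words" idea concrete, here is a deliberately tiny sketch in Python. It is not BERT and involves no neural network; it just shows a scorer that consults the words on both sides of an ambiguous term, which is the intuition behind "bidirectional". The sense labels and cue-word sets are invented for this example.

```python
# Toy illustration (NOT BERT): pick a word sense by looking at context words
# BOTH before and after the ambiguous term, the way a bidirectional model does.
# The sense names and cue words below are made up for the demo.

SENSES = {
    "animal":  {"sea", "shell", "atlantic", "coast", "fishing", "boiled"},
    "mistake": {"oops", "wrong", "sorry", "error", "apologize", "misread"},
}

def disambiguate(words, target="crab"):
    """Score each sense by overlap with ALL surrounding words (left and right)."""
    context = {w.lower() for w in words if w.lower() != target}
    scores = {sense: len(context & cues) for sense, cues in SENSES.items()}
    return max(scores, key=scores.get)

print(disambiguate("sorry I caught a crab I was wrong".split()))      # mistake
print(disambiguate("we caught a crab off the Atlantic coast".split()))  # animal
```

The same word resolves differently depending on its neighbors on either side; a one-directional model reading only left-to-right would miss cues that appear after the ambiguous word.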

Google BERT: How can it help your blog's SEO?

Ok, all very nice, but how will BERT change the way I handle my website's SEO? How will my ranking change?

Some say the influence will probably be minimal and that BERT will not have the earth-shattering impact it has been painted with. In reality, something could change. We remain in the realm of possibility because, as you well know, Google's algorithms are the fourth secret of Fatima.

BERT could cost you traffic if your website has not clearly defined the context in which it operates. Also, long-tail keywords could take on an even more prominent role within the body of a piece.

What do you think about Google BERT? How will it change our approach to on-page and off-page SEO?