To deliver a better user experience and better results, Google regularly rolls out updates. The latest is BERT (Bidirectional Encoder Representations from Transformers), a model that processes each word in relation to the other words in a sentence rather than one by one in order. Google has described it as one of the biggest leaps forward for Search in the past five years. Because of the model's complexity, it will initially be applied to about 1 in 10 English-language searches in the US. BERT will also improve results in all of the two dozen countries where featured snippets are available. Before launch, the model went through extensive testing and showed convincing improvements in the quality of search results. By understanding how the words in a query relate to one another, it helps the algorithm interpret queries more accurately.
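The core idea, each word being interpreted through the words around it, can be sketched with a toy self-attention step. This is purely illustrative: the hand-made two-dimensional vectors and the single attention round below are assumptions for the demo, while real BERT uses learned, high-dimensional embeddings and many stacked layers.

```python
import math

# Toy word vectors (hand-made, purely illustrative -- not real BERT embeddings).
# "bank" is ambiguous on its own; the words around it decide the reading.
vecs = {
    "river": [1.0, 0.0],
    "money": [0.0, 1.0],
    "bank":  [0.5, 0.5],
}

def attend(sentence):
    """One round of scaled dot-product self-attention: each word's new
    vector is a softmax-weighted mix of every word's vector, so context
    on both sides of a word shapes its representation."""
    xs = [vecs[w] for w in sentence]
    d = len(xs[0])
    out = []
    for q in xs:
        # Similarity of this word to every word in the sentence (itself included).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in xs]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        weights = [e / sum(exps) for e in exps]
        # Weighted average of all word vectors.
        out.append([sum(w * k[j] for w, k in zip(weights, xs)) for j in range(d)])
    return out

# The same word "bank" ends up with a different vector in each sentence,
# pulled toward "river" in one and toward "money" in the other.
river_bank = attend(["river", "bank"])[1]
money_bank = attend(["money", "bank"])[1]
```

Because the representation of "bank" depends on its neighbours on either side, a query like "bank of the river" and one like "bank account fees" can be matched against different results, which is the behaviour Google is after.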
Impact
With this change, Google aims to improve its understanding of queries, deliver more relevant results, and get searchers used to entering queries in a more natural way.
Google did not say to what extent this change will affect search rankings. Given that BERT is only being used on 10% of English queries in the US, the impact should be modest compared to a full-scale algorithm update.
Understanding language is an ongoing challenge, and Google admits that, even with BERT, it may not get everything right, though the company is committed to getting better at interpreting the meaning of queries.