Google has announced what it calls its most important update in five years. Known as BERT, the update improves Google's ability to understand human language in organic search results and will impact 10% of search queries.
Rolled out in late October 2019, BERT has been labeled Google's biggest algorithm update since RankBrain, a machine-learning algorithm that aims to better understand queries and the content on a page. While BERT does not replace RankBrain (or other language algorithms), it supports them, affecting 1 in 10 searches and the way websites rank for longer, more conversational queries.
BERT is a Major Google Update
According to Google, this update will affect complicated search queries that depend on context.
This is what Google said:
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
With the latest advancements from its research team in the science of language understanding, made possible by machine learning, Google says it is making a significant improvement in how it understands search queries, representing the biggest leap forward in the past five years and one of the biggest in the history of Search.
So What Is BERT?
BERT stands for Bidirectional Encoder Representations from Transformers. It has been described as a neural network-based technique for pre-training natural language processing models.
You are probably wondering: what the heck does that mean, right?
Google, in essence, has adjusted its algorithm to better understand natural language processing.
Just think of it this way: you can put a flight number into Google and it typically shows you the flight status. A calculator may come up when you type in a math equation. Or if you put in a stock symbol, you'll get a stock chart.
An even simpler example: you can start typing into Google and its autocomplete feature can figure out what you are searching for before you even finish typing it.
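To make the autocomplete idea concrete, here is a minimal, hypothetical sketch of prefix-based suggestion in Python. Google's real autocomplete also ranks candidates by popularity, location, and context; this toy version only filters a list of phrases by prefix, and the phrase list and function name are invented for illustration.

```python
# Minimal sketch of prefix-based autocomplete (illustrative only).
# A production system would also rank suggestions by popularity,
# freshness, and the user's context.
def autocomplete(prefix, phrases, limit=3):
    """Return up to `limit` phrases that start with `prefix`."""
    prefix = prefix.lower()
    return [p for p in phrases if p.lower().startswith(prefix)][:limit]

# Hypothetical pool of common searches.
common_searches = [
    "weather today",
    "weather tomorrow",
    "web design tips",
    "bert google update",
]

print(autocomplete("wea", common_searches))
```

Typing just "wea" is enough to surface the weather queries, which is the everyday experience the article is describing.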
BERT is helping Google to become more human with a better understanding of conversational language and natural everyday phrasing – very similar to the way humans learn a new language in both formal and conversational elements.
Described in detail on Google's blog by Vice President of Search, Pandu Nayak, the update focuses on understanding language to help guide searchers, particularly with searches Google hadn't seen before. Nayak describes this below as a social responsibility.
Is BERT Even Useful?
The latest update will affect organic ranking results and featured snippet text, with long-tail search queries now becoming a more prominent search engine optimisation (SEO) strategy for sites wanting to rank for specific phrase-based search terms, especially terms with a conversational tone or intent.
BERT models consider the context of search terms by looking at the words that appear before and after keywords, helping infer intent and better match the searcher with an ideal result. Prepositions such as 'for' and 'to' will now become much more important in typical searches due to their influence on context.
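The "words before and after" idea can be sketched in a few lines of Python. This toy function is not BERT (which uses transformer self-attention over subword tokens); it only illustrates why looking at both directions matters for a word like "to" in the Brazil query discussed below. The function name and window size are assumptions made for this example.

```python
# Toy illustration of "bidirectional" context: for each word, collect the
# words on BOTH sides, the way BERT attends to left and right context.
# (Real BERT uses transformer self-attention over subword tokens; this
# sketch only shows why both directions matter for a word like "to".)
def bidirectional_context(tokens, index, window=2):
    """Return (left, right): words before and after tokens[index]."""
    left = tokens[max(0, index - window):index]
    right = tokens[index + 1:index + 1 + window]
    return left, right

query = "2019 brazil traveler to usa need a visa".split()
left, right = bidirectional_context(query, query.index("to"))
print(left, right)
```

Only by reading both sides of "to" can a model tell that the traveler is going *from* Brazil *to* the USA; a left-to-right-only model misses the half of the context that settles the direction.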
Below are two examples of how the BERT update is now affecting search engine results pages or SERPs:
First example… “2019 brazil traveler to usa need a visa”
Before BERT, the top result would be about how US citizens can travel to Brazil without a visa. But look at the search query carefully: the difference is slight in wording, but big in meaning.
The search wasn’t about US people going to Brazil, it was about people from Brazil traveling to the US.
The result after the BERT update is much more relevant.
Google is now taking into account prepositions like "for" or "to" that carry a lot of meaning in the search query.
The second example… "can you get medicine for someone pharmacy"
The word "for" is the operative term in this search.
Prior to the BERT update, this vague search would have returned general results for filling a prescription at a pharmacy, without recognising that the intent of the search is in fact to pick up a relative's prescription.
What Does BERT Mean For Us?
“That’s a lot of technical details, but what does it all mean for you? Well, by applying BERT models to both rankings and featured snippets in Search, we’re able to do a much better job helping you find useful information.

In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.

No matter what you’re looking for, or what language you speak, we hope you’re able to let go of some of your keyword-ese and search in a way that feels natural for you. But you’ll still stump Google from time to time. Even with BERT, we don’t always get it right. If you search for “what state is south of Nebraska,” BERT’s best guess is a community called “South Nebraska.” (If you’ve got a feeling it’s not in Kansas, you’re right.)”

– Pandu Nayak, Vice President of Search, Google