
What Is the BERT Algorithm and How Will It Affect Google Search?

One of Google’s most recent search updates has brought with it the BERT algorithm, which is designed to improve Google’s understanding of user intent behind each search, subsequently providing more relevant results for users.

According to Google, BERT will affect up to 10 percent of searches, which means it will have an impact on some brands’ visibility in search engines and traffic to their websites, a shift that may go unnoticed unless you’re watching your analytics.

Here is a closer look at BERT and what it means for the future of search.

What Exactly Is BERT?

Bidirectional Encoder Representations from Transformers, or BERT, is a neural network-based technique for natural language processing pre-training. Put simply, it’s a new technique that helps Google parse individual search queries and determine the specific context of the words people use.

For instance, in the phrases “concerts in town” and “drops in traffic,” the word “in” carries a completely different meaning in each. This is obvious to us, but not necessarily to a search engine. BERT allows Google to pick up on this nuance and provide people with results that are truly relevant to what they’re trying to find.

In November 2018, Google officially open-sourced BERT, allowing anyone to use it to train their own language-processing systems for tasks such as question answering.
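Because the model and its pre-trained weights are public, the context-sensitivity described above is easy to see firsthand. The following is a minimal sketch, assuming the community Hugging Face transformers library (a popular re-implementation, not Google’s original TensorFlow release) and the public bert-base-uncased checkpoint; it compares the vectors BERT assigns to the word “in” in the two phrases above.

# Minimal sketch: BERT gives "in" a different vector in each phrase.
# Assumes the Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint; this is not Google's search pipeline.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(phrase: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` inside `phrase`."""
    inputs = tokenizer(phrase, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

a = word_vector("concerts in town", "in")
b = word_vector("drops in traffic", "in")

# A context-free embedding would score exactly 1.0 here; BERT's vectors
# for "in" diverge because the surrounding words differ.
print(torch.cosine_similarity(a, b, dim=0).item())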

What Are Neural Networks?

Neural networks are algorithms designed to recognize patterns. They can categorize images, recognize handwriting, and make financial predictions, among many other applications.

They learn to identify these patterns by training on data sets; in BERT’s case, Google has said the model was pre-trained on the plain-text corpus of Wikipedia.
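To give a feel for what that pre-training produces, here is a tiny sketch, again assuming the Hugging Face transformers library, of the masked-word task BERT learns from Wikipedia text: hide a word and ask the network to predict it from the words on both sides.

from transformers import pipeline

# Masked-word prediction, the fill-in-the-blank task BERT was pre-trained
# on over Wikipedia text; [MASK] marks the hidden word.
fill = pipeline("fill-mask", model="bert-base-uncased")
for guess in fill("There are several concerts in [MASK] this weekend."):
    print(guess["token_str"], round(guess["score"], 3))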

How Natural Language Processing Works

Natural language processing (NLP) is a branch of artificial intelligence concerned with linguistics. It gives computers a degree of understanding of how humans use language to communicate.

Chatbots, social listening tools, and auto-suggestions on your smartphone are some examples of NLP. While NLP is nothing new in search engines, BERT takes NLP to the next level via bidirectional training.

When Did Google Launch BERT?

Google began rolling out BERT in its search engine for English-language queries around October 21, 2019. This included featured snippets.

BERT will eventually expand to all of the languages in which Google offers search, but it’s unclear exactly when that will happen. BERT is also being used to improve featured snippets in over two dozen countries.

How BERT Works

What makes BERT so innovative is that it trains language models on the entire set of words in a query or sentence at once (bidirectional training), rather than in the traditional way, which reads text from left to right or combines a left-to-right pass with a right-to-left pass. BERT learns the meaning of a word from all of the words surrounding it, whereas most earlier systems derived context only from the words immediately preceding or following it.

Google refers to BERT as “deeply bidirectional” because it builds word representations from both left and right context starting at the very bottom layer of a deep neural network.

To illustrate how this works, take the word “car” in “car transmission” and “car of a train.” A context-free model would give “car” the same representation in both phrases; contextual models instead create a representation of each word based on all of the other words in the sentence.

For example, if you said “I got a new car transmission,” a unidirectional contextual model would represent “car” based only on “I got a new,” not on “transmission.” BERT, on the other hand, looks at both sides of the word to determine its specific meaning.
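One way to see the bidirectional part concretely is to give BERT two sentences that are identical to the left of “car” and differ only to the right. A strictly left-to-right model would represent “car” identically in both; in the hedged sketch below (same assumed transformers setup as earlier), BERT’s two vectors diverge.

# Both sentences share the identical left context "I got a new car", so a
# strictly left-to-right model would assign "car" the same state in each.
# Assumes the Hugging Face `transformers` library, as in the earlier sketch.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def car_vector(sentence: str) -> torch.Tensor:
    """Contextual embedding of the token "car" in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("car")]

similarity = torch.cosine_similarity(
    car_vector("I got a new car transmission"),
    car_vector("I got a new car yesterday"),
    dim=0,
)
print(similarity.item())  # below 1.0: the right-hand context changed the vector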

Although BERT works to surface the most relevant results, it doesn’t penalize results that are less so. For instance, Google provided the example query “math practice books for adults,” which, pre-BERT, returned a math book for junior high students at the top of the results.

With BERT integrated, the same query returns top results featuring math books geared toward adults, while simply placing the junior high-level book a little lower down the page.

How Much Does Google Use BERT?

BERT won’t apply to every search, according to Google. It will help improve about one in every 10 English-language searches in the U.S., mainly for longer queries that are more conversational in nature or searches that use a larger number of prepositions such as “in” or “to.”

BERT isn’t likely to be used for queries that exclude prepositions, which tend to be shorter queries and branded phrases. These are more likely to generate relevant results on their own, without confusion on Google’s end.

What BERT Means for Featured Snippets

BERT can also affect the search results appearing in snippets.

Google gave another example of how BERT would influence snippets using the query “parking on a hill with no curb.” In the past, Google wouldn’t understand the importance of the word “no” and would return a featured snippet with instructions for parking on a hill with a curb, having placed too much emphasis on “curb” in the query.

BERT took the word “no” into consideration along with other words in the phrase to return a more accurate snippet discussing parking on hills without curbs.
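For the curious, the self-attention weights inside the open-source model offer one rough way to inspect how much the token “curb” attends to “no” in that query. The sketch below makes the same transformers-library assumptions as the earlier ones and only probes the public model; it says nothing about Google’s actual ranking pipeline.

# Average attention that the token "curb" pays to "no" across all of
# BERT's layers and heads. Assumes the Hugging Face `transformers`
# library and that "curb" is a single token in the vocabulary.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("parking on a hill with no curb", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_attentions=True)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
curb, no = tokens.index("curb"), tokens.index("no")

stack = torch.stack(out.attentions)  # (layers, batch, heads, tokens, tokens)
print(stack[:, 0, :, curb, no].mean().item())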

BERT and RankBrain: How They Work Together

You might be thinking that BERT seems similar to Google RankBrain, another tool used to better understand search queries, but they’re actually two separate algorithms used to optimize search results in different ways.

RankBrain makes changes to results by comparing a query to previous similar queries. It then looks at the performance of those past results to determine which results will more appropriately match the current query. Depending on what RankBrain finds, it may adjust the results that the normal algorithm provided. RankBrain also helps return results that contain relevant terms that don’t appear in the original query.

Unlike RankBrain, BERT looks beyond a term to the content around it to determine which results would best satisfy the user. Through the bidirectional aspect of its operation, BERT scans the content before and after a word to gain a better understanding of that word’s meaning.

RankBrain and BERT are used together to understand queries and content and provide users with the most relevant search results; BERT, therefore, isn’t a replacement for RankBrain.

BERT’s Impact on Google Assistant

BERT currently applies only to Google’s search function, but its effects can extend to the Assistant. When people perform a search through Google Assistant and it returns a featured snippet or web result from Google, BERT may influence those results.

Why You Should Continue to Optimize for the User, Not BERT

Many marketers may be wondering, “How can I optimize for BERT?” In short, you can’t. In fact, you can’t really optimize for search engines at all anymore, since they’re becoming increasingly focused on satisfying the user.

If you want to take advantage of BERT, the best way to maximize your visibility and rankings is to do what works best in every SEO strategy: Optimize for the user. If you’re focused on developing high-quality content that visitors to your website will find informative, educational, and useful, this is the rich content that BERT will help your audience find.

BERT will help you get the results you want in this way, just as it will help the user get the best results based on search intent.

About Ahmed Ibrahim
Ahmed is Managing Partner and Chief Operations Officer at Sandstorm Digital FZE. He manages the agency’s day-to-day operations out of our office in Cairo, Egypt. Ahmed’s experience includes 10 years of online marketing and advertising, focusing on Arabic SEO, CRO, SEM, and social media.
