
Google BERT Update Changes SEO Forever

October 25, 2019 By Ben Guest

Pandu Nayak, Google’s Vice President of Search, published an article about the latest Google BERT update, calling it “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.” Working with Google’s research team on the science of language understanding, and with the help of machine learning, the Search team made significant improvements. Google will now better understand search queries, i.e., user intent, and return even more relevant results.

[Image: Google Assistant, "Hey Google, show me chicken noodle soup recipes."]

What’s a BERT Exactly?

When you use Google Search, do you speak to it naturally, or do you think up some clever keywords to feed it? BERT is simply a technological advancement that helps search engines understand us better. Where Google previously ignored small words like “no,” “to,” and “for,” it now takes them into account. This particularly helps with complex, conversational search queries, and allows Google to return even more relevant results. BERT is short for …

Bidirectional Encoder Representations from Transformers

… which is a neural network-based technique for natural language processing (NLP) pre-training. This open-source technology essentially gave developers the ability to train their own state-of-the-art question answering systems.
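To make "bidirectional" concrete, here is a toy sketch in plain Python. This is not real BERT; the tiny corpus and counting scheme are invented for illustration. It mimics BERT's pre-training task of predicting a masked word from the words on both sides of it, rather than from the left context alone:

```python
from collections import Counter

# Tiny invented corpus; real BERT pre-trains on billions of words.
corpus = [
    "park on a hill",
    "stand on a box",
    "travel to the usa",
    "travel to the beach",
]

# Count (left_word, right_word) -> middle_word for every 3-word window.
context_counts = Counter()
for sentence in corpus:
    words = sentence.split()
    for i in range(1, len(words) - 1):
        context_counts[(words[i - 1], words[i + 1], words[i])] += 1

def predict_masked(left, right):
    """Guess the [MASK] token from BOTH neighbors, BERT-style."""
    candidates = {
        mid: n for (l, r, mid), n in context_counts.items()
        if l == left and r == right
    }
    return max(candidates, key=candidates.get) if candidates else None

# "travel [MASK] the usa": both the left and right neighbors point to "to".
print(predict_masked("travel", "the"))  # -> "to"
```

A left-to-right model would have to guess the mask from "travel" alone; looking at both sides at once is the whole point of the "bidirectional" in BERT's name.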

How BERT Models are Applied to Search

The SEO game changer is the result of Google’s research on transformers: BERT models process each word in relation to all the other words in a sentence. In the past, Google processed words one by one, in order, hence the focus on keywords. BERT models consider the full context of a word by analyzing the words that come before and after it. Google Search now understands user intent better than ever before.
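As a rough illustration of the transformer idea, here is a minimal self-attention sketch in plain Python. The three-word sentence and its hand-picked 2-D vectors are invented for illustration; real BERT uses learned, high-dimensional embeddings and many attention heads. Each word's new representation becomes a weighted mix of every word's vector, with weights from a scaled dot-product:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(vectors):
    """Scaled dot-product self-attention over a list of word vectors.

    For simplicity the same vectors serve as queries, keys, and values
    (real transformers learn separate Q/K/V projections).
    """
    d = len(vectors[0])
    outputs = []
    for q in vectors:
        # Score this word against every word in the sentence, itself included.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        weights = softmax(scores)
        # New representation: weighted average of all the word vectors.
        outputs.append([sum(w * v[i] for w, v in zip(weights, vectors))
                        for i in range(d)])
    return outputs

# Invented 2-D embeddings for the words "traveler", "to", "usa".
sentence = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]
contextual = self_attention(sentence)
# Every output vector now blends information from the whole sentence.
print(contextual)
```

Because every word attends to every other word in one step, "to" can pull in information from both "traveler" and "usa" simultaneously, which a strict left-to-right pass cannot do.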

What about processing power? Wouldn’t we need some seriously beefed-up hardware just to let these BERT models process millions of websites and the information within them? For the first time, Google is using its latest Cloud TPU v3 Pods to serve search results and deliver information much more quickly. The Cloud TPU is the latest generation of supercomputer that Google built specifically for machine learning.

[Image: Cloud TPU v3 Pod]

The BERT Powered Search Results

By applying BERT models to both ranking and featured snippets in Search, Google can do a much better job of helping us find useful information. In fact, BERT will help Search better understand about 1 in 10 searches, particularly longer, more conversational queries, or searches where prepositions like “to” and “for” matter to the meaning. Searching on Google will feel much more natural, like having a conversation with someone who can answer every question.

Seems like something straight out of a sci-fi novel, doesn’t it? As an SEO enthusiast myself, I guess it’s time to really say goodbye to keywords. Check out the impact of the Google BERT update on a few searches below:

  • "2019 brazil traveler to usa need a visa"
    A search for "2019 brazil traveler to usa need a visa" hinges on the word "to" and its relationship to the other words in the query. This is someone traveling to the U.S. from Brazil, not the other way around. Before BERT, Google's search algorithms didn't understand the importance of this connection. With BERT, Google grasps that the word "to" actually means something here, and the result is now much more relevant.
  • "do estheticians stand a lot at work"
    Looking at another query, "do estheticians stand a lot at work," we can see Google was previously matching keywords: it matched the term "stand-alone" in a result with the word "stand" in the query, which was not the right use of the word "stand" in context. The BERT models now understand that "stand" relates to the physical demands of a job, and provide a more useful result.
  • "can you get medicine for someone pharmacy"
    With the BERT model, Google can better understand that "for someone" is an important part of this query. Previously, Google missed that meaning and returned general results about filling prescriptions.
  • "parking on a hill with no curb"
    Before BERT, a search like this would confuse Google: it placed too much importance on the word "curb" while ignoring the word "no," not understanding how critical that word was to an appropriate response. Google would return results for parking on a hill with a curb!
  • "math practice books for adults"
    BERT can better understand that "adults" was being matched out of context, and pick a more helpful result. Previously, the results page included a book from the "Young Adult" category.
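The common thread in these examples is that old-style keyword matching throws away exactly the small words that carry the meaning. A quick sketch in plain Python (the stopword list is a hypothetical stand-in for whatever pre-BERT Search actually ignored) shows how two opposite queries collapse into the same bag of keywords:

```python
STOPWORDS = {"to", "for", "no", "a", "the", "do", "at"}  # illustrative only

def keyword_bag(query):
    """Old-school reduction of a query to its 'important' keywords."""
    return {w for w in query.lower().split() if w not in STOPWORDS}

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

# Direction of travel is gone: both queries look identical as keyword sets.
print(keyword_bag(q1) == keyword_bag(q2))  # -> True
```

A context-aware model like BERT keeps word order and relationships, so "brazil traveler to usa" and "usa traveler to brazil" stay distinct, which is exactly the behavior the visa example above demonstrates.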

Join the Google BERT Update Conversation

Join the conversation on Facebook, Twitter, or LinkedIn.

Filed Under: Digital Marketing Blog, SEO Strategy

Copyright © 2021 · CYBERLICIOUS® · Privacy Policy · All Rights Reserved