Google’s BERT Explained | Quick Wins #6

Bidirectional Encoder Representations from Transformers - BERT

Google just rolled out its biggest update to search in roughly five years, and its most significant since RankBrain.

The changes will affect roughly 1 in 10 search queries, as well as featured snippets.

In other words, this is a pretty important update for anyone who uses search engines. And unless you live under a rock, you use Google.

BERT stands for Bidirectional Encoder Representations from Transformers. Yes, that’s about as clear as mud. Let’s break it down in plain English.

BERT is going to help Google understand the context of searches. In other words, Google will be able to better answer the questions people are asking. That means opportunity for those of us willing to put in the time and effort to produce badass content that answers very specific questions.

Google is putting less and less emphasis on keyword density and other traditional ranking factors, and more emphasis on quality content that answers very precise questions.

I’ve written about search intent in the past and how we all need to understand the intent of our customers when they type something into the Google box. Understanding intent and how people are searching just took another big step forward. Focus less on keywords and more on precisely answering the questions your customers have.

If you don’t already have research tools you rely on, here are my two favorites:

Ubersuggest

SEMRush
