Google BERT & Understanding Natural Language Searches
As well as being our favourite muppet, BERT (Bidirectional Encoder Representations from Transformers) is also an AI language model designed to help Google better understand natural language and deliver better, more intuitive search results.
BERT was introduced by Google in late 2018 and was rolled into the Search algorithm in October 2019. Described by Google as the biggest leap forward for Search in five years, BERT had a large impact on search results and is now said to be used on almost every English-language query in Google Search.
Since its launch, BERT has had a dramatic effect on the way Google responds to different types of queries and on the quality of the results it returns.
What is Google BERT For?
BERT helps search users access more accurate, relevant results from natural language queries. It comes into play for longer, more conversational searches, where a deeper understanding of language is needed to deliver the best results. Here’s an example from Google:
“Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
When people search online they often use what Google refer to as ‘keyword-ese’; that is, they search using strings of keywords rather than the way they would naturally ask a question. BERT uses machine learning and natural language processing to understand the nuances of language and the context of the whole search query, rather than the main keywords one at a time.
An Example of BERT in Action
Google have provided a number of real-world examples of the impact of BERT, including the two below.
In the first example, the word ‘to’ and its position in the query are important to its meaning. Before BERT, the search algorithm could not ascertain that the query ‘2019 brazil traveler to usa need a visa’ is specifically about travelling from Brazil to the US, and not the other way round. BERT understands the importance of this small word and can therefore provide a better response.
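The difference is easy to see with a toy sketch. The snippet below is purely illustrative (it is not Google’s implementation, and the stop-word list is our own assumption): it mimics old-style keyword matching, which strips out small function words like ‘to’, and shows how two queries with opposite meanings collapse into the same bag of keywords.

```python
# Toy illustration only - NOT Google's actual algorithm.
# Classic keyword matching discards "stop words" such as "to" and "a",
# so two queries with opposite travel directions look identical.
# Context-aware models like BERT keep every word and its position.

STOP_WORDS = {"to", "a", "the", "for", "do", "can"}  # assumed example list

def keyword_ese(query: str) -> set:
    """Reduce a query to an unordered set of 'main' keywords."""
    return {word for word in query.lower().split() if word not in STOP_WORDS}

q1 = "brazil traveler to usa need a visa"
q2 = "usa traveler to brazil need a visa"

# Keyword-only view: the direction of travel is lost entirely.
print(keyword_ese(q1) == keyword_ese(q2))  # True - same keyword set

# Keeping word order and function words preserves the meaning.
print(q1.split() == q2.split())  # False - 'to' and position still matter
```

A keyword-only model sees these two queries as identical; a model that reads the whole query in order, as BERT does, can tell them apart.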
In the second example, BERT is able to determine the meaning of ‘stand’ in the context of standing up. Previous models grouped ‘stand’ with ‘stand-alone’, which in this context would not provide accurate results.
How Might BERT Affect My Site?
BERT changed the way NLP (Natural Language Processing) tasks are handled and impacts a large number of queries. It was not created to define ‘good’ and ‘bad’ sites, nor to directly reward or penalise them. Instead, its aim is to deliver more relevant, higher-quality search results for users, which means more relevant, higher-quality traffic for your site. BERT focuses on the user’s search query and looks for the most relevant, rich content to return for it.
To make the most of the BERT functionality, site content should be clean, compelling, rich and unique. Low-quality, thin content is now even less likely to be picked up for inclusion in SERPs (search engine results pages). In addition, the volume of relevant, helpful content on your site is increasingly important for many of the longer-tail, natural language searches that BERT is becoming so good at interpreting.
How Can Innovation Visual Help?
As a marketer, it makes sense to understand the way BERT works to improve your website rankings and achieve greater visibility. We can review and advise on existing content, as well as work with you to create and implement a tactical content plan for your business to help maximise relevant traffic and conversions.
Get In Touch
Get in touch with us to talk about getting your content ready for BERT. You can drop us a line via our form here or give us a call on 0333 772 0509. We’re looking forward to hearing from you.