Understanding how search engines work is an essential step in building your SEO knowledge.
If you know how they work, you will know what to do on your website to improve your rankings.
What is the role of search engines?
The role of search engines is simple: to provide users with the content they need.
How does Google work?
We can break down how Google works into 4 steps:
- crawl the Web
- index pages
- rank pages
- serve results
The final step is for the search engine to deliver the search results to end users.
How does Google crawl the Web?
In the SEO world, we use the word “crawl” to describe the process search engines use to explore the Web.
The Web is composed of billions of documents, including webpages, PDFs, images, and more.
To discover content, Google’s bots (also called crawlers or spiders in the SEO industry) browse the Web from one page to another by following the links they find along the way.
This is why internal linking and backlinks are so important for SEO.
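The crawling process above can be sketched as a breadth-first walk over links. This is a toy illustration, not Google’s actual crawler: the pages and links here are an invented in-memory graph rather than live URLs.

```python
from collections import deque

# Hypothetical in-memory "web": each page lists the pages it links to.
# A real crawler fetches live URLs; this toy graph just illustrates the idea.
LINKS = {
    "/home": ["/blog", "/about"],
    "/blog": ["/home", "/blog/post-1"],
    "/about": ["/home"],
    "/blog/post-1": ["/blog"],
    "/orphan": [],  # no page links here, so the crawler never discovers it
}

def crawl(start_page):
    """Discover pages by following links, breadth-first, like a spider."""
    discovered = []
    queue = deque([start_page])
    seen = {start_page}
    while queue:
        page = queue.popleft()
        discovered.append(page)
        for link in LINKS.get(page, []):
            if link not in seen:  # don't crawl the same page twice
                seen.add(link)
                queue.append(link)
    return discovered

print(crawl("/home"))
```

Note that "/orphan" is never found: without internal links or backlinks pointing to a page, a crawler cannot discover it, which is exactly why linking matters for SEO.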
Note: to prevent Googlebot from exploring certain parts of your website, you can use the “Disallow” directive in your robots.txt file.
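For example, a robots.txt file at the root of your domain could block crawling of a section of the site (the paths below are illustrative):

```
# Block all crawlers from the /admin/ section (paths are illustrative)
User-agent: *
Disallow: /admin/
```

A rule under “User-agent: *” applies to every crawler; you can also target a specific bot, such as “User-agent: Googlebot”.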
How does Google index your pages?
Once Google finds new pages, it needs to add them to what we call Google’s Index.
The Google Index contains billions of pages that Google has crawled over the years.
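Conceptually, a search index works like an inverted index: a map from words to the pages that contain them, so that a query term can be looked up without re-reading every page. A minimal sketch, with made-up pages and text:

```python
# Toy pages to index; a real index holds billions of documents.
PAGES = {
    "/seo-basics": "seo basics for beginners",
    "/crawling": "how search engines crawl the web",
    "/ranking": "seo ranking factors explained",
}

def build_index(pages):
    """Map each word to the set of page URLs containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

index = build_index(PAGES)
# Looking up a query term returns the pages that mention it.
print(sorted(index["seo"]))
```

This is why an unindexed page can never rank: if it is not in the map, no query can ever retrieve it.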
Depending on the robots meta tag used, “index” or “noindex”, Google will or will not add the page to its index.
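For example, a page you want to keep out of the index would carry this tag in its head section:

```
<!-- Tells search engines not to add this page to their index -->
<meta name="robots" content="noindex">
```

Without this tag, indexing is allowed by default, so “index” rarely needs to be declared explicitly.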
If you want to know how many pages of your website are indexed by Google, you can type “site:yourdomain.com” into Google’s search bar.
Once a page is in the Index, it’s analysed and ranked according to hundreds of factors.
How does Google rank a page?
To provide search results that match users’ expectations, Google’s engineers have created different algorithms that analyse more than 200 factors.
We can distinguish 4 different areas that are taken into consideration:
- Authority: how popular is a page / a website / a brand?
- Content: how relevant is the content?
- Technical: does it meet Google’s technical requirements?
- User experience: do users stay on the page?
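To get an intuition for how several areas could combine into one ranking, here is a deliberately simplified scoring sketch. The weights, page names, and per-area scores are all invented for illustration; Google’s real weighting is unknown and far more complex.

```python
# Hypothetical weights for the four areas (they sum to 1.0);
# Google's actual weighting is not public.
WEIGHTS = {"authority": 0.35, "content": 0.35, "technical": 0.15, "ux": 0.15}

def score(page_factors):
    """Combine per-area scores (each between 0 and 1) into one ranking score."""
    return sum(WEIGHTS[area] * page_factors[area] for area in WEIGHTS)

# Made-up pages with made-up per-area scores.
pages = {
    "/great-guide": {"authority": 0.9, "content": 0.8, "technical": 0.7, "ux": 0.9},
    "/thin-page": {"authority": 0.2, "content": 0.3, "technical": 0.9, "ux": 0.4},
}

# Order pages by descending score, as a results page would.
ranking = sorted(pages, key=lambda url: score(pages[url]), reverse=True)
print(ranking)
```

The point of the sketch: a page strong on one area (here, "/thin-page" is technically sound) can still rank poorly if its content, authority, and user experience are weak.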
To analyse all of these factors, Google uses different algorithms, such as:
The Panda algorithm was launched in 2011. It aims to devalue websites that offer low-quality content, such as duplicate content.
Launched in 2012, Penguin was a game changer in the SEO industry. The purpose of this algorithm is to penalise websites that use black-hat techniques such as keyword stuffing, cloaking, and spammy links.
Released in 2013, Hummingbird focuses on semantics to help Google better understand natural language.
In the SEO world, the main objective of semantics is to organise data in order to help search engines understand it.
Pigeon was released in 2014 and aimed to improve local search results.
One of the most recent algorithms, Fred aims to penalise websites that offer a poor user experience, such as those displaying too many ads or using interstitial popups. Websites penalised by this update are mainly those offering low-quality content.
RankBrain is an artificial-intelligence system linked to Hummingbird. Launched in 2015, it helps Google better interpret search intent and natural-language queries, which makes Google better at voice and conversational search.