Freshmarketers.com
Become an expert digital marketer with a detailed course, from the basics to the advanced level.

SEO algorithms for beginners- Made Easy (2022)

Can search engines be compared to a library? Check out this blog to know why and how. P.S.: you have a role in it too.

How does a search engine work?

In order to understand search engine algorithms, you first need to understand how search engine results come about.

Search engines hold tons of data.

Who manages all that data and serves us the best results for our queries?

Let’s look at a hypothetical example:

Assume there is a big library with billions of books, and you are a visitor. You want to read a book on the topic “Yoga”, but how do you choose from the thousands of books in the catalogue?

That’s when you go to the librarian and ask for the best 5 books on yoga. The librarian is perplexed by your request. What a nutjob, he thinks!

But let’s assume the librarian is a genius who has read every book in the library. He takes out his best 5 picks and hands them to you.

Now, in our daily lives, these fictional characters have real counterparts:

Library = Search engines (Google)

Visitor = User/Searcher

Topic “Yoga” = Search query

Librarian = Web Crawlers/Spiders

Best 5 books = results on SERP

Here is what happens when a user searches for a topic on a search engine:

  • In the first stage, web crawlers (also called spiders) explore the web, read pages, and download their text, images, and videos. This process is known as crawling. Crawlers regularly look for new and updated pages and add them to the existing list of known pages; finding those new URLs is called “URL discovery”.
  • In the second stage, the machine analyses the content of each page to understand what it is about. This process is known as indexing. It is a rather complicated process, which we will study in other posts.
  • In the third stage, when a search query is entered, the indexed data is analysed and results matching the query are presented. The results may differ between users depending on various factors such as location and search history.
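The three stages above can be sketched as a toy program. This is nothing like Google's actual pipeline (the page contents and URLs below are invented for illustration), but it shows the crawl → index → serve idea with a simple inverted index:

```python
from collections import defaultdict

# A hypothetical in-memory "web": URL -> page text. A real crawler would
# fetch these pages over HTTP and follow links to discover new URLs.
PAGES = {
    "site.example/yoga-basics": "yoga poses for beginners breathing basics",
    "site.example/yoga-history": "the history of yoga and meditation",
    "site.example/cooking": "easy pasta recipes for beginners",
}

def crawl_and_index(pages):
    """Stages 1 + 2: visit every page and build an inverted index
    (word -> set of URLs whose text contains that word)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def serve(index, query):
    """Stage 3: return the URLs that match every word in the query."""
    results = [index[w] for w in query.split() if w in index]
    if not results:
        return set()
    return set.intersection(*results)

index = crawl_and_index(PAGES)
print(serve(index, "yoga beginners"))  # only the yoga-basics page has both words
```

Real search engines add ranking on top of this lookup, which is exactly where the algorithms discussed below come in.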

But how do they choose which results to show? In which categories are web pages indexed?

That’s where algorithms come into play. Crawlers act on several algorithms that have been specified for them. Can we say the algorithm is the brain of the crawler?!

In fact, Google’s algorithms rely on more than 200 ranking signals.

What is a search engine algorithm?

A search engine algorithm is a set of rules that determines how relevant a website’s content is to a query, and therefore which page of results, and which position, a website appears at.
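One way to picture "200+ signals feeding one ranking" is a weighted score per page. The signal names and weights below are entirely made up (Google's real signals and weights are secret), but the shape of the idea is the same:

```python
# Hypothetical ranking sketch: a page's score is a weighted sum of signals.
# These three signals and their weights are invented purely for illustration.
WEIGHTS = {"content_relevance": 0.5, "backlink_quality": 0.3, "page_speed": 0.2}

def rank_score(signals):
    """Combine per-page signal values (each between 0 and 1) into one score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

pages = {
    "a.example": {"content_relevance": 0.9, "backlink_quality": 0.4, "page_speed": 0.8},
    "b.example": {"content_relevance": 0.6, "backlink_quality": 0.9, "page_speed": 0.5},
}

# Sort pages by score, highest first: this ordering is the "ranking".
ranking = sorted(pages, key=lambda url: rank_score(pages[url]), reverse=True)
print(ranking)
```

Each algorithm update described below can be thought of as changing which signals exist and how heavily they count.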

Google updates its algorithms every day; the changes can be small or big. Let’s look at the most important changes made:

1. Panda

Date: February 24, 2011

Focus- Quality of content, thin content- low value /quality websites

Earlier, search engines relied heavily on keywords to rank websites.

Content creators would stuff their websites with the most-searched keywords, which hampered the user experience; nonetheless, those websites ranked well and got traffic because of this loophole in the search engine.

Google launched Panda to eliminate black-hat SEO tactics and web spam. It punished websites that thrived on producing low-quality content or copying content from other sources, known as “low-quality sites” or “thin sites”.

Such websites are categorised as “content farms”: poorly designed, repetitive, and stuffed with far too many keywords.
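Keyword stuffing of the kind Panda targeted is easy to spot even with a crude check. The sketch below measures what fraction of a page's words are a single keyword; the 10% threshold and both sample texts are invented for illustration, not any real Panda rule:

```python
def keyword_density(text, keyword):
    """Fraction of the words in the text that are the given keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

# Two made-up snippets: a stuffed one and a naturally written one.
stuffed = "yoga yoga best yoga cheap yoga yoga classes yoga yoga"
natural = "our small studio offers gentle beginner yoga classes every weekend morning here"

# A 10% threshold is purely illustrative; real quality systems are far subtler.
print(keyword_density(stuffed, "yoga") > 0.10)   # the stuffed page trips the check
print(keyword_density(natural, "yoga") > 0.10)   # the natural page does not
```

The point is not the exact number, but that mechanical repetition leaves a measurable fingerprint that quality algorithms can punish.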

On the other hand, it gave websites with original content higher ranks.

It is not exclusively about duplicate content or fishy links; the focus of this algorithm is on encouraging creators to produce high-quality, unique content.

Its name comes from Google engineer Navneet Panda who was behind this algorithm.

Before implementing this algorithm, engineers examined various websites on the basis of looks, user experience, trustworthiness, redundancy, originality, information value, ads displayed on pages, and more.

To read about this algorithm in depth, I suggest checking out this article on SEJ.

2. Penguin

Date: April 24, 2012

Focus- Link building practices, link spam, anchor text

Earlier, website owners engaged in unnatural link-acquisition practices such as link exchanges, buying links, and manipulative link schemes.

Penguin analysed the links pointing towards a website (incoming links).

Let’s say 5 websites point to a site via links, but all of those links were bought to drive traffic. This algorithm penalised such sites for having poor-quality links.

Another aspect that Penguin covers is anchor text. Anchor text is the clickable text that refers to another web page; it carries a link, so when you click on it, a new web page opens. For example, the word “blog” in the last paragraph before the Hummingbird section is an anchor text.
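In HTML, anchor text is simply the text between `<a href="...">` and `</a>`. As a sketch of how a crawler might read it, the snippet below pulls (anchor text, link) pairs out of an HTML fragment using Python's standard-library parser; the sentence and URL are made up:

```python
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collect (anchor text, href) pairs from an HTML snippet."""

    def __init__(self):
        super().__init__()
        self.pairs = []      # finished (text, href) pairs
        self._href = None    # href of the <a> tag we are currently inside
        self._text = []      # text fragments seen inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.pairs.append(("".join(self._text).strip(), self._href))
            self._href = None

html = 'Read my <a href="https://example.com/blog">blog</a> for more.'
collector = AnchorCollector()
collector.feed(html)
print(collector.pairs)  # [('blog', 'https://example.com/blog')]
```

Here “blog” is the anchor text, and it tells the search engine what the linked page is probably about.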

Earlier, website owners manipulated rankings by using anchor text that was unrelated or irrelevant to the page it linked to.

Example: suppose I put “SEO” as the anchor text of a link to my blog everywhere on the internet (in posts, comments, and so on), so that anyone who wants to read about SEO lands on my page by clicking it.

The search engine would then be tricked into thinking my website is the best result for anyone searching for SEO, so my website would rank for SEO queries.

This was how manipulation worked before this algorithm.

If you would like to read about Penguin all the way back in 2012, then read this interesting blog.

3. Hummingbird

Date: August 22, 2013

Focus- User intent, Search results

Unlike Panda and Penguin, Hummingbird was a major change to Google’s search engine. It was not an update to an existing algorithm; rather, a whole new algorithm was written.

Hummingbird was introduced to understand the user’s intent to provide better search results.

Traditionally, search engines would focus on the keywords in a search query whereas Hummingbird would focus on the meaning behind the query.

The goal was to return results matching the meaning of the search query, by understanding longer search queries as a whole.

Ex: if a user searched for “best place for Indian”, the new algorithm could pick up the cue that the user wants suggestions for good Indian restaurants, instead of showing places where Indians live.

The algorithm learned to identify important words and words to ignore in a search query. In queries, some words do not matter as much as others.
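Separating the words that matter from the ones that don't can be sketched with a stop-word filter. The tiny stop-word list below is hand-picked for this example and is nothing like what Google actually uses:

```python
import re

# A small, hand-picked sample of low-information words; purely illustrative.
STOP_WORDS = {"the", "a", "an", "for", "in", "of", "to", "where", "can", "i", "find"}

def important_words(query):
    """Keep only the words that carry the query's meaning."""
    words = re.findall(r"[a-z']+", query.lower())
    return [w for w in words if w not in STOP_WORDS]

print(important_words("Where can I find beautiful colourful carpets around me?"))
# ['beautiful', 'colourful', 'carpets', 'around', 'me']
```

Hummingbird goes much further than dropping filler words (it models the meaning of the whole phrase), but this shows the basic idea of weighting some query words over others.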

This algorithm was also intended to help voice search by understanding conversational, natural-language queries. Usually, when typing, we phrase queries more tersely than when speaking.

Example: if I want to look for carpets in my area, I might type something like “beautiful colourful carpets in my area”. The results will show carpet sellers around my location.

And with voice search, I might say something like “Where can I find beautiful colourful carpets around me?”.

I hope by now you have an idea of how this algorithm completely changed the way we search online. Searches are now more precise and match our intent exactly, all thanks to Hummingbird.

If you are interested in reading more about this algorithm, go here.

Why don’t search engines explicitly publish/ announce their algorithms?

Search engines do not want anyone to misuse the algorithms.

Search engines want to keep the user experience great; therefore, they want only good-quality, relevant content to be shown.

Only those who read extensively about search engines and measure a website’s performance regularly are able to notice whether an existing algorithm has changed or a new one has been introduced.

There are certain signs:

If a website’s traffic or ranking changes significantly without any major changes to the website itself, chances are a new update has been rolled out.

One can also keep up to date by following Google SearchLiaison (@searchliaison) on Twitter. There are also various external tools that monitor changes in web pages; here you can find more about them.
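The simplest form of such page monitoring is fingerprinting: hash a page's content on a schedule and alert when the hash changes. The sketch below uses two hypothetical snapshots in place of real HTTP fetches:

```python
import hashlib

def content_fingerprint(html_text):
    """Hash a page's content; a changed hash means the page changed."""
    return hashlib.sha256(html_text.encode("utf-8")).hexdigest()

# In practice you would fetch the live page over HTTP on a schedule;
# these two made-up snapshots stand in for yesterday's and today's fetch.
yesterday = "<h1>SERP tracker</h1><p>Rank: 4</p>"
today     = "<h1>SERP tracker</h1><p>Rank: 9</p>"

changed = content_fingerprint(yesterday) != content_fingerprint(today)
print(changed)  # True: the page content differs between snapshots
```

Commercial rank trackers layer scheduling, diffing, and alerting on top of this basic compare-the-snapshots idea.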

Keep yourself up to date with exciting blogs on digital marketing, explained in easy words, by subscribing to my newsletter. I also post frequently on LinkedIn @ Kajalnehru.
