SEO: Complete tutorial



Checklist for the SEO of a site

  • Hosting and domain (.com domains are preferred)
  • SSL
  • Robots.txt file
  • Sitemaps
  • Canonical issues
  • Google Search Console
  • Google Analytics

 

Prerequisite Terms

  • Crawling

A process in which a search engine such as Google or Bing visits a web page. At this stage the bots only check in on the webpage. These bots are also called spiders or crawlers.

  • Google Caching

Google takes snapshots of webpages and websites and stores them on its servers. This happens regularly and lets Google serve those pages remotely from its own copies. This process is known as Google caching.

  • Robots.txt

Crawlers or bots need to visit the pages, directories, and sub-directories of a website. But how do they choose what should be crawled and what should not? Every website has sensitive directories, such as cgi-bin, that should not be exposed to the internet, while other pages, such as blog pages, must be available. The robots.txt file solves this. It sits in the root folder of the website and lists search engine crawlers along with the directories they are allowed or disallowed to crawl.

Examples: Facebook's robots.txt disallows the album directories of users, while Amazon's robots.txt allows the Prime Music and AmazonMusicUnlimited sub-directories.

Robots.txt also plays an important role when it comes to Search Engine Optimization.
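For illustration, a minimal robots.txt might look like the sketch below (the directory names are placeholders, not taken from any real site):

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Allow: /blog/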

  • Sitemaps

Unlike robots.txt, a sitemap solves the purpose of URL inclusion, which helps in smart crawling. A sitemap includes a list of URLs along with some information about each URL, such as the priority of the page it points to and, optionally, its last modification date. The main objective of a sitemap is to link all the pages so that the crawler visits every one of them.

If someone manually links each and every page, they don't strictly need a sitemap, but that is rarely convenient. Also, SEO auditors such as WooRank recommend the use of sitemaps.
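A minimal XML sitemap with a single, placeholder URL and placeholder values typically looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2020-01-01</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>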

  •  Indexing

When a webpage or website is crawled for the first time, Google caches a snapshot of it and stores that snapshot in its database. Again, remember: the database stores only these snapshots of the page.

  • Ranking & Retrieval

The ranking and retrieval process is responsible for showing the most appropriate content. Whenever a user types a query, Google processes it so that the pages with the most relevant results appear on the first few pages. These results follow various Google search algorithms. Every page submitted to Google Search Console gets a rank, according to which it appears in the search engine.


________________________________________________________________________


All Google search algorithm updates: Everything you need to know

The Google search algorithm governs the ranking of pages. It ensures that pages are ranked in such a way that only relevant content is shown to the user. How does Google use the search algorithm to rank pages? In this article I will try to answer all your questions.

What is the Google search algorithm?

The Google search algorithm is the set of rules and ranking signals Google uses to decide which pages appear, and in what order, for a given query. Google refines it continuously through updates. All the major Google updates are discussed below.

1. Brandy Google update: Released in 2004

A major Google search algorithm update focusing on the concept of latent semantic analysis. This update became a turning point: no ranking algorithm was affected, it simply modified Google's index. With this change, Google tried to shift the focus to intent. Dynamically generated links ranked better, and anchor text became an important parameter. These links helped Google look at all the other pages of a website to evaluate whether those pages also contain the queries or search terms.

For example, suppose a three-page website has pages named Home, About, and Contact, of which only the homepage is indexed. The homepage has anchor text containing the site name, which is also a keyword, and that anchor is an internal link to the About page. With the Brandy update, the indexed homepage helps the search engine learn about the About page as well.
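As a rough sketch of such an internal link (the filename and anchor text here are made up for illustration):

<a href="/about.html">Example Site Name</a>  <!-- anchor text that doubles as a target keyword -->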

2. Panda Google update: Released in 2011

The Panda update helped the search engine rank pages according to their quality. Panda affected a large number of websites (more than 12%). Basically, Panda tries to give a quality score to websites. Until 2015 it was just an update, but in 2016 Google officially incorporated it into its core algorithm. Any website containing duplicate or thin content ranked lower, while a website with unique and relevant content ranked higher.

Panda also resulted in a sudden increase in the number of copyright infringement complaints, because original content was sometimes ranked lower than the pages that had copied it. It also became easier for Google to identify spammers. With the Panda update, intent-based quality ranking revolutionised SEO.

Duplicate content, thin content, inbound links, content quality, and the ratio of inbound links to queries all play an important role in ranking.

3. Penguin Google Algorithm: Released in 2012

Penguin officially became part of Google's core algorithm in 2017, with the objective of identifying websites using black-hat SEO practices and ranking them lower. When Penguin came out, more than 3% of websites on the internet were affected. The practices targeted were mainly paid backlinks and keyword stuffing.

With Penguin, inbound links coming from unrelated, low-quality websites became the major trigger. The objective of Penguin was to make websites follow the search engine's webmaster guidelines; a website that fails to do so gets penalized.

4. Hummingbird Google update: Released in 2013

Hummingbird came as a more personal search algorithm, based entirely on user intent. With Hummingbird, Google aimed to give results based on the intention of users. It also introduced the concepts of local search, the knowledge graph, and voice search. Let us discuss the problem before Hummingbird.

Suppose you type the keyword 'movie' and the results are exactly the same as for another keyword, 'movie making course'. That is a huge intent problem, and Hummingbird became the solution.

Local SEO is location-based SEO that uses 'Google My Business' listings to build SERPs for a nearby location. Because of local SEO, users may see different results for the same query. For example, two users, one in central Delhi and the other in west Delhi, will surely get different results for the same query, such as 'clubs near me'.

The knowledge graph is like a graph-based, detailed fact panel for any company, business, or query.

Also, with voice search, Google adopted NLP (Natural Language Processing) techniques to improve the search engine to a huge extent.

5. Pigeon Google update: Released in 2014

Since local SEO had already arrived with Hummingbird, the Pigeon update aimed to improve local search. When it comes to local search, Google My Business has a role to play, so Pigeon tries to build SERPs on the basis of location and distance, i.e. the nearest Google My Business page results. The Pigeon algorithm impacted local businesses to a huge extent: people started using directions to find businesses, which boosted many small and large businesses.

6. Mobile Google Algorithm: Released in 2015

The mobile update was nicknamed Mobilegeddon. With the mobile-first algorithm update, Google ensured that mobile-friendly pages rank better than non-mobile-friendly pages. With digitalisation, Google knows that more than 60% of people use it on mobile devices, so relevant SERPs should show result pages optimized for mobile devices and give users a better experience. The update affected search rankings only on mobile devices, and it worked at the level of individual web pages rather than complete websites. This update increased demand for AMP-enabled websites (AMP is Accelerated Mobile Pages).

7. RankBrain Google Algorithm: Released in 2015

Google fully adopted machine learning with the RankBrain algorithm. The quality of content got new metrics, which later became the most important focus areas. Google even considers RankBrain the third most essential criterion for ranking pages, as it aims at relevancy. UI experience, calls to action, and retention and bounce rates helped Google rank SERPs accordingly. Suppose the query 'How to make money online' brought up 40 results; the first few would be those with the best CTRs, low bounce rates, and high retention time. (CTR is click-through rate, the clicks-per-impression metric.)
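For example, if a result is shown 1,000 times (1,000 impressions) and receives 50 clicks, its CTR is 50 / 1,000 = 5%.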

Conclusion: RankBrain is a reputation-based algorithm, according to which the most reputed sites are ranked higher.

8. Possum Google update: Released in 2016

Possum is also a location-based algorithm, but with an exception: it filters the results. Local search mostly deals with Google My Business pages, and there comes a twist: a shopping complex may have more than 10 businesses in the same domain. Which businesses should appear? Possum tried to solve this issue by filtering the results.

The issue is that only 2-3 of them are visible, which may or may not be the best results for the user. As for the exception, Possum not only filters nearby results but also helps return SERPs for different physical locations. Before Possum, custom physical locations did not show results from nearby Google My Business pages; Possum made that possible and improved location-based SEO results.

9. Fred Google update: Released in 2017

Fred is one of those updates that targeted false practices used for aggressive monetization. These practices include a huge number of advertisements, low-quality content, redirections, affiliate links without value-driven content, and so on.

By 2017, Google had realized that blogging was dominating the market in terms of revenue generation, which is why more people were shifting to it. There had to be criteria that help quality bloggers gain while black-hat SEO tricksters are penalized.

10. BERT Google update: Released in 2019

BERT (Bidirectional Encoder Representations from Transformers) is an update that affected a huge number of results, around 10%. It is regarded as one of the most dominant updates after RankBrain. Since Natural Language Processing had already been adopted by Google, with this update Google tried to apply artificial intelligence in a more refined manner. The objective was to understand the context of queries, which depends on the type of phrase searched.

For example, consider a user who searches "How to make a business plan?" and then "How can business plans be made effectively?"

With BERT, Google will return more relevant results for both of these queries. Even though the two queries are closely related, the search engine will try to understand the context of what the user is searching for along with the intention.

Therefore we can say that the focus, which had already shifted from keywords to intent, now shifted to context combined with intent.

______________________________________________________________________

Best Free Tool for keyword research: Google Keyword Planner

GKP (Google Keyword Planner) is the best free tool for building your list of keywords. So let's use it to get a few keywords.

Steps

  • Type ‘Google Keyword Planner’ in your browser and search.
  • Follow the first link and, when the results page loads, click the ‘Go to Keyword Planner’ button.
  • Choose an Ad account to continue with.
  • When you enter GKP, you will see two options: ‘Discover new keywords’ and ‘Get search volume & forecasts’.
  • We generally use GKP in that order: first get some keyword ideas, then their forecasts.
  • Click on ‘Discover new keywords’. Enter your products or services along with the location and click ‘Get results’.
  • The results are shown for the queries we provided and for the ideas related to them.
  • An important thing to understand about these results: the average monthly searches may not match your expectations, because the numbers are not exact. For example, the term ‘IPL’ gets most of its impressions (say 100K or more) during only 2 of the 12 months, yet Google will show an averaged range (100K-1M), or it may show the highest search volume, which may not be the exact figure for that keyword.
  • Download the list of ideas if you want.
  • Now let us check the forecast of a few keywords. Check it directly by clicking ‘Keywords’ in the sidebar, or select ‘Get search volume and forecasts’ the next time you load Keyword Planner. From the results, you can choose keywords for ad campaigns as well.

Every business or digital marketer has their own strategy for choosing keywords, depending on their goal and budget. Goals vary from increasing traffic and getting more downloads to converting more. So I am not disclosing my own keyword strategy, because it might stop your mind from doing its own research and discovering its own approach.

Hacks for GKP

There are a few less-known hacks that can help you get the best out of GKP. To know about them, comment below. If we get a good response from interested readers, we will publish those hacks and other tools in our upcoming blogs.

________________________________________________________________________

Technical SEO | What are the technical requirements for SEO


Technical SEO deals with the technical requirements that need to be fulfilled for better rankings. In technical search engine optimization, we try to address the following issues for any website:

  • Security of the website
  • Crawling & indexing of the website
  • Linking and traversing
  • Setting up analytics for the website
  • Accessibility and user interface

Prerequisites to get started with technical SEO

  • A website consists of one or more webpages designed and developed by anyone. It may contain text, images, videos, and audio links. A professional coder always prefers to code a website, but not everyone is a coder, so for people with a non-technical background we have something known as a CMS, i.e. a Content Management System. A CMS enables end-users to make websites without code or with minimal code. WordPress is one such example, and it is also the most popular CMS.
  • Now, along with a website, we need two more things if we wish to have our website on the internet: a domain and hosting.
  • Domain: the name of a website, like computerservicesolutions.in. You need to purchase this name from companies like BigRock, GoDaddy, etc.
  • Hosting: to understand hosting, let's understand how the internet works. Whenever we wish to find something on the internet, we type it into our search engine. The search engine acts as a matchmaker: it connects us with a computer that holds the information we require. Remember, this computer is connected to the network of networks, known as the internet. Also, this computer works 24×7, i.e. it is never turned off.
  • Similarly, for our websites, we need to upload our website to such computers across the world. Conventionally, we have two different types of hosting.

Note: The computer requesting information is known as the client.

The system fulfilling the request is known as the server.

Types of Hosting

  • Shared: we host our website on a server that hosts more than one website. All the resources are shared among the websites, which sometimes increases the request fulfillment time. Shared hosting is economical and costs around ₹4000-₹5000 per year.
  • Dedicated hosting: a bit more expensive than shared hosting; here a server hosts only one website. All the resources are allotted to that single website, which makes it fast, so the request fulfillment time is very low.
  • In today's world we also have a newer type of hosting called cloud hosting, in which multiple interconnected servers host a website. It is expensive, fast, and secure.
  • Once we have all these things available and connected, we are ready to go. Remember, first you need to upload your website and database files to the hosting server and connect your domain to the website. If you purchased hosting and the domain together from the same vendor, they are connected automatically. If not, you can connect them by pointing the domain's nameservers to your hosting provider, or by adding the server IP in the DNS settings (see the sketch after this list).
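As a rough, hypothetical illustration (the hostnames and the IP address below are placeholders, not real values), the two common ways to connect a domain to hosting look roughly like this in the registrar's DNS settings:

Nameserver method (hand the whole domain over to the host):
NS    ns1.your-hosting-provider.example
NS    ns2.your-hosting-provider.example

A-record method (point the root domain at the server's IP):
A     @     203.0.113.10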

So, let’s start with our technical SEO.

Security of the website

When we talk about the security of a website for SEO, we talk about encryption, confidentiality, and SSL. Although there are many other security issues, they are not relevant to SEO, so we will not cover them here.

  • Encryption: to prevent unauthorized access to data traveling between the sender and receiver, i.e. the client and the server, the message is converted into code using keys, known as private and public keys.
  • Confidentiality: confidentiality is directly related to the privacy of the data. The aim is to keep the communication confidential so that no outsider or unauthorized person can use it for any unfair purpose.
  • SSL: SSL (Secure Sockets Layer) certificates are very important for technical SEO. To implement encryption and meet confidentiality, we need to deploy SSL on our website. SSL fulfills the purpose of secure communication over a network. You might have seen a lock icon in the address bar of websites like Facebook; that lock indicates the website is equipped with SSL. So, for SEO, security mainly means deploying SSL on the website.

Steps to do this

  1. If you have enough money to spend, you can buy SSL from your hosting service provider. This is the simplest way: you just toggle the SSL option after purchasing. You can also buy SSL from your domain registrar, but then you need to install it on your server yourself.

  2. If you don't want to spend money, you can still get SSL for free. An organization named Let's Encrypt provides free SSL certificates.

  3. Remember this: Let's Encrypt certificates expire every 90 days, so whether you are using WordPress with a plugin or installing manually, you need to renew them. Pro plugin features can renew automatically, but they require a pro license. AWS can also do it automatically. (If you have shell access, see the certbot sketch after these steps.)

  4. The first step in manually installing SSL on your server is to generate three certificate files. On WordPress you can use an SSL plugin that will automatically generate the files from Let's Encrypt.

  5. If you want to do it without WordPress, here are the steps.

  6. Go to zerossl.com.

  7. Enter your domain.

  8. Verify your domain using any of the given methods. You typically just need to copy some text into a new file and upload it to the root directory of your server.

  9. A certificate will be generated after verification, with three files (crt, key, cabundle).

  10. Log in to the cPanel of your hosting and go to SSL.

  11. Choose custom SSL and paste the text from those three files there.

  12. Click install and activate it after a successful installation.
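As an aside, if your server gives you shell access (many shared cPanel hosts do not), Let's Encrypt's own client, certbot, can issue and renew certificates automatically. This is only a hedged sketch with a placeholder domain, not a required part of the steps above:

sudo certbot --apache -d example.com -d www.example.com   # issue and install a certificate for the site
sudo certbot renew --dry-run                              # confirm that automatic renewal will work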

Remember: forcing HTTPS is good practice for technical SEO. You can do it directly from your cPanel if it offers that option, or you can set up rewrite rules in a file and upload it to the root directory of your server.

Rules

RewriteEngine On

RewriteCond %{HTTPS} off

RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

 

These rules go into a file named .htaccess in the root directory of your server; you can create or edit it from the File Manager in cPanel. In all likelihood the .htaccess file already exists on your server but is hidden, in which case you can simply edit it. And that's it, we have deployed SSL successfully.

Crawling & Indexing of the website

To ensure crawling and indexing of a website on the Google search engine, we need to submit the website to Google Webmaster Tools, now known as Google Search Console. To do so, follow these steps:

  • Go to Google Search Console.
  • Log in using a Google account.
  • Click on 'Add property'.

You will see two methods: add the property using a domain or using a URL prefix. Choose either method to add the property; I recommend the URL method.

Type the URL. Remember, we have deployed SSL, so use https://<website_name>

Now you need to verify ownership of your website, so select a verification method.

You can do this via any of the given methods. I recommend the HTML file verification method: you just upload an HTML file to the root directory of the website. You can also use the HTML tag verification method, in which you place a tag in the head of your index file.
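For the tag method, the snippet Google gives you typically looks like the line below; the content value here is only a placeholder, so paste the exact tag shown in Search Console into the <head> of your index file:

<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />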

Once you verify the website, the submission is complete. Google will automatically crawl and index it if there are no errors.

Linking and traversing of website

This process deals with creating a sitemap and a robots.txt file and submitting them to Search Console. You can use one of the automatic sitemap generators available on Google; for WordPress I recommend the plugin named Google XML Sitemaps. Once a sitemap is created, follow these steps:

  1. Go to Search Console and log in to your property dashboard.

  2. Click on 'Sitemaps' in the sidebar.

  3. Enter the URL of your sitemap and submit.

  4. Google will crawl it automatically.

Robots.txt

  • Identify the directories you don't want Google to crawl.
  • Search Google for a robots.txt generator.
  • Generate the file with any tool by submitting your URL.
  • Download and edit the file so that only the required directories are allowed to be crawled.
  • Add a link to your sitemap in the robots.txt file as well (see the example after this list).
  • After saving the file, upload it to your website's root directory.
  • Search for the robots.txt tester and go to the old Search Console. You can submit your robots.txt file there, or fetch it directly from the root directory by typing the URL of the robots file, and test it.
  • To know more about sitemaps and robots.txt files, click here.
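A hedged example of what the finished file might contain (the directory and domain names are placeholders):

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml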

Setting up analytics

Google Analytics is a complete topic in itself, so instead of discussing it fully in this blog we will just focus on the setup steps.

Steps are as follows:

  • Go to Google Analytics and click on 'Add property'.
  • Type your details, like the website address, and select the category of your website.
  • Submit, and you will get a tracking code; put this in the head section of your index file (see the sketch below).
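The tracking code Analytics provides for the gtag.js setup generally looks like the sketch below; 'GA_MEASUREMENT_ID' is a placeholder, so always paste the exact snippet from your own Analytics property rather than this one:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>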

Accessibility and user experience for Technical SEO

Accessibility and user experience deal with the experience of a user, which depends on various factors:

  1. Fast loading time for the website; typically a website should load within 2 to 5 seconds.

  2. All locations must be accessible i.e. proper navigation mechanism for a website.

  3. A good color scheme with only 2-3 colors.

  4. Interactive pages.

  5. Calls to action should be placed in ideal locations, etc.

Finally, if you want to check the SEO score of your website you can use seositecheckup, neilpatel.com, or seobility. These websites also guide you in improving your SEO score.


If you have any doubts, please let me know.