Google Search Console tutorial for SEO and traffic

Google Search Console (formerly Google Webmasters) is a must-use tool for every website owner. It is a free tool from Google, and it acts as an indirect communication channel between Google and the website owner. This step-by-step tutorial will cover:

  • Getting started with Google Webmasters
  • Details of features offered by it
  • Testing your site with it
  • Finding out issues of your website using it
  • SEO using Google Webmasters

Getting Started with Google Webmasters

Adding Site to Google Webmasters

This is the first step. If you have already added your website, you can skip this part and move on to the next. You can sign up for Google Webmasters with your Gmail user name and password using this link. After creating the account, you will see the Add a Site button on the top right-hand side. Clicking the button opens the pop-up shown below.

Google Webmaster Add Website tool

Enter your website URL in the box. I would recommend entering the non-www version of your website URL. After that you will be directed to the verification page.

Verifying Site with Google Webmasters

You can use any one of the following methods for verification:

  • HTML file upload
  • HTML tag
  • Google Analytics
  • Google Tag Manager

But HTML file upload is the way to go; it is a very simple method to implement. Download the verification file from Google Webmasters, then upload it to the root directory of your web hosting.
After uploading the file, click the Verify button on the Webmasters page to verify your site. Once your website is verified, you will be directed to the Dashboard.
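Before clicking Verify, you can check for yourself that the file is actually being served from your root directory. A minimal sketch in Python; the domain and file name below are placeholders, not values issued by Google:

```python
import urllib.request

def verification_file_reachable(domain, filename):
    """Return True if the verification file is served from the site root."""
    url = f"http://{domain}/{filename}"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except OSError:
        # DNS failures, connection errors and HTTP errors all land here
        return False

# A domain that does not resolve simply reports False:
print(verification_file_reachable("no-such-site.invalid", "google12345.html"))  # → False
```

If this prints False for your real domain and file, fix the upload before asking Google to verify.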
Now we will cover the different options Google Webmasters provides.

Google Webmasters Crawl option

Adding Sitemap to Google Webmaster

Sitemaps are a way to inform Google about the pages on your website. To add a sitemap, click on the Sitemaps option under the Crawl option.
On the right-hand side you will see the Add or Test Sitemap option. You can use it to check your sitemap for errors before submitting it to Google. If you are using WordPress, you can use my sitemap plugins (image sitemap, mobile sitemap) as well.

Google Webmaster Sitemap tool

On this page, after a few days, you will see details such as:

  • Number of pages, images, and videos submitted by you
  • Number of pages, images, and videos indexed by Google
  • Issues with your sitemap
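For reference, a sitemap is just an XML file listing your URLs. A minimal sketch following the sitemaps.org protocol; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/sample-post/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

Sitemap plugins generate and update this file for you automatically.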

Verifying your robots.txt file

robots.txt is a simple yet powerful tool. If wrongly configured, it can ruin your SEO efforts. It is used to hide parts of your website from the eyes of search engines. You can get an idea of the URLs blocked by your robots.txt file using the Blocked URLs sub-option under the Crawl option. Google automatically downloads your robots.txt file. This page shows:

  • Status of your robots.txt file. The status should be 200 (Success)
  • Number of URLs blocked by robots.txt

The page also provides two text boxes for testing. The first box shows the content of your robots.txt file; in the second box you can enter the URLs and user agent to test against. Click the Test button at the end of the page to validate your robots.txt file. In case of any issues, you can edit the robots.txt content in the first box and re-test the changes. Once things are working fine, copy the contents back to the robots.txt file on your server.
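As an illustration, here is a minimal robots.txt that blocks a single directory and advertises the sitemap. The paths are placeholders for illustration, not recommendations for your site:

```
User-agent: *
Disallow: /wp-admin/

Sitemap: http://example.com/sitemap.xml
```

A stray `Disallow: /` here would block your whole site, which is exactly the kind of mistake the testing tool catches.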

Submitting your pages to Google

Google does take time to crawl and index your content after you submit a sitemap. You can speed up the process by submitting some of your pages manually. Google then indexes the page, and it can also index the pages linked from that page. To submit your pages to Google, use the Fetch as Google sub-option under the Crawl option.

Google Webmaster Fetch as Google tool

The process is very simple. Paste the link and click the Fetch option as shown in the above image. You should see Success as the status. If not, there is a problem with the web page you just fetched, and the report helps you find the potential issue. You have the following limits per month:

  • 500 Fetch as Google URL submissions
  • 10 URL plus linked pages submissions
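The "URL plus linked pages" option works because Google can discover every page linked from the fetched page. A rough sketch of that discovery step using only the Python standard library; the HTML snippet is made up for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags, similar in spirit to how
    Google discovers pages linked from a submitted URL."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

parser = LinkCollector()
parser.feed('<p><a href="/post-1">One</a> and <a href="/post-2">Two</a></p>')
print(parser.links)  # → ['/post-1', '/post-2']
```

This is why submitting a well-linked hub page is more effective than submitting an orphan page.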

Google Webmasters Crawl Errors & Crawl Stats option

Crawl Errors lists the errors Google found while crawling your website. You should try to minimize these errors: fewer crawl errors improve Google's experience while crawling your website and increase the chances of your content getting indexed faster. The crawl error section is divided into three parts: Desktop, Smartphone, and Feature Phone.
Crawl Stats gives an idea of the pages crawled per day. A steady number of pages crawled by Google is fine; if the number is decreasing over time, it is a worry. In that case, visit the crawl error section and the time-spent report to get an idea of the problem, and also check whether your sitemap has issues. The report also shows kilobytes downloaded per day and time spent downloading a page. Time spent downloading a page should be low; if it is high, your website is too slow, and slow websites do hamper Google's crawling process.

Google Webmasters Google Index option

Index Status

It gives the number of pages indexed over a period of time. The number should be close to the actual number of pages on your website. If it is decreasing on a constant basis, there is an issue, and you need to take corrective action as mentioned in other sections of this post.

Content Keywords

This section is very important. It lists the keywords Google found on your website. It does not list keywords found on any particular page; instead it lists keywords found across all pages of your website. So you should check this section to see whether the keywords you want to rank for are listed here or not.
You can also download the whole table to get an idea of the keywords your site presents to Google. You can use this list to optimize your website for your target keywords.
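Conceptually, the Content Keywords report is a frequency count of significant terms across your pages. A toy sketch of the idea; the stopword list is a made-up minimal one, not what Google actually uses:

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "and", "of", "to", "for"}  # illustrative only

def top_keywords(text, n=5):
    """Rough sketch of keyword extraction: count non-stopword terms."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

result = top_keywords("SEO tips and SEO tools for the modern SEO workflow")
print(result)  # first entry: ('seo', 3)
```

If your target keyword does not dominate a count like this across your own pages, it is unlikely to dominate the Content Keywords report either.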

Google Webmasters Search Appearance option

HTML Improvements

This section is important. It gives you an idea about Meta Description, Title tag, and non-indexable content issues.

Meta Description

A meta description should be at most 150 characters, and every page should have a unique one. Any issue with meta descriptions is reported here:

  • Duplicate Meta Description – Two pages have the same meta description.
  • Long Meta Description – The meta description is longer than the permissible length; try to shorten it. Search engines truncate longer meta descriptions when showing your content on the search engine results page (SERP), so you should keep your meta description within the permissible limit of 150 characters.
  • Short Meta Description – The meta description is too short; try to increase its length, but not beyond 150 characters. A short meta description means you are missing a chance to present your content. Meta descriptions, along with titles, help visitors decide which search result might answer their query.

Title Tag

Issues with title tags are reported in this section. A title should be around 55 characters; if titles are too long, search engines truncate them on the SERP. So you should try to create short but interesting titles. Searchers click on search results with catchy titles and interesting meta descriptions. This section also displays issues like duplicate titles on your website.
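The length rules above are easy to check yourself before publishing. A minimal sketch using the limits quoted in this post (150 characters for descriptions, 55 for titles); the lower bound for "short" is an assumption for illustration, not a Google-documented value:

```python
def check_length(text, max_len, min_len=0):
    """Classify text as 'long', 'short', or 'ok' against length limits."""
    if len(text) > max_len:
        return "long"
    if len(text) < min_len:
        return "short"
    return "ok"

def check_title(title):
    return check_length(title, max_len=55)

def check_meta_description(description):
    # min_len=50 is an assumed lower bound for illustration
    return check_length(description, max_len=150, min_len=50)

print(check_title("A" * 70))              # → long
print(check_meta_description("Too few"))  # → short
```

Running a check like this over your pages catches most of the issues before the HTML Improvements report ever flags them.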

Non Indexable Content

Google Webmaster HTML Improvements tool

Content on your website that Google has difficulty indexing is listed in this section. If there is any entry here, you must take corrective action.

Structured Data

This section lists the structured data used on your website. I would recommend using structured data on your website; if you are not aware of it, read this tutorial. You can also download commonly used tags like Image, Video, and Review from that post. After downloading, just paste the code into your content and it will work fine.
Structured data is easy to implement and has good benefits. This section reports the number of structured data items found on your website, and errors if any. Later in the post I will cover the tool which can be used to test tags before implementing them on your website.
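For context, a structured data tag is just extra markup describing your content. A minimal JSON-LD sketch of a schema.org Article tag, with all values as placeholders:

```json
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Google Search Console tutorial for SEO and traffic",
  "author": {
    "@type": "Person",
    "name": "Example Author"
  },
  "datePublished": "2015-01-01"
}
```

Tags like this help Google show rich results, which is why the report tracks them per page.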

Google Webmasters Search Traffic option

Search Queries

This section gives you an idea of your website's Google search rankings for various keywords. You can get the keyword list for Web Search, Image Search, Mobile Search, and Video Search, as well as a combined report of all of these. It also shows recent improvements or declines in your click-through rate and search rankings. Click-through rate (CTR) is the percentage of searchers who click your website's link on the search results page.
Ideally, both of these parameters (search ranking and click-through rate) should be improving. You can also see the keywords a particular post is ranking for and optimize the post for those keywords based on the results shown on this page.
Since this is very important and requires a more detailed explanation, I will write a separate post just on this topic and update the link to it here.

Links to Your Site

This section gives an idea of the sites linking to your website. It also lists the number of links each particular page on your website has. All links mentioned in this section are external links. Links do play an important role in ranking, and it is good to see the number of links increase over time.

Internal Links

This shows how content on your website is internally linked. The more links a piece of content receives, the more important it becomes in the eyes of search engines. You can use this tool to find pages with few internal links and add more. Internal linking should be fairly uniform: you should not over-link one piece of content or under-link another. There are exceptions, like your home page, which will have more internal links than any other page.
If some content on your website does not have enough internal links, consider adding links to it from similar articles. This also helps bring traffic to your old posts.
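Note that this report counts only internal links, while Links to Your Site counts external ones. The distinction is simply whether the link target lives on your own domain. A minimal sketch, with example.com standing in for your domain:

```python
from urllib.parse import urlparse

SITE_DOMAIN = "example.com"  # placeholder for your own domain

def is_internal(href):
    """Relative URLs and same-domain URLs count as internal links."""
    netloc = urlparse(href).netloc
    return netloc == "" or netloc == SITE_DOMAIN

links = ["/about", "http://example.com/post", "http://other.org/page"]
print(sum(is_internal(l) for l in links))  # → 2
```

Counting links this way across your pages quickly surfaces the under-linked posts this section talks about.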

Manual Actions

There are good people on the internet as well as bad ones, and some of them spam your website or try to. This section lists spam found on your website, if any. If something is listed here, you should take corrective action. It is best to have this section empty.


Google Webmasters Security Issues option

This section lists security issues with your website. If your website is hacked or infected with a virus, this section will list it, so you should check it regularly. Google also notifies you by mail if your website is compromised, as mentioned in the next section; this section gives more details about the issue.
If you have any issue listed here, immediately take down your website and then take corrective action to solve the issue. You should not stay online with a compromised website: it lessens the importance of your website in the eyes of search engines, and you could lose all your rankings as well. So it is better to take down your website and fix the issue.
If your website is penalized by Google for virus issues, you need to submit a reconsideration request to Google after taking corrective action.

Google Webmasters Site Messages option

Site Messages serves as a communication platform between you and Google, but it is one-way communication, not two-way. Google sends you updates if:

  • Your website is compromised
  • Recent changes were made to your site address or other site settings
  • Google has trouble indexing your website
  • Google makes any important change

A copy of the message is also sent to your email address, the one with which you created the Webmasters account.

Google Webmasters Structured Data Testing Tool

Remember, we discussed structured data a few paragraphs back. Structured data is becoming very important with time. There are many structured data vocabularies out there, and schema.org is the most accepted one (or is on its way to becoming the most accepted one). This tool is also used to test Google Plus author and publisher tags. You can use the Google Webmasters Structured Data Testing Tool in two ways:

  • Testing URL
  • Testing HTML

Testing URL

For example, suppose you have implemented tags on your website and published the post. After publishing, you can test the entire page to validate the tags and the Google Plus publisher or author markups. You just need to paste the URL of the page. The tool also shows how your website will look on the SERP (Search Engine Results Page).

Testing HTML

You can also validate your structured data HTML code before implementing it on your website. It is good practice to validate tags first: verify the code, then implement it on your website. You just need to paste the HTML code.


Google Webmasters is a great tool in the hands of webmasters, free of cost. If you are not using it, you should start without any delay; if you are using it, make sure you get the most out of the tools and reports Google provides. These free tools are extremely powerful. There are many paid solutions out there, but if you use these tools well, even to some extent, you do not need an SEO specialist to give your blog a boost in Google rankings.
You just have to write good original content and get the basics right as suggested by Google Webmasters or similar tools like Bing Webmaster or Yandex Webmaster. Doing these two things right will put your website in a good position.

Consider sharing this post in case you found it useful.

3 comments on "Google Search Console tutorial for SEO and traffic"

  1. Sometimes I receive a 404 error for a page that never existed, like index.asp.

    Then oftentimes the 404 I get is from pages like index.html and some old pages.

    But most of the errors are from pages that never existed.

    Have you experienced this?

    Thank You,

  2. You should keep your old website in Search Console along with the new site and have 301 redirects in place. For the sitemap, add the new one and keep the old one for some time. This helps Google better understand that the site has migrated.

  3. Very useful thank you.

    One thing I would add is that Soft 404s often show ecommerce categories with no products. This could be a good way to identify empty pages on a large site as well as giving an indication on if your site is close to the dreaded Panda.

    I found the point you made about old sitemaps. When doing a site migration I tend to just replace the sitemap URL. Do you suggest that I give the sitemap a different URL and let the old one 404 after the redirects have been crawled?
