
Making the Most out of Google Search Console: Part Two

If you’ve read Part One of the Vu Online guide to Google Search Console (GSC), you will know how this free tool can show you how your site appears in Google’s search results (via the ‘Search Appearance’ section) and how popular your site is with visitors and other sites (via the ‘Search Traffic’ section).

The second part of this article looks more closely at the indexing side of things. The remaining two sections of GSC will reveal information including how easy your site is to crawl and how many pages are being indexed. There are also ways to make the crawling process more efficient by, for example, blocking certain pages or changing the way Google deals with duplicate pages.

Google Index:

There is no point having the best content on the web if Google and the other search engines are unable to index your pages. This section of the GSC enables you to check and fine-tune your site’s indexing.

Index Status:

Here you can see, at a glance, how the number of indexed pages has changed over the past year. By selecting the advanced tab, you can also see how many pages have been blocked by your robots.txt file (more on that below) and how many pages have been removed from the results.

Although the actual number of pages indexed will naturally fluctuate, you should be alert to any unexplained drop in this graph.

Blocked Resources:

Even if Google’s bots can visit a page, they may not be able to index it properly if resources the page uses (images, scripts and so on) are blocked. This section alerts you to any blocked resources and links to advice on solving the issue.

Remove URLs:

If you want to stop Google from indexing a certain page for a limited time, this option allows you to temporarily remove a specific URL. This is a convenient way to fine-tune your site’s indexing without touching your website. However, if you want to permanently remove a page from search results you should go into your site and add a noindex directive instead, for example a <meta name="robots" content="noindex"> tag in the page’s head.
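If you would like to check whether a page is already carrying a noindex directive, the short Python sketch below looks for one in the X-Robots-Tag header and in the returned HTML. It uses only the standard library; the URL is a placeholder and the string search is deliberately rough, so treat it as a quick sanity check rather than a definitive test.

    # check_noindex.py - rough check for a noindex directive on a page.
    import urllib.request

    url = "https://www.example.com/private-page"  # placeholder - use one of your own pages

    with urllib.request.urlopen(url) as response:
        header = response.headers.get("X-Robots-Tag", "")  # header-level directive, if any
        html = response.read().decode("utf-8", errors="replace").lower()

    # A simple string search is enough for a quick check; a proper HTML
    # parser would be more robust on unusual markup.
    meta_noindex = 'name="robots"' in html and "noindex" in html

    if "noindex" in header.lower() or meta_noindex:
        print("A noindex directive was found - the page should drop out of the index.")
    else:
        print("No noindex directive found - the page remains indexable.")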

Crawl:

The fourth section of GSC takes a crawler’s eye view of your website, helping you to spot and react to any problems.

Crawl Errors:

If Google’s crawlers encounter any problems while trying to reach a page or your website as a whole, here is where you will find out about it.

Since the crawler will attempt to connect a number of times before giving up, the Site Errors metric is displayed as a line chart showing the percentage of failed attempts per day. If, for example, the crawler failed on one of five attempts to reach your site, this will appear as a 20% failure rate.

Below this graph is the URL (individual page) Errors metric, which includes a graph displaying the number of errors found and a table with further details. Here you can see which error codes were generated when the crawler tried to reach the page (e.g. a 404 error indicates a missing page, while a 500 error indicates an internal server error).
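If the report lists pages you don’t recognise, you can confirm what a crawler currently sees by requesting each address and noting the status code it returns. Below is a minimal Python sketch using only the standard library; the URLs are placeholders for your own pages.

    # status_check.py - report the HTTP status code returned by each URL,
    # mirroring the kinds of errors (404, 500 and so on) listed in Crawl Errors.
    import urllib.request
    import urllib.error

    urls = [
        "https://www.example.com/",
        "https://www.example.com/old-page",  # placeholders - use your own addresses
    ]

    for url in urls:
        try:
            with urllib.request.urlopen(url) as response:
                print(f"{url} -> {response.status}")
        except urllib.error.HTTPError as err:
            # 4xx and 5xx responses are raised as HTTPError by urllib.
            print(f"{url} -> {err.code}")
        except urllib.error.URLError as err:
            # DNS failures, timeouts and similar connection problems.
            print(f"{url} -> could not connect ({err.reason})")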

Crawl Stats:

This section includes graphs showing the number of pages crawled per day, the amount of data (in KB) downloaded and the time spent downloading each page. The figures are summarised on the right-hand side, showing the high, average and low values.

An increase in the number of pages visited can suggest your SEO strategy is bearing fruit.

Fetch as Google:

Do you want to measure how long it takes Google to reach a specific page, or to see how that page appears to the crawler compared with a regular visitor? This section allows you to enter a specific page address and to fetch and, optionally, render the page.

This is a good tool if you have been alerted to potential issues with a specific page.
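Fetch as Google does the fetching from Google’s side, but you can get a rough feel for how quickly a page responds by timing a request of your own. The Python sketch below sends a request with a Googlebot-style user agent and reports the status, size and timing; the URL is a placeholder, and a real Googlebot visit may of course behave differently.

    # fetch_timing.py - rough approximation of a fetch: request a page with a
    # Googlebot-style user agent and time how long the response takes.
    import time
    import urllib.request

    url = "https://www.example.com/landing-page"  # placeholder - use your own page
    request = urllib.request.Request(
        url,
        headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    )

    start = time.perf_counter()
    with urllib.request.urlopen(request) as response:
        status = response.status
        body = response.read()
    elapsed = time.perf_counter() - start

    print(f"Status: {status}")
    print(f"Downloaded {len(body) / 1024:.1f} KB in {elapsed:.2f} seconds")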

Robots.txt Tester:

Your robots.txt file gives search engine crawlers and other bots instructions about which areas of your website you don’t want to be crawled. This part of the GSC displays the contents of your existing robots.txt file and lets you enter a page address to see whether a specific robot is being blocked.
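You can run a similar check locally with Python’s built-in robotparser module, which reads your live robots.txt file and tests individual addresses against it. A minimal sketch, assuming placeholder URLs:

    # robots_check.py - test whether specific URLs are blocked for a given
    # crawler by the robots.txt rules your site currently publishes.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
    parser.read()  # downloads and parses the live robots.txt file

    for url in ["https://www.example.com/", "https://www.example.com/admin/"]:
        allowed = parser.can_fetch("Googlebot", url)
        print(f"{url} -> {'allowed' if allowed else 'blocked'} for Googlebot")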

Sitemaps:

Rather than rely on Google finding and indexing your webpages on its own, it is good practice to submit a sitemap file listing all of the pages you want to be indexed.

The bar chart in this section shows you how many pages were included in your sitemap and how many of those pages were subsequently indexed by Google.

By clicking on the sitemap link below, you can see any problem pages and take action to fix or remove them.
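Most content management systems will generate a sitemap for you, but if you ever need to build a simple one by hand, the Python sketch below writes a minimal sitemap.xml in the standard sitemaps.org format. The page addresses are placeholders for your own URLs.

    # make_sitemap.py - write a minimal sitemap.xml listing the pages you
    # want indexed, using the standard sitemaps.org format.
    import xml.etree.ElementTree as ET

    pages = [
        "https://www.example.com/",
        "https://www.example.com/about/",
        "https://www.example.com/contact/",  # placeholders - list your own pages
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
    print(f"Wrote sitemap.xml with {len(pages)} URLs")

Once the file is uploaded to your site, you can submit its address through this section of GSC.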

URL Parameters:

The final option in the ‘Crawl’ section reveals how Google’s crawler treats pages with URL parameters. When visitors take certain actions on your site (sorting or filtering a list, for example), parameters may be added to the address, and each variation can look like a duplicate page. Google knows to cluster these variations and choose one main page for indexing. In most cases Google will make the right decision, but you can override the default behaviour on this page.

If you think you need to change this behaviour it is probably best to speak to an expert as it is easy to accidentally render your pages invisible.
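To see why parameters create apparent duplicates, consider how the same page can be reached under several addresses. The Python sketch below shows one simple way of collapsing them by stripping parameters that don’t change the content; which parameters are safe to ignore depends entirely on your own site, so the ‘sort’ and ‘sessionid’ names here are purely illustrative.

    # normalise_urls.py - illustrate how parameterised addresses collapse to a
    # single canonical page once presentation-only parameters are stripped.
    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    # Assumed for this example: these parameters don't change the page content.
    IGNORED_PARAMS = {"sort", "sessionid"}

    def canonical(url):
        parts = urlparse(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
        return urlunparse(parts._replace(query=urlencode(kept)))

    duplicates = [
        "https://www.example.com/shop?sort=price",
        "https://www.example.com/shop?sort=name&sessionid=abc123",
        "https://www.example.com/shop",
    ]

    for url in duplicates:
        print(f"{url} -> {canonical(url)}")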

Other Functions

Before we leave Google Search Console, there are two final sections to explore:

Security Issues:

This section will normally be blank, but if Google detects that your website has been hacked or otherwise compromised it will list the issues here along with advice on fixing them.

Web Tools:

This section provides links to various other Google tools including an ad experience tool, testing tools and the Google Analytics platform.

Hopefully you now have a better understanding of how Google’s free tools can give you insight into your website’s performance, but if you need further advice, please don’t hesitate to contact us.
