Indexation report

The indexation report has two main purposes:

  • Check if your pages are indexable
  • Track if Google has crawled or indexed your pages (SiteGuru Insights only)

This article explains how you can use the indexation report to ensure your pages get indexed by search engines.

Checking the indexation status of your pages

To get your pages to the top of Google, you first need to make sure Google can crawl and index them. The Indexation report alerts you whenever a page cannot be crawled or indexed.

To check this, we look at the following settings:

  • The robots.txt file, which can block a page from being crawled by search engines
  • The robots meta tag, which can block a page from being indexed by search engines
  • The robots header (X-Robots-Tag), which can also block a page from being indexed
  • The canonical URL, which can instruct search engines to index another URL instead
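
To make these checks more concrete, here is a minimal Python sketch that inspects the same four signals for a single URL. The URL, user agent, and function name are placeholders, and the simplified regexes assume a typical attribute order; this is an illustration of the signals, not SiteGuru's implementation.

    # Illustrative check of the four indexability signals for one URL.
    from urllib import robotparser
    from urllib.parse import urljoin, urlparse
    import re
    import requests

    def check_indexability(url, user_agent="Googlebot"):
        parsed = urlparse(url)

        # 1. robots.txt: is crawling allowed at all?
        rp = robotparser.RobotFileParser()
        rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
        rp.read()
        crawl_allowed = rp.can_fetch(user_agent, url)

        response = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
        html = response.text

        # 2. Robots meta tag: <meta name="robots" content="noindex">
        #    (simplified regex; assumes name comes before content)
        meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
                         html, re.IGNORECASE)
        meta_noindex = bool(meta and "noindex" in meta.group(1).lower())

        # 3. X-Robots-Tag response header
        header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()

        # 4. Canonical URL: does it point search engines to a different URL?
        link = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
                         html, re.IGNORECASE)
        canonical = urljoin(url, link.group(1)) if link else url
        canonical_points_elsewhere = canonical.rstrip("/") != url.rstrip("/")

        return {
            "crawlable": crawl_allowed,
            "meta_noindex": meta_noindex,
            "header_noindex": header_noindex,
            "canonical_points_elsewhere": canonical_points_elsewhere,
        }

    print(check_indexability("https://www.example.com/some-page"))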

If a page has indexation issues, you'll see them in the indexation report.

If the page is indexable, it shows up with a green checkmark. If there are any issues, you'll see a warning sign with an explanation of the issue.

SiteGuru Indexation Report

Pages that should not be indexed

There may be certain pages that you don't want Google to pick up, for example a login page for website administrators. In that case, you can use a noindex robots meta tag or X-Robots-Tag header to stop the page from being indexed.
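
As an illustration, here is a minimal sketch of both options for an admin login page, using Flask purely as an example framework. The route and markup are placeholders, not anything SiteGuru generates.

    # Illustrative only: two ways to keep a login page out of the index.
    from flask import Flask, make_response

    app = Flask(__name__)

    LOGIN_HTML = """<!doctype html>
    <html>
      <head>
        <!-- Option 1: robots meta tag in the page itself -->
        <meta name="robots" content="noindex">
        <title>Admin login</title>
      </head>
      <body>Login form goes here</body>
    </html>"""

    @app.route("/admin/login")
    def admin_login():
        response = make_response(LOGIN_HTML)
        # Option 2: X-Robots-Tag response header (also works for PDFs and other non-HTML files)
        response.headers["X-Robots-Tag"] = "noindex"
        return response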

If you've verified that a page is intentionally set to noindex, you can click the Ignore Page link. This will remove that page from the report.

The page will also no longer be included in your SEO score, and won't show up in any of the other reports.

Google indexation tracking

If you have connected SiteGuru to a site's Google Search Console property, we can use the Page Inspect API (Google's URL Inspection API) to get the indexation status of your pages in Google.

The Page Inspect API is available to all paying customers, and to all users on the free trial.

If connected, the indexation report also tells you:

  • The current indexation status in Google
  • The date Googlebot last crawled the page
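
For reference, this is roughly what a call to that API looks like with Google's Python client (google-api-python-client). The credentials file, site URL, and page URL are placeholders, and the service account is assumed to have been granted access to the Search Console property; it's a sketch of the underlying API, not SiteGuru's own code.

    # Sketch of a URL Inspection API call with google-api-python-client.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

    # Assumes a service account that has been added to the Search Console property.
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)
    service = build("searchconsole", "v1", credentials=credentials)

    response = service.urlInspection().index().inspect(body={
        "siteUrl": "https://www.example.com/",
        "inspectionUrl": "https://www.example.com/blog/some-post",
    }).execute()

    status = response["inspectionResult"]["indexStatusResult"]
    print(status.get("coverageState"))   # e.g. "Submitted and indexed"
    print(status.get("lastCrawlTime"))   # when Googlebot last crawled the page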

SiteGuru Indexation report with Google Data

If pages are indexable but Google still isn't crawling or indexing them properly, it's time to do some research. Possible reasons include:

  • Google doesn't think the content is relevant
  • There are no internal or external links pointing to the page
  • The page is considered a duplicate

Often, the Google indexation status gives you a hint about what's wrong. All the reasons why a page may not be crawled or indexed are described in Google's documentation.

Limitations of the Page Inspect API

The Page Inspect API lets us retrieve indexation data for up to 200 pages per day.

If your site has more than 200 pages, we'll fetch the next 200 pages the following day. This may cause some delay in the status, although in our experience the Page Inspect API is updated much faster than the data you see in Google Search Console.
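
Conceptually, this works like a rolling batch. The sketch below shows one simple way to split a larger page list into daily chunks of 200; it is illustrative only, not SiteGuru's actual scheduler.

    # Illustrative only: rotate through a URL list at 200 inspections per day.
    DAILY_QUOTA = 200

    def daily_batches(urls, quota=DAILY_QUOTA):
        """Yield one batch of URLs per day until the whole list is covered."""
        for start in range(0, len(urls), quota):
            yield urls[start:start + quota]

    all_pages = [f"https://www.example.com/page-{i}" for i in range(1, 501)]
    for day, batch in enumerate(daily_batches(all_pages), start=1):
        print(f"Day {day}: inspect {len(batch)} pages")  # 200, 200, then 100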