The URL Inspection tool lets you see how Google views the pages on your website and monitor the effect of technical changes as you make them.
Use the URL Inspection Tool
The first thing you can do with the URL Inspection tool is check whether individual pages are indexed by Google. Simply enter the URL in the Search Console inspection tool and it will tell you the status of your page.
Go to Google Search Console and type the URL that you want to inspect.
Let’s discover all of the options available with the URL Inspection tool:
Check Indexation Status
If it says:
- URL is on Google: page is indexed.
- URL is not on Google: page is not indexed.
- URL is not on Google: <error-status>: page is not indexed because of an issue.
- URL is on Google, but has issues: page is valid and indexed but has a warning.
Look at the Coverage block to see details about the indexing report.
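The same statuses can also be checked programmatically through Search Console’s URL Inspection API (the `urlInspection/index:inspect` endpoint). Below is a minimal sketch of interpreting such a response; the `index_status` helper and the `sample` payload are illustrative stand-ins shaped like the API’s documented `indexStatusResult`, not a real API reply:

```python
# Sketch: interpreting a URL Inspection API response.
# Assumes a response shaped like the documented indexStatusResult
# (verdict + coverageState); the sample payload is made up.

def index_status(response: dict) -> str:
    """Map an inspection response to a human-readable indexation status."""
    result = response["inspectionResult"]["indexStatusResult"]
    verdict = result.get("verdict", "VERDICT_UNSPECIFIED")
    coverage = result.get("coverageState", "")
    if verdict == "PASS":
        return f"URL is on Google ({coverage})"
    if verdict == "NEUTRAL":
        return f"URL is not on Google ({coverage})"
    return f"URL has issues ({coverage})"

sample = {  # illustrative payload, not a real reply
    "inspectionResult": {
        "indexStatusResult": {
            "verdict": "PASS",
            "coverageState": "Submitted and indexed",
        }
    }
}
print(index_status(sample))  # URL is on Google (Submitted and indexed)
```

Calling the real endpoint requires an authenticated Search Console API client; the mapping above only shows how the verdicts line up with the statuses listed earlier.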
Test Live URL
The Test Live URL feature in GSC is very useful to fetch the page in real time and see exactly what Google sees when it crawls it.

It can be used to:

- Diagnose rendering issues (although Martin Splitt suggests using the Rich Results Test for that purpose);
- Check the indexability status of a URL at a given time;
- Check page loading issues;
- And more…
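One indexability signal a live fetch surfaces is the HTTP response itself: an `X-Robots-Tag: noindex` header blocks indexing just like a meta tag. As a rough local check (not a substitute for the live test), here is a sketch, with `headers` standing in for a response’s header map:

```python
def blocked_by_header(headers: dict) -> bool:
    """Return True if an X-Robots-Tag header carries a noindex directive."""
    value = headers.get("X-Robots-Tag", "")
    directives = [d.strip().lower() for d in value.split(",")]
    # "none" is shorthand for noindex, nofollow
    return "noindex" in directives or "none" in directives

print(blocked_by_header({"X-Robots-Tag": "noindex, nofollow"}))  # True
print(blocked_by_header({"X-Robots-Tag": "noarchive"}))          # False
```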
View crawled page
You also have the option to see how Google crawled the page. You can switch between the rendered HTML, a screenshot of the page, and other relevant information such as the content type, the HTTP response, and the page resources.
A great option is to ask Google to index your page. To do so, search for the URL in the URL Inspection tool and click “Request Indexing”.
Coverage of the URL
Right after the inspection results, there’s a section called “Coverage” where you can see the status of the page (whether it’s properly crawled and indexed), among other things.
It reports on:
- Discovery: How the URL was discovered
- Crawl: Crawl stats about the last time Googlebot crawled the page
- Indexing: which canonical URL Google chose for this page
The Sitemaps section is straightforward: it simply shows which XML sitemap(s) list the URL you just searched.
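If you want to double-check outside GSC that a URL really appears in your XML sitemap, the standard library is enough. A small sketch; the sitemap string here is a made-up example:

```python
import xml.etree.ElementTree as ET

# Namespace used by standard XML sitemaps
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(xml_text: str) -> set:
    """Extract all <loc> values from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print("https://example.com/about" in urls_in_sitemap(sitemap))  # True
```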
The Referring page section right below Sitemaps gives you more insight into how Google’s bots crawl and index your website: it tells you which page Google might have used to discover the URL you just searched. Note that you may get a message saying that Google might have found your page through means other than a sitemap or a referring page, but is unable to display that information at the moment.
- Last crawl: the last time this individual URL was crawled by Google.
- Crawled as: the type of crawl robot that crawled this URL.
- Crawl allowed?: whether your robots.txt file allows the Google bots to crawl this URL.
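You can reproduce this robots.txt check locally with Python’s built-in `urllib.robotparser`; the robots.txt content below is a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt for illustration.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

Keep in mind this only mirrors the standard robots exclusion rules; GSC’s report reflects what Googlebot actually saw when it crawled.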
- Page fetch: whether Google was able to get the page from your server.
- Indexing allowed?: whether your page allows indexing in Google search results. If your page is blocked by robots.txt, this is reported as “Yes” because Google can’t crawl the page to see a noindex directive.
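The noindex directive usually lives in a robots meta tag. A quick way to spot it in a page’s HTML, using only the standard library (the HTML snippet is a made-up example):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Detect a noindex directive in <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            directives = (attrs.get("content") or "").lower()
            # "none" is shorthand for noindex, nofollow
            if "noindex" in directives or "none" in directives:
                self.noindex = True

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = MetaRobotsParser()
parser.feed(html)
print(parser.noindex)  # True
```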
- User-declared canonical: the URL that the page itself declares as canonical.
- Google-selected canonical: the page that Google selected as the canonical (authoritative) URL when it found similar or duplicate pages on your site.
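The user-declared canonical typically comes from a `<link rel="canonical">` tag in the page’s head. A small sketch for extracting it, again with a made-up HTML snippet:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Find the href of a <link rel="canonical"> tag, if any."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

html = ('<html><head>'
        '<link rel="canonical" href="https://example.com/original-post">'
        '</head></html>')
parser = CanonicalParser()
parser.feed(html)
print(parser.canonical)  # https://example.com/original-post
```

Note that this only reads the declared canonical; Google may still select a different canonical, which is exactly what the two fields in this report distinguish.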
To finish this section, there’s a part called Enhancements that shows the enhancements Google detected on the page. There are many enhancement types available, and you can check them in the Search Console documentation.
Other Articles in the Series on Google Search Console
- How to validate your site in GSC
- How to use the Performance Report in Google Search Console
- How to use the index coverage report
- How to use the URL Inspection Tool
- How to use the Crawl Stats Report
- How to use (and not use) URL Removal tool in GSC (Coming soon…)
- How to use URL Parameters Tools Without Killing Your Site (Coming soon…)
- Google Search Console Use Cases (Coming soon…)
- Regular Expressions (RegEx) in Google Search Console
Now that you know all of the options available in the URL Inspection tool, you can make sure your most important URLs are indexed, that your crawl budget is well spent, and stay aware of possible enhancements.
Tech marketer, SEO, and remote work advocate, Gab is the CMO at V2 Cloud. Living in Quebec City, he’s a big fan of hiking, traveling, gins, and telling bad jokes.