Google Webmaster Tools (GWMT) contains a lot of useful information which can help webmasters and SEOs alike to review and improve their sites. In this blog I’m going to highlight some of the more useful reports in GWMT and discuss how these can be used.
Crawl Errors
The Crawl Errors report shows you any issues that Google has found on your site, including Server Errors, Access Denied errors and Not Found errors. These errors come both from links within your own site and from external links pointing to it. This report can help you correct any issues on your website and can also be used to approach other websites which have broken links pointing to your site – with the aim of getting these fixed too.
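One common way to deal with Not Found errors caused by external sites linking to old URLs is a 301 redirect to the page’s new location. As a sketch, on an Apache server (the paths here are invented for illustration) you might add a line like this to your .htaccess file:

```apache
# Hypothetical example: permanently redirect an old URL that external
# sites still link to, so visitors and Googlebot reach the new page.
Redirect 301 /old-page.html http://www.example.com/new-page/
```

A 301 tells Google the move is permanent, so the old URL’s link value should pass to the new page.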
Blocked URLs
This report shows you your robots.txt file and allows you to check that it is working correctly. If you are planning any major changes to your robots.txt file it’s a good idea to test them using this report: you can copy and paste your new robots.txt rules into this screen and then test specific URLs against your new code.
This allows you to make sure you are blocking the right things and not stopping Google from seeing important content on your site.
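As a sketch, a simple robots.txt (the paths here are invented for illustration) might look like this:

```text
# Block crawling of internal search results and a private folder,
# but leave the rest of the site open to all crawlers.
User-agent: *
Disallow: /search/
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml
```

Pasting rules like these into the report and then testing URLs such as /search/results?q=widgets against them helps confirm that only the intended pages are blocked.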
Fetch as Googlebot
Once you have fetched a page and checked that Googlebot can see it correctly, you can submit the page to Google’s index from this report. You can fetch up to 500 pages and submit ten pages to Google’s index using this tool.
Malware
If you have received a malware warning from Google you can view the details in this report, which can help you to resolve the issues and request a malware review once your site clean-up has been completed.
Search Queries
This report shows you the search terms your site is showing up for and those which are getting clicks from the search results. You can also see the average position of your keywords in this report, along with the impression and click-through data for specific pages on your site.
This data can be useful for identifying changes which need to be made to your site’s content, on-page optimisation or off-page work, to help improve rankings, impressions and click-through rate for both pages and keywords.
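As a rough sketch of how you might work with this data once exported, the snippet below (with made-up queries and numbers) flags queries that get plenty of impressions but a poor click-through rate, which are often good candidates for improved titles and meta descriptions:

```python
# Hypothetical example: rows exported from the Search Queries report
# as (query, impressions, clicks) tuples.
rows = [
    ("blue widgets", 1200, 96),
    ("buy widgets online", 800, 12),
    ("widget repair", 150, 30),
]

def ctr(clicks, impressions):
    """Return click-through rate as a percentage, guarding against zero impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# Flag queries with at least 500 impressions but a CTR under 5% --
# candidates for rewriting the page's title tag and meta description.
low_ctr = [q for q, imp, clk in rows if imp >= 500 and ctr(clk, imp) < 5]
print(low_ctr)  # ['buy widgets online']
```

The thresholds here are arbitrary; the point is simply that impressions and clicks together tell you more than rankings alone.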
Links to Your Site
Although this isn’t a full list of links to your website, this report shows you the biggest sites linking to you and can be used to keep an eye on the quality of these links. If you have received an unnatural links warning, you can also use this report in conjunction with the backlink data from other tools, such as Open Site Explorer, Majestic SEO and Ahrefs.
Sitemaps
This report allows you to submit your sitemap.xml files to Google to help pages get indexed. You can submit as many sitemaps as you need in this section, so if you have a large ecommerce site with different sitemaps for different product types you can submit all of these to Google. You can also include your Google News sitemap or image sitemap in this report.
When viewing the sitemap data you can see how many of the pages in your sitemap have been indexed, compared to the total number submitted. You can also see any issues in the sitemaps here, such as pages which are included but blocked by the robots.txt file, broken links and URLs which redirect.
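As a sketch, a minimal sitemap.xml (with a made-up URL) follows the standard sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl -->
  <url>
    <loc>http://www.example.com/widgets/blue-widget/</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Pages listed here should be indexable – if a URL in the sitemap is blocked by robots.txt or redirects elsewhere, it will show up as an issue in this report.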
Remove URLs
Using this report you can request the removal of URLs on your website which are indexed when you don’t want them to be. To use this you need to have your robots.txt file set up to block these pages. Removal requests like this can be helpful if pages blocked in your robots.txt file are indexed and show a message in the SERPs which reads “A description for this result is not available because of this site’s robots.txt”.
Removal requests of this nature take a couple of hours to be processed and you can check the status of this using the report, which allows you to see pending, denied and removed URLs.
HTML Improvements
This report shows you pages on your site which have duplicate or missing title tags and meta descriptions. All pages on the site should have unique title tags and meta descriptions, so this report can help you to spot any issues with pages quickly.
I’ve also found that this report is useful for spotting duplicate content issues on a site caused by things like search result sort orders, printer friendly pages or duplicates caused by your CMS showing /index pages on root folders. This can be useful for making sure you are blocking the right pages in your robots.txt file.
This report also highlights any pages on your site with non-indexable content, so this is another way you can sort out issues with this kind of content and get all your content shown in Google’s index.
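For example, each page’s head section should carry its own unique title tag and meta description (the wording below is invented for illustration):

```html
<head>
  <!-- A unique, descriptive title and meta description for this page only -->
  <title>Blue Widgets | Example Widget Shop</title>
  <meta name="description"
        content="Browse our range of blue widgets, with free UK delivery on orders over £50.">
</head>
```

Pages sharing the same title or description across a site are exactly what this report is designed to surface.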
Structured Data
This report allows you to see the pages on your site which are using structured data markup to highlight things like addresses, reviews or products. This can be useful for making sure you are using the markup correctly and on all the pages needed. Each type of structured data is displayed as a group, and you can click on a type to see a list of URLs which are using that markup.
Using structured data correctly can help your listings display well in the SERPs and highlight your reviews and other content to potential visitors.
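As a sketch, schema.org microdata for a product with review ratings (the names and values here are made up) might look like this:

```html
<!-- Hypothetical schema.org Product markup with an aggregate rating -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Blue Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of 5,
    based on <span itemprop="reviewCount">27</span> reviews.
  </div>
</div>
```

Pages marked up like this would appear under the Product group in the report, so you can check every product page is carrying the markup.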