
SEO Tip: Review your site in Google Webmaster Tools

Wednesday, June 26th, 2013 by Emily Mace

Google Webmaster Tools (GWMT) contains a lot of useful information which can help webmasters and SEOs alike review and improve their sites. In this blog I’m going to highlight some of the more useful reports in GWMT and discuss how they can be used.

Health Menu

Crawl Errors

The Crawl Errors report shows you any issues that Google has found on your site, including Server Errors, Access Denied Errors and Not Found Errors. These errors can come from links within your own site as well as from external links pointing to it. This report can help you correct issues on your website and can also be used to approach other websites that have broken links pointing to your site – with the aim of getting those fixed too.

Blocked URLs

This report shows you your robots.txt file and lets you check that it is working correctly. If you are planning any major changes to your robots.txt file, it’s a good idea to test them using this report: you can paste your new robots.txt file into this screen and then test specific URLs against your new code.

This allows you to make sure you are blocking the right things and not stopping Google from seeing important content on your site.
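As a hypothetical example, the rules below block a site’s internal search results and printer-friendly pages while leaving everything else crawlable – the paths are illustrative, so substitute your own before testing:

    # Hypothetical robots.txt – substitute your own paths before testing
    User-agent: *
    Disallow: /search/    # block internal search result pages
    Disallow: /print/     # block printer-friendly duplicates
    Sitemap: http://www.example.com/sitemap.xml

Pasting rules like these into the Blocked URLs report and testing a handful of real URLs against them is a quick way to catch an over-broad Disallow before it goes live.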

Fetch as Googlebot

This report can help you check that the code on a specific page of your website works and that the content can be read by Google. This can be useful if you have content in a tabbed format, an expanding content block or a scroll box – allaying any fears about the search engine being able to read it. You can also use this to make sure that Google can read all of your code. For example, if your site has a JavaScript menu you can check whether Google will be able to crawl it.

You can also use this report to check whether things like on-page JavaScript or ViewState code are making your HTML pages too long for Google to read – Google only reads the first 100k of code on a page.
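If you want a rough check of page weight outside GWMT, you can measure the size of the raw HTML from the command line. This is a minimal sketch assuming a Unix-like shell, with example.com standing in for your own URL:

    # Fetch the page and count the bytes of HTML returned
    curl -s http://www.example.com/ | wc -c

If the number comes back well over 102,400 bytes (100k), it is worth looking at what can be trimmed or moved into external files.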

Once you have checked how Googlebot sees a page, you can submit that page to Google’s index from this report. You are entitled to 499 fetches, and can submit ten pages to Google’s index using this tool.

Malware

If you have received a malware report from Google you can view details in this report, which can help you to resolve issues and request a malware review once your site clean-up has been completed.

Traffic Menu

Search Queries

This report shows you the search terms your site is showing up for and those which are getting clicks from the search results. You can also see the average position of your keywords in this report, along with impression and click-through data for specific pages on your site.

This data can be useful for identifying changes which need to be made to your site in terms of content, optimisation or off-page work to help with rankings, impressions and click-through rate – for both pages and keywords.

Links to Your Site

Although this isn’t a full list of links to your website, this report shows you the biggest sites linking to you and can be used to keep an eye on the quality of those links. If you have received an unnatural links warning, you can also use this report in conjunction with backlink data from other tools, such as Open Site Explorer, Majestic SEO and Ahrefs.

Optimization Menu

Sitemaps

This report allows you to submit your sitemap.xml files to Google to help pages get indexed. You can submit as many sitemaps as you have in this section, so if you run a large ecommerce site with different sitemaps for different product types you can submit all of them to Google. You can also include your Google News sitemap or image sitemap in this report.

When viewing the sitemap data you can see how many of the pages from your sitemap have been indexed compared to the total number of pages in your sitemap. You can also see any issues with the sitemaps here, such as pages which are included but blocked by robots.txt, broken links and URLs which redirect.
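As a minimal sketch of the format, a sitemap.xml file looks like the following – the URL and date are placeholders for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/products/blue-widget</loc>
        <lastmod>2013-06-01</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Each additional page gets its own <url> entry, and the <lastmod> and <changefreq> fields are optional hints rather than directives.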

Remove URLs

Using this report you can request the removal of URLs on your website which have been indexed when you don’t want them to be. To use it, you need to have your robots.txt file set up to block these pages. Removal requests like this can be helpful if pages blocked in your robots.txt file are indexed and show a message in the SERPs which reads “A description for this result is not available because of this site’s robots.txt”.
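For instance, before requesting removal you would first block the page in robots.txt – the path below is purely hypothetical:

    User-agent: *
    Disallow: /old-promotion.html    # hypothetical page you want removed from the index

With the block in place, the matching removal request can then be submitted through this report.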

Removal requests of this nature take a couple of hours to be processed and you can check the status of this using the report, which allows you to see pending, denied and removed URLs.

HTML Improvements

This report shows you pages on your site which have duplicate or missing title tags and meta descriptions, which can help you monitor your site. Every page on the site should have a unique title tag and meta description, so this report can help you spot any problem pages.

I’ve also found that this report is useful for spotting duplicate content issues on a site caused by things like search result sort orders, printer-friendly pages or duplicates caused by your CMS showing /index pages on root folders. This can be useful for making sure you are blocking the right pages in your robots.txt file.
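As an illustration, if your CMS appends a sort parameter to category pages, rules like these would keep those duplicates out of the index – both the parameter name and the path are hypothetical:

    User-agent: *
    Disallow: /*?sort=    # hypothetical sort-order parameter creating duplicate pages
    Disallow: /print/     # hypothetical printer-friendly versions

Googlebot supports the * wildcard in Disallow rules, which makes parameter-based duplicates straightforward to block.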

This report also highlights any pages on your site which are non-indexable, so this is another way you can sort out issues with this kind of content and get all of your content shown in Google’s index.

Structured Data

This report allows you to see the pages on your site which are using structured data markup to highlight things like addresses, reviews or products. This can be useful for making sure you are using the system correctly and on all the pages that need it. Each type of structured data is displayed in a group, and you can click on a type to see a list of URLs which use that markup.
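As a sketch of what this markup looks like, the snippet below marks up a review using schema.org microdata – the product name, rating and author are placeholders:

    <!-- Hypothetical review marked up with schema.org microdata -->
    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="name">Blue Widget review</span>
      <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
        Rated <span itemprop="ratingValue">4</span> out of
        <span itemprop="bestRating">5</span>
      </div>
      by <span itemprop="author">A. Reviewer</span>
    </div>

Once Google has crawled pages carrying markup like this, they should appear under the Review group in the Structured Data report.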

Using structured data correctly can be useful for getting your site to display correctly in the SERPs and to highlight to potential visitors your reviews and other content.


About the author

Emily Mace

Emily joined Vertical Leap in 2008 and is now the Senior SEO Campaign Delivery Manager. Emily previously worked in training, IT support and website development as well as SEO, and has worked for local government departments and Tourism South East. Emily gained the Google Analytics Individual Qualification in 2011, and regularly blogs on the technical aspects of SEO, sharing her expertise with our readers.