Google Webmaster Tools is an invaluable resource for site owners and search professionals alike, providing all sorts of useful data and reports to help you improve visibility in Google’s search results.
Given that we are all striving to achieve results in an ever-changing search landscape, it surprises me that the vast majority of accounts I start work on have many issues which are straightforward to fix and which will ultimately make your site look better in the eyes of the big G.
We rarely get such detailed information ‘straight from the horse’s mouth’ so in this blog I want to get back to basics and look at how this data can be used to complete a site audit, helping to whip your site into shape.
How healthy is my website?
Looking at the crawl errors is a good place to start. Located in the ‘Health’ section you will find an overview of total errors spanning the past 3 months, as well as a breakdown of the different errors currently affecting your site – hopefully you won’t find too many!
Not Found errors are some of the most common, and this is particularly relevant for large ecommerce sites, which can easily accumulate large numbers of errors through simple oversights such as the handling of out-of-stock and discontinued products.
This is important as it directly affects user experience, so make sure you review the breakdown of affected URLs, make the necessary changes and ‘mark as fixed’.
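A common fix for Not Found errors on a discontinued product page is a 301 redirect to the closest relevant page. As a rough sketch, on an Apache server this might live in your .htaccess file (the URLs here are purely hypothetical examples):

```apache
# Permanently redirect a discontinued product to its replacement
Redirect 301 /products/old-widget /products/new-widget

# Or send a whole retired category to its parent category page
RedirectMatch 301 ^/products/discontinued/(.*)$ /products/
```

How you implement this will depend on your server and CMS, but the principle is the same: send users (and Googlebot) somewhere useful rather than a dead end.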
Can Google crawl my site?
The crawl stats located in the same section are also helpful as this gives you an idea of how effectively Googlebot can crawl your site. Any significant drop in the number of pages crawled daily for example indicates that there may be issues for you to address on-site.
Review your Index Status to see how many URLs are included in Google’s search index. As per the crawl stats, you can quickly establish if there are issues affecting the visibility of the site by viewing this data from the past 12 months. The advanced tab enables you to drill down to a greater extent; with data on the total number of URLs Google has ever crawled, as well as pages blocked by your robots.txt and/or those which have been removed.
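If the Index Status report shows pages blocked by robots.txt, it's worth double-checking that file only blocks what you intend. A minimal illustrative robots.txt might look like this (the paths and domain are hypothetical examples):

```text
User-agent: *
Disallow: /checkout/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

A single stray Disallow rule can remove whole sections of a site from the index, so any unexpected jump in blocked URLs in Webmaster Tools deserves a careful look at this file.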
Who links to my site?
Now there are plenty more advanced tools which you can use to analyse the links to your site, but the data located in the Traffic section of the site shouldn’t be overlooked.
I find the summary of information useful to quickly establish if there is anything to be concerned about as far as the backlink profile is concerned. Similarly, the Internal Links data is a simple yet effective way of identifying the pages with the most internal links and those which perhaps need a bit more love.
Is my site optimised for Google?
Unsurprisingly, some of the most important Webmaster Tools health-check areas can be found in the Optimisation section.
Sitemaps is where you should submit your XML sitemap directly to Google; once processed, the accompanying stats can be used to deal with any errors and to ensure that the URLs you have presented are accurate and complete. This is effectively the blueprint of your website, so it really matters.
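For reference, an XML sitemap follows the standard sitemaps.org protocol. A minimal example, with a hypothetical domain and URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-11-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2012-10-28</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required for each URL; the other tags are optional hints. Keep the file up to date and free of broken or redirecting URLs, as the errors Webmaster Tools reports against your sitemap are usually symptoms of exactly that.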
The HTML Improvements section offers more quick wins; highlighting any pages where title tags and meta descriptions are either missing, duplicated or fall outside the recommended character limits.
Of course, you can’t tell if the meta data that’s present is over-optimised, but once again this comes back to user experience and the quality of your site. Time should be invested in making sure this data is highly relevant to what your audience is actually searching for – don’t be a slave to the keywords YOU think are most important.
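As a quick reminder of what this meta data looks like in practice, here is a sketch of a well-formed page head (the page and wording are invented for illustration; aim for a title of roughly 70 characters or fewer and a description around 155):

```html
<head>
  <!-- Unique, descriptive title - what the page is actually about -->
  <title>Men's Leather Walking Boots | Example Outdoor Store</title>

  <!-- Unique description written for searchers, not keyword-stuffed -->
  <meta name="description"
        content="Waterproof leather walking boots for men, with free UK
                 delivery and a 30-day returns policy.">
</head>
```

Writing these for each important page, rather than letting a CMS auto-generate duplicates, clears most of what HTML Improvements will flag.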
This summarises just a fraction of the data at your disposal and the automated ‘Messages’ section will alert you to any serious issues or changes, but it’s always wise to keep a close eye on the latest site performance. And while you’re at it, Bing Webmaster Tools isn’t too shabby either!