Not everyone has a lot of time to spend on their SEO, so I’m putting together a series of blog posts with quick tips on how to spend 5 minutes productively on something that will assist your efforts. These might also start you off on an avenue of investigation, or help you find a problem with your site that might be limiting its ranking potential.
The first check I’m going to suggest is looking at your robots.txt file.
- Locate your file – it will be at http://www.yourdomain.co.uk/robots.txt. This is the standard location where search engine robots look for the file.
- Check whether the file redirects. Normally it shouldn’t redirect at all, but there are cases where it does, and you also need to check it complies with the rules you have set for the www and non-www versions of your site (the canonical version). We recently found this on a site, so it isn’t a completely way out scenario. There’s a sketch of this check in code after this list.
- Check the content of your file. In particular, make sure you don’t have a blanket Disallow: / in it, which blocks crawlers from your entire site. If you’ve got more complicated rules, you can use Google Webmaster Tools’ robots.txt checker to verify they are working as intended – or test them in code, as in the second sketch below.
- Check for a sitemap reference. It will look something like `Sitemap: http://www.yourdomain.co.uk/sitemap.xml` (although the file name and location can vary). This isn’t required, but it is a useful way to make sure your sitemap is being found and utilised, and it helps with crossing the t’s and dotting the i’s in SEO terms – you only need to do it once and it is sorted. For more on how sitemaps work, see this blog post. The last sketch below shows how to pull any sitemap references out automatically.
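If you’d like to automate the redirect check, here’s a minimal sketch using Python and the third-party `requests` library (`pip install requests`) – the domain is a placeholder for your own site:

```python
# Fetch robots.txt and report any redirect hops along the way.
# www.yourdomain.co.uk is a placeholder - swap in your own domain.
import requests

ROBOTS_URL = "http://www.yourdomain.co.uk/robots.txt"

response = requests.get(ROBOTS_URL, allow_redirects=True, timeout=10)

# response.history holds every redirect response followed before the final one.
if response.history:
    print(f"{ROBOTS_URL} redirected {len(response.history)} time(s):")
    for hop in response.history:
        print(f"  {hop.status_code} -> {hop.headers.get('Location')}")
    print(f"Final URL: {response.url}")
else:
    print(f"No redirect: robots.txt served directly from {ROBOTS_URL}")
```

If it does redirect, check the final URL matches your canonical www or non-www choice.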
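You can also test your Disallow rules against specific URLs with Python’s built-in urllib.robotparser – a minimal sketch, with placeholder paths you’d swap for your own important pages:

```python
# Parse the live robots.txt and check whether key pages are crawlable.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("http://www.yourdomain.co.uk/robots.txt")
parser.read()  # fetches and parses the live file

# A blanket "Disallow: /" makes every path come back as blocked here.
for path in ["/", "/products/", "/blog/"]:
    allowed = parser.can_fetch("Googlebot", "http://www.yourdomain.co.uk" + path)
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'} for Googlebot")
```

If your homepage comes back as BLOCKED, that’s the Disallow: / problem mentioned above.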
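Finally, to confirm a sitemap reference is present and parseable, the same parser exposes any Sitemap: lines – this sketch assumes Python 3.8+, where site_maps() was added:

```python
# List any Sitemap: references declared in robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("http://www.yourdomain.co.uk/robots.txt")
parser.read()

# site_maps() returns a list of declared sitemap URLs, or None if there are none.
sitemaps = parser.site_maps()
if sitemaps:
    for sitemap in sitemaps:
        print(f"Sitemap declared: {sitemap}")
else:
    print("No Sitemap: reference found in robots.txt")
```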
This is the first tip in the series – keep an eye out for more on the way!