Not everyone has a lot of time to spend on their SEO, so I’m putting together a series of blog posts with quick tips on how to spend five minutes productively looking at something that will assist your efforts. These checks might also start you off on an avenue of investigation, or help you find a problem with your site that might be limiting its ranking potential.
The first check I’m going to suggest is looking at your robots.txt file.
- Locate your file – it will be found at http://www.yourdomain.co.uk/robots.txt. This is the standard location that search engine robots look at to find the file.
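Because the location is fixed, you can derive the robots.txt URL for any page on a site mechanically. Here is a minimal sketch using Python’s standard library – the domain is just the placeholder from the example above, and `robots_url` is a hypothetical helper name:

```python
from urllib.parse import urlparse, urlunparse

def robots_url(page_url):
    """Return the standard robots.txt location for the site a URL belongs to."""
    parts = urlparse(page_url)
    # robots.txt always lives at the root of the host, regardless of the page path
    return urlunparse((parts.scheme, parts.netloc, "/robots.txt", "", "", ""))

print(robots_url("http://www.yourdomain.co.uk/some/deep/page.html"))
# → http://www.yourdomain.co.uk/robots.txt
```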
- Check for redirection. Normally the file shouldn’t redirect at all, but there are cases where it does, and you also need to check it complies with the rules you have set for the www and non-www versions of your site (the canonical version). We recently found this on a site, so it isn’t a completely way-out scenario.
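One way to sketch the www/non-www part of this check: fetch robots.txt with your usual HTTP client, note the final URL after any redirects, and confirm it sits on the host you have chosen as canonical. The helper below is hypothetical – it only does the comparison step, and assumes you supply the post-redirect URL yourself:

```python
from urllib.parse import urlparse

def matches_canonical(final_url, canonical_host):
    """Check that the URL a robots.txt request ends up at (after any
    redirects) is on the canonical host chosen for the site."""
    return urlparse(final_url).netloc.lower() == canonical_host.lower()

# If www.yourdomain.co.uk is your canonical version, a robots.txt request
# should not end up on the bare domain:
print(matches_canonical("http://www.yourdomain.co.uk/robots.txt", "www.yourdomain.co.uk"))  # True
print(matches_canonical("http://yourdomain.co.uk/robots.txt", "www.yourdomain.co.uk"))      # False
```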
- Check the content of your file. Specifically, check that you don’t have a Disallow: / in it. If you’ve got more complicated rules, you can use the robots.txt checker in Google Webmaster Tools to confirm the rules are working. You can add a number of URLs to this tool to check against, so you can see whether it’s behaving correctly for different pages. If there is an issue reaching or reading your robots.txt file, the tool will also show this, so you can resolve any redirect or formatting problems stopping the file from being read. While reviewing this section, make sure all the areas of your site you don’t want Google to see are blocked, BUT that you’ve not blocked anything really important from being seen. Many times we’ve seen people accidentally block their whole site in robots.txt because they didn’t run these checks before the file went live.
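You can run the same kind of spot-check locally with Python’s standard-library robots.txt parser. The rules and URLs below are made up for illustration – substitute the actual contents of your file and pages you care about:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules – replace with the real contents of your robots.txt
rules = """
User-agent: *
Disallow: /admin/
Disallow: /basket/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Spot-check a few URLs, much as the GWT robots.txt tester does
for url in ("http://www.yourdomain.co.uk/",
            "http://www.yourdomain.co.uk/products/widget",
            "http://www.yourdomain.co.uk/admin/login"):
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")

# The dangerous case: a single "Disallow: /" blocks the entire site
blocked_all = RobotFileParser()
blocked_all.parse(["User-agent: *", "Disallow: /"])
print(blocked_all.can_fetch("*", "http://www.yourdomain.co.uk/"))  # False
```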
- Check for a sitemap reference. It will look something like Sitemap: http://www.yourdomain.co.uk/sitemap.xml (although the file name and location can vary). This isn’t required, but it is a useful step to make sure your sitemap is being found and utilised. It’s also a case of crossing the t’s and dotting the i’s in SEO terms – you only need to do it once and it is sorted. Don’t forget to submit your sitemap.xml file to Google using GWT as well though, as this can also help in spotting issues with the content of your sitemap file. If you have more than one sitemap file (because you have lots of products, a video sitemap or something similar), you can add a reference to each of these files in the robots.txt file.
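The same standard-library parser can list every sitemap referenced in the file (Python 3.8+), which is a quick way to confirm the references are present and readable. The file contents and URLs here are again made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents with two sitemap references
rules = """
User-agent: *
Disallow: /admin/

Sitemap: http://www.yourdomain.co.uk/sitemap.xml
Sitemap: http://www.yourdomain.co.uk/video-sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)
print(parser.site_maps())
# → ['http://www.yourdomain.co.uk/sitemap.xml', 'http://www.yourdomain.co.uk/video-sitemap.xml']
```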
This is the first tip in the series – keep your eye out for more on the way! Although this one is quite technical, making sure you or your developers have run these checks can save you a lot of headaches, without costing much in terms of time or effort.