If I were Google – a logical approach to SEO spam

Tuesday, November 20th, 2012 by Steve Masters

In the past year, many SEO professionals have been put on the back foot as Google has ripped up the rule book on what constitutes SEO spam. The first big hit was the attack on thin content – content created just to get high rankings in Google in order to earn affiliate revenue from Google’s ad network. Then Google killed off some blog and link networks by removing them from the index. Panda sought to promote quality content; Penguin sought to de-rank sites with bad link practices.

The net result is that SEO can no longer be about getting as many links as possible on as many websites as possible; nor can it be about writing spurious content on spurious sites in the hope that the links will be counted. And even with what we have learned since Panda and Penguin, we don’t know what other currently legitimate practices Google might soon try to block in its algorithm.

Unless Google tells us exactly how it works and what it’s planning (which is about as likely as Boris Johnson marrying Angela Merkel), we have to work on anecdotal evidence, shared experiences and sensible assumptions. Fortunately, we can also apply logic. The logic I apply to Google is, “What would I do if I were Google?” Here is part of my list.

  • Set a benchmark for the number of links any website is likely to gain based on its size. If a website has only ten pages but thousands of backlinks pointing to it, treat that as suspicious (see the sketch after this list).
  • For any backlink, consider how popular the website is that contains the link. If a directory is linking out to lots of sites but there are no links pointing to that directory, consider those links to be worthless.
  • Ignore the ‘nofollow’ instructions on all websites and use my own algorithm to determine whether a link is worth counting or not.
  • Consider how many links a website is achieving over time. If its volume of backlink building outstrips the benchmark for other websites of its size, treat that as suspicious.
  • Consider the quality of any article that links to another website – its author influence, domain authority and page popularity – and use that, along with the relevance between the two pages, to judge how valuable the link is.
  • Assign less value to links in comments and social media than to links within quality articles on authoritative sites. A newspaper feature, for example, has to pass various editorial tests, whereas a comment is very likely to be posted by a marketing person.
  • Devalue any links on sites that are clearly related to the site being linked to. Also devalue any links posted by people on social media who are the authors of the content.
  • Not worry about duplicate content – it’s bound to happen naturally – but I would show only the best version of that content based on all my other criteria.
  • De-rank sites that clearly exist only to make money from Adsense revenue, where the content lacks genuine human demand or interaction.
  • De-list any site that exists only for SEO and Adsense-revenue generating purposes where there are no links in, no human interaction and no author authority.
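
To make a few of these heuristics concrete, here is a minimal sketch of how a search engine might encode the size benchmark, the link-velocity check and a per-link value score. Every function name, threshold and weight below is an illustrative assumption of mine, not anything Google has published.

```python
# Illustrative sketch only: the heuristics, thresholds and weights here are
# assumptions for the sake of argument, not anything Google has disclosed.

def suspicious_link_volume(page_count, backlink_count, benchmark_links_per_page=10):
    """Flag a site whose backlink count far outstrips what a site of its
    size would normally attract (e.g. ten pages, thousands of backlinks)."""
    expected = page_count * benchmark_links_per_page
    return backlink_count > expected * 5          # '5x the benchmark' is arbitrary


def suspicious_link_velocity(new_links_per_month, benchmark_per_month):
    """Flag a site gaining links much faster than comparable sites of its size."""
    return new_links_per_month > benchmark_per_month * 3   # arbitrary multiplier


def link_value(domain_authority, page_popularity, author_influence, relevance,
               in_comment_or_social=False, related_site=False):
    """Score one backlink, regardless of any nofollow flag on the page.

    All inputs are assumed to be normalised to the range 0..1 upstream.
    """
    score = (0.4 * domain_authority
             + 0.3 * page_popularity
             + 0.3 * author_influence) * relevance
    if in_comment_or_social:
        score *= 0.2    # comments and social links count for far less
    if related_site:
        score *= 0.0    # links between clearly related sites are ignored
    return score


if __name__ == "__main__":
    # A ten-page site with 4,000 backlinks trips the size benchmark.
    print(suspicious_link_volume(page_count=10, backlink_count=4000))        # True
    # An editorial link on an authoritative site scores far better than a comment.
    print(link_value(0.8, 0.7, 0.9, relevance=0.9))                          # ~0.72
    print(link_value(0.8, 0.7, 0.9, relevance=0.9, in_comment_or_social=True))  # ~0.14
```

The exact numbers are beside the point; what matters is the shape of the logic – the further a backlink profile drifts from what a site of that size would naturally earn, and the weaker the quality signals around each link, the less those links should count.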


About the author

Steve Masters

Steve is Head of Services for Vertical Leap and its sister brands. He started professional life as a magazine journalist, working on music magazines and women's titles before becoming a web editor in 1997, then joining MSN to work purely in online publishing. Since 1999 he has worked for and consulted to a broad range of businesses about their online marketing. Follow on Google Plus and Twitter