SEO has changed dramatically over the past decade, and Google has been steering those changes ever since it released its first named algorithm update, Boston, back in 2003 – a sign of what was to come.
From algorithm updates like Cassandra – which tackled link quality issues – to major infrastructure changes such as Hummingbird, why does Google keep taking steps to advance the algorithm?
Today, digital agencies and companies alike adhere to the guidelines that Google has set out for all to see, but do they really know why these guidelines exist?
The beginnings, backrubs, backlinks and Google
In 1996 two computer grad students, Larry Page and Sergey Brin, created a search engine called BackRub, so called because it was able to analyse ‘backlinks’ – something that had never been done before. The following year they decided that BackRub needed a new name and, hey presto, Google was born.
Binning the spam
Before Google began to regularly update the algorithm, achieving a good ranking for any search query meant getting as many links as possible – ignoring quality or relevance. However, Google only wanted to show valuable and relevant results to the searcher, whereas most websites just wanted to be at the top of search results. So, spamming became rife.
Suddenly Google had its hands full: shady-looking results pages and a massively exploitable algorithm. Black-hat SEO practitioners would stuff key phrases into links on thousands of sites to fool the search engines, which then assigned authority to those websites for those phrases.
In order to combat this, in 2003 Google released its first major shake-up algorithm – Florida. This was the update that put SEO on the map and caused shockwaves throughout the industry. Websites lost rankings overnight and spammy tactics suddenly became irrelevant. This didn’t solve the problem for long though. In response, people discovered that tactics such as buying links in bulk would heal the wounds left by Florida and propel them back up the rankings.
Once again, in 2005, Google updated its algorithm to target low-quality links, link farms and paid links. It also introduced the ‘nofollow’ attribute – allowing websites to provide useful links without manipulating the search results.
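To illustrate, a nofollow link is just an ordinary anchor tag with a `rel="nofollow"` attribute added; the URL below is a placeholder, not a real endorsement either way:

```html
<!-- A standard link: search engines may follow it and treat it
     as a vote of confidence in the target page -->
<a href="https://example.com/">Example site</a>

<!-- A nofollow link: the rel="nofollow" attribute asks search
     engines not to pass authority through this link -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```

This is why the attribute became standard on blog comments and forum signatures – the links still work for readers, but they no longer reward comment spam.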
Penguins, Pandas and panic
The first Panda update was released in February 2011, targeting sites that contained low quality, or thin, content. By the end of 2011 there had been nine iterations of the Panda algorithm – all forcing website owners to beef up the quality of content.
In another turn of events, Google released its second major blow to the spam industry with the Penguin algorithm in April 2012. Penguin specifically targeted over-optimisation – keyword stuffing, link schemes and various other web-spam tactics became effectively useless. Google’s head of web spam, Matt Cutts, remarked:
“The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded.”
Since their inception, both algorithms have received multiple updates over the past two years. Penguin and Panda have allowed Google to fight back more effectively and efficiently than ever. Today, most search results on Google earn their positions because they provide authoritative content that people link to and share.
Avoiding penalties shouldn’t be your goal, however – providing valuable information for your users is. Think about what they would expect to see and how they expect to engage with your website. Similarly, make sure that any website you approach for a link is of good quality and relevance, and an all-round good site: look out for low PageRank, spam comments and irrelevant content. If you approach SEO with common sense and patience, penalties shouldn’t be a worry of yours.