
A year in SEO – 2014 round up

Tuesday, December 16th, 2014 by Dan Callis

2014, like most recent years, was a busy one for SEO-related updates and news.

SEO is a constantly evolving industry in almost every aspect. As user behaviour, Google guidelines, new strategies and available resources continue to change, it’s our job as a digital agency to keep our finger on the pulse of what drives success and what risks a negative impact.

From long-awaited updates to algorithms and changes in the way Google displays results, to the death of authorship and hits to guest post networking sites, there has been plenty to keep our attention. The following is a month-by-month rundown of the biggest news in the SEO industry this year…

January

Content grouping in Analytics

Not technically January, but just before Christmas 2013, Google unveiled a new option in Analytics to group content together.

This gave webmasters the ability to see how a niche of their site is performing overall by assigning its pages to a group.

For example, say you have several individual blog posts across your site categorised as car sales, each only adding up to a few hits a month – viewed separately, they may seem like numbers too small to worry about. By looking at these posts as a group, though, you can see how they perform overall as a much bigger piece of the puzzle.

Rounded data in Webmaster Tools no more

Google launched a new update to Webmaster Tools at the start of January, meaning search query data was no longer rounded off into buckets but shown as accurate numbers, making the measurement of query samples more efficient and accurate.

Wait before you send that reconsideration request!

Matt Cutts, head of web spam at Google, advised websites with manual penalties to wait a few weeks before submitting another reconsideration request if the last one was rejected. Why? Simple – they’ll ignore it.

This continued to cement the stance that Google wants to see evidence you’ve really learnt your lesson before they lift a manual penalty, and if they think you’re taking shortcuts then you’re not going to have much luck. Evidence that work has gone into a reconsideration request is a big factor in getting a manual penalty successfully lifted, and time is just another factor Google uses to gauge your efforts.

Authorship visibility cut in search

After much speculation in the SEO community that something in the SERPs had changed to impact authorship display, Google cut back on displaying author snippets in search results by 20 to 40 per cent.

February

Google cracks down on Rich Snippet spammers

Before February, some websites had used rich snippets and structured data in a manipulative manner, marking up data incorrectly or marking up data that wasn’t actually on the page at all.

February saw the first publicly reported incident of a site owner being issued a manual warning for misusing structured data markup for manipulative purposes. The warning read as follows:

“Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behaviour that violates Google’s Rich Snippet Quality guidelines.”

Google cracks down on ads above the fold

In other Google-related news for February, Matt Cutts announced a warning to sites with too many adverts above the fold. An update to the algorithm meant websites making content hard for the user to find or read due to heavy use of adverts could be penalised.

This update mostly affected publishers using advertising as a revenue stream, although those already using Google’s AdSense monetisation platform were likely unaffected, as its strict T&Cs already prevent publishers from displaying ads so aggressively that they crowd out content.

This update was reported to affect less than one per cent of searches in Google.

Google advises against infinite scroll pages

Infinite scrolling pages have become commonplace in web design over recent years, partly due to the familiarity of Facebook news feeds and Twitter.

In February, Google put out a blog post advising against these for SEO purposes, as search engine crawlers cannot scroll or load these pages like a human user can, meaning potential indexing problems on sites. Google instead recommended using pagination, so users click through to pages 2, 3, 4 and so on when browsing content – and so crawler bots can do the same.

At the same time, Google also advised on the importance of proper SEO implementations for faceted navigation on eCommerce sites. Various filters and options to browse products can result in tens of thousands of thin and duplicate URLs being indexed in search. These can trigger Panda algorithm issues if not dealt with correctly – using noindex meta tags, the robots.txt file and canonical tags.
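
To illustrate those options (with purely hypothetical example.com URLs), a filtered product URL can either point search engines back to the main category page with a canonical tag or be kept out of the index with a noindex meta tag, while paginated listings can be tied together with rel=“prev”/“next” links. A sketch, not a one-size-fits-all fix – which tag suits which URL depends on the site:

    <!-- On a hypothetical filtered URL such as /shoes?colour=red&sort=price -->
    <!-- Option 1: consolidate signals to the main category page -->
    <link rel="canonical" href="http://www.example.com/shoes" />
    <!-- Option 2: keep this URL variant out of the index entirely -->
    <meta name="robots" content="noindex, follow" />

    <!-- On page 3 of a paginated category, so crawlers can follow the series -->
    <link rel="prev" href="http://www.example.com/shoes?page=2" />
    <link rel="next" href="http://www.example.com/shoes?page=4" />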

Bing confirms punishing poor grammar

Duane Forrester of Bing posted a blog in February stating that grammar is a signal used to assess site quality and therefore search visibility. Forrester confirmed that poorly written content full of typos and spelling mistakes can have a negative effect on your site’s visibility in Bing:

“If you [as a human] struggle to get past typos, why would an engine show a page of content with errors higher in the rankings when other pages of error free content exist to serve the searcher?”

Bing is looking for signals and patterns of regular poor grammar across whole sites, so don’t panic about dropping off page one for accidentally writing “teh” instead of “the” in a blog post, but grammar should be just as important to keep in mind for search as it is for readers.

In contrast, Matt Cutts confirmed in a video that poor spelling and grammar in comments is not something Google looks at when assessing the quality of a site, and therefore will not have a negative effect on search visibility.

Backlink relevancy is still a massive visibility factor

In a video posted online in February, Matt Cutts confirmed that links from relevant and authority sites were still a big factor in helping a website’s search visibility and will remain so for the foreseeable future.

In the video he stated that internal tests on search results, with backlinks removed as a visibility factor, produced a much worse set of results for the end user. Despite Google’s ongoing battle with manipulative backlink tactics, links remain a good quality signal on which to base search results.

March

Google takes action against My Blog Guest

The biggest news in the SEO industry for March was Google taking manual action against MyBlogGuest.com. My Blog Guest had a huge chunk of its search visibility wiped out overnight, with the main domain removed from Google’s index completely.

My Blog Guest is a website that helps content creators to find blogs willing to publish their articles, with a heavy emphasis on linking to the author’s website as payment.

At the start of 2014 Matt Cutts made big waves in guest blogging circles by sending a clear warning to websites using low quality article placement as a means to build links:

“Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.”

The biggest surprise with the penalisation of My Blog Guest was that it doesn’t host links in the same way a blog network does – it simply helps publishers and content producers find each other.

Google reviewing (not provided) in organic search

Since Google started promoting secure search (over https) in 2011, webmasters have lost access to information about the keywords that bring traffic via Google search. This data was replaced completely by the smokescreen of “(not provided)” in Analytics.

However, at March’s SMX (Search Marketing Expo) West, Google search chief Amit Singhal hinted at a potential solution that would keep all parties happy:

“Over a period of time, we [Google’s search and ad sides] have been looking at this issue…. we’re also hearing from our users that they would want their searches to be secure … it’s really important to the users. We really like the way things have gone on the organic side of search. I have nothing to announce right now, but in the coming weeks and months as [we] find the right solution, expect something to come out.”

Although this news sounded promising, nothing more has been said regarding Google’s stance on (not provided).

Google working on a new Panda update to help small businesses

Another big update from March’s SMX West came again via Matt Cutts, who announced his team was working on a version of the Panda algorithm that was less aggressive towards small business owners and their websites.

Despite the announcement, Cutts remained vague on specifics or when it would be released.

Links no longer count towards visibility in Yandex

In March, Yandex, Russia’s biggest search engine, started a roll out across its platform where links no longer counted towards a website’s visibility.

The first areas to have this change applied were a handful of verticals and only for results based in Moscow, with other areas to be included over the coming months.

Although this change was mostly only relevant to businesses targeting the Russian search market, it was the first time any major player in search discounted links as a metric.

April

Google advises on how to link multiple domains

In April, Matt Cutts advised website owners on how they should interlink multiple related domains, such as regional sites containing the same content.

Here at Vertical Leap we have seen instances of websites being penalised due to cross linking across multiple domains, even when the implementation has been innocent and the reasoning has been common sense (after all, why wouldn’t you want to point customers to your other sites too?).

Although Google does all it can to figure out when domains are related, using hreflang tags in page headers to help Google identify and understand this information makes the job much easier.
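
As a minimal sketch – assuming hypothetical UK and US versions of the same site – each page lists every regional alternate (including itself) in its <head>:

    <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />

The x-default value tells Google which version to show searchers who don’t match any listed region.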

Tell search engines how long actions take on your site

Schema announced a new form of markup in April called “action”, which applies to specific things a user can do on a website.

An example of this would be a contact form – you can use structured markup that tells search engines how long the process takes a user to complete.

Google continues the fight against manipulative link tactics

Google publicly announced the discovery and penalisation of several link networks in March, including seven link networks based in Japan.

Google also took a second step against low quality guest posting, this time sending a manual penalty out to blogging hub PostJoint.

May

Right to be forgotten

Perhaps the biggest news in the search industry in May was Google vs the EU’s ‘right to be forgotten’.

The catalyst for this case was one Mario Costeja Gonzalez, who won a six-year court battle against Google over a negative news story appearing next to his name in search results. He felt the story in question was no longer relevant, as the matter had been resolved.

The European Court ruled in favour of Mr Gonzalez’s argument, so Google introduced an online form for EU citizens to request that “inadequate, irrelevant or no longer relevant” information be removed from search results.

Not long after the launch of the online form, Google had received an average of one request every seven seconds for information to be removed from its results, with more than 1,500 coming from within the UK.

Google launches two new updates

May also saw Google launch two major updates to its algorithm. The first was the ‘Payday Loan 2.0’ update, which targeted websites that spam aggressively for competitive queries such as “payday loan” phrases.

The second was the Panda 4.0 rollout, which was believed to hit press release submission sites the hardest.

Sites rank higher if they use structured mark-up

A study released in May found that websites using Schema markup and structured data ranked, on average, four positions higher than sites not implementing such practices.

Searchmetrics, which carried out the study, had this to say:

“It must be noted that this is not necessarily a causal relationship. It may not be the case that pages are actually preferred by Google just because they provide schema integrations, and maybe the higher rankings can be explained by the fact that webmasters who use Schema.org integrations are one step ahead of the competition due to other factors that affect their rankings in a positive way.”

Google states that Schema markup only informs how it displays search results and has no impact on position.

The study also found that less than one per cent of all sites across the entire web use Schema markup.
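
For context, here is a minimal sketch of the kind of Schema.org integration the study counted – a hypothetical product marked up in JSON-LD so that its reviews are eligible to appear as a rich snippet:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "27"
      }
    }
    </script>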

SEO Myths reviewed

A May article on the Bing blogs outlined 10 SEO myths based on popular opinion or misinformation.

Although not breaking news relating to the industry, it’s always nice to revisit old practices and refresh our memory on the basics. Key takeaways from the piece were as follows:

  • Ranking number one is great, but it shouldn’t be a priority; some keywords have different user behaviour patterns.
  • Title tags help, but they are not the be all and end all of SEO; you need to be looking at everything else alongside it.
  • Videos can be great for SEO if done right, however you need a plan of action; videos can also slow down site load times, so they should be part of an overall mix.
  • Buying ads or PPC won’t help your organic rankings.
  • Value and plan your content based on data such as engagement and bounce rate over your own personal opinion of what you think people will like; you might enjoy writing about one thing but your audience might prefer something else.
  • Links are still of huge importance, but they should be built organically.
  • Marking up data can help search engines understand your content better and help display rich snippets in results but won’t necessarily help you rank.
  • Although usability and SEO are different, both should be considered in equal measure; without SEO your site is missing out on visitors, but without usability those visitors won’t stick around for long.
  • SEO should be one part of a much bigger marketing mix to get the best results.

Google rewrites its quality rating guide

In May Google launched a brand new (not a revision!) quality ratings guideline document. The new guidelines have a strong emphasis on E-A-T, which stands for “expertise, authoritativeness, trustworthiness”.

What does this mean for your business? It means you should be offering expert advice, you should be establishing yourself as knowledgeable in your field and you should be giving people honest, correct information.

June

Words on the page

After February’s news that Google still uses backlinks as a large ranking factor, a simple question was fired over to Matt Cutts: “How does content rank when it doesn’t have many links?”

Cutts answered simply – it all comes down to what words are on that page, as well as the value of the overall domain. There is also a deciding factor based on how competitive a particular keyword is – a less competitive keyword doesn’t require as much authority to compete in search.

Google removes author photos from search results completely

After January’s cut in the number of author profiles appearing next to articles in search results, June saw Google remove headshot snippets from its results completely.

According to Google, this was done in the name of “creating a better mobile experience and more consistent design across devices.” Author mark-up is still displayed in results, but without a photo.

This was predicted to have a negative effect on click-through rates, despite Google stating that A/B tests showed no difference in numbers.
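
For reference, the authorship markup being phased out was a simple link from the page to the writer’s Google+ profile (profile URL hypothetical here):

    <link rel="author" href="https://plus.google.com/your-profile-id" />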

Bing best practice for sitemaps

June saw Bing post a new piece on its webmaster blog listing best practice tips for uploading an XML sitemap to your site. These were as follows:

  • Follow the sitemaps reference at www.sitemaps.org. Common mistakes we see are people thinking that HTML sitemaps are the same as XML sitemaps; malformed XML sitemaps; XML sitemaps too large (max 50,000 links and up to 10 megabytes uncompressed) and links in sitemaps not correctly encoded.
  • Have relevant sitemaps linking to the most relevant content on your sites. Avoid duplicate links and dead links: a best practice is to generate sitemaps at least once a day, to minimize the number of broken links.
  • Select the right format: (a) Use an RSS feed to list, in real time, all new and updated content posted on your site during the last 24 hours. Avoid listing only the 10 newest links on your site; search engines may not visit the RSS feed as often as you want and may miss new URLs. (This can also be submitted inside Bing Webmaster Tools as a sitemap option.) (b) Use XML sitemap files and a sitemap index file to generate a complete snapshot of all relevant URLs on your site daily.
  • Consolidate sitemaps. Ideally, have only one sitemap index file collating links to all relevant sitemaps, and only one RSS listing the latest content on your site.
  • Tell search engines where sitemaps live by referencing them in your robots.txt file or by publishing the location of your sitemaps in search engines’ Webmaster Tools (see the sketch after this list).
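
Pulling those tips together, a minimal sketch with hypothetical URLs – one sitemap index file collating the individual sitemaps:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-pages.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-blog.xml</loc>
      </sitemap>
    </sitemapindex>

…referenced from the site’s robots.txt file so search engines can find it:

    Sitemap: http://www.example.com/sitemap-index.xml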

Spam Algorithm Version 3.0

Just two weeks after Google launched version 2.0 of its ‘Payday Loan’ spam algorithm, it released version 3.0. These updates again focused primarily on search queries and keywords in sectors commonly targeted by spammers (like “payday loans”).

The updates were specifically designed to judge relevance between linking pages and link text.

July

301 redirects and negative SEO?

In a Google Hangout video in July, Google’s John Mueller claimed a competitor cannot create a negative impact on your site by creating a redirect to it from a penalised domain. He said Google could easily catch any site playing such a dirty trick.

He also said that if you have any concerns, you can simply use Google’s disavow tool.
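
The disavow tool itself simply takes a plain text file, uploaded via Webmaster Tools, listing one domain or URL per line – a sketch with hypothetical domains:

    # Penalised domain that 301 redirects to our site
    domain:spammy-example.com
    # A single unwanted linking page
    http://another-example.com/bad-page.html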

Using Adobe Flash? Time to upgrade

July 15th saw Google fire a big warning shot at sites built with Adobe Flash technology. The search giant began showing users a warning in search results when a Flash site would not function fully on their device.

Google’s Keita Oda, Software Engineer, and Pierre Far, Webmaster Trends Analyst, said:

“Fortunately, making websites that work on all modern devices is not that hard: websites can use HTML5 since it is universally supported, sometimes exclusively, by all devices.”

New robots.txt tester launches in GWT

July also saw Google launch a big upgrade to the robots.txt section of Webmaster Tools. The upgrade highlighted errors, allowed users to edit the robots.txt file from GWT and let them see older cached versions of a robots.txt file.
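
The file the tester validates is plain text served at the root of the site; a minimal hypothetical example:

    User-agent: *
    Disallow: /checkout/
    Disallow: /search/

    Sitemap: http://www.example.com/sitemap-index.xml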

August

Authorship killed off after a slow death

After two earlier updates in 2014 gradually closed the door on authorship markup, Google announced in August it would stop showing authorship results in search altogether, and would no longer track data from content using rel=author markup.

This was because Google felt authorship hadn’t returned enough value compared with the resources it takes to process the data.

John Mueller, Webmaster Trends Analyst at Google, said he felt the reason for the experiment falling short was a low adoption rate by webmasters and that it offered little value to results in terms of clicks.

https://

August also saw Google announce it would be incorporating https into its algorithm, meaning secure sites with an SSL certificate could see some ranking improvements.
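
Part of that move is permanently redirecting old http URLs to their https equivalents. A minimal sketch for an Apache server with mod_rewrite (assumptions: .htaccess rules are allowed and an SSL certificate is already installed):

    RewriteEngine On
    # If the request did not arrive over SSL, 301 it to the https version
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]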

Penguin 3.0 on its way?

There had been a lot of speculation in the SEO industry all month regarding when Penguin was going to update, and some signs of fluctuation in search results suggested Google was running experiments and/or updates.

September

Google confirms refresh is needed to recover from Penguin

At the start of September, John Mueller confirmed in a Google Webmaster forum thread that to truly see a recovery from being hit by Google’s Penguin algorithm, you need Google to roll out a Penguin update.

Google also confirmed there should be an update before the end of 2014, much to the relief of many who felt this update should have already occurred months ago.

Bing combats keyword stuffing

Also in the first half of September, Igor Rondel, Principal Development Manager of Bing Index Quality, confirmed the launch of a spam filter in Bing’s search results to combat keyword stuffing in URLs.

According to Rondel, this affected three per cent of all search queries.

Search in a search

September saw Google offer a new form of schema markup, the sitelinks search box, which is incredibly useful for websites that rely heavily on visitors using a search function.

This meant that if a user searches for your brand, Google may show a search box within the results, helping users find the content they are looking for faster.
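
Google’s documented markup for this is a SearchAction added to the homepage; a minimal sketch with a hypothetical search URL:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "WebSite",
      "url": "http://www.example.com/",
      "potentialAction": {
        "@type": "SearchAction",
        "target": "http://www.example.com/search?q={search_term_string}",
        "query-input": "required name=search_term_string"
      }
    }
    </script>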

Google takes out private blog networks

In its ongoing battle against web spam and manipulative search results, September saw unconfirmed reports that Google had detected and taken action against a cluster of private blog networks (PBNs).

PBNs are usually a number of sites under a single ownership that not only interlink to pass around SEO authority, but also link externally to other sites (usually for a fee) to pass on that authority. Although these blog networks are called “private”, they are still accessible to both Google and the public; the term reflects the fact that these sites are not built with the intention of being visited.

Sites detected using PBNs for SEO performance will have received a manual action warning in Google Webmaster Tools.

Panda 4.1 arrives

Towards the end of September, Google started rolling out a refresh of its Panda algorithm, which combats thin, duplicate and low quality web content.

This was confirmed by Pierre Far, a Webmaster Trends Analyst at Google, who said:

“Earlier this week, we started a slow rollout of an improved Panda algorithm, and we expect to have everything done sometime next week. Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice. Depending on the locale, around three to five per cent of queries are affected.”

October

Google starts rolling out a Penguin refresh

The biggest news this month in the SEO world was that Penguin would finally be refreshed after more than a year of waiting. Here is what we’ve been told since:

  • Penguin 3.0 began rolling out on Friday 17 October.
  • No large-scale shake-up in Google results could be detected by many organisations that track SERPs.
  • The update currently affects less than one per cent of US English queries (as of 27th October).
  • A worldwide roll-out is likely to last several weeks.
  • The full impact of the update cannot be assessed until roll-out is complete.
  • This update is meant to help websites recover from previous iterations of Penguin, as well as demote newly discovered spammy sites.

As of last week, this update is still rolling out.

Don’t forget the Pandas too

Alongside the Penguin refresh, Google also rolled out the second update of 2014 to Panda at the start of the month.

Research showed that thin sites were hit hardest – further evidence of the damage rogue URLs and pages with little-to-no information can do to SEO.

RSS feed or XML sitemap for optimal crawling?

Both XML sitemaps and RSS feeds help search engines to crawl sites. XML sitemaps usually tell spiders about your higher level pages, whereas RSS feeds provide real-time updates to things such as blog content.

Google stated in an October blog post you should be using both to help it crawl and index your content:

“For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index. Note that submitting sitemaps or feeds does not guarantee the indexing of those URLs.”
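
The two formats complement each other: a sitemap carries a lastmod date for every URL on the site, while a feed surfaces only the newest items. A minimal sitemap entry (URL hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/blog/new-post</loc>
        <lastmod>2014-10-01</lastmod>
      </url>
    </urlset>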

Webmaster Tools adds mobile usability

October saw Google Webmaster Tools add a mobile usability tab. This section assesses mobile sites for things such as font size, use of Flash and the size of touch elements.

As we dive deeper into a multiscreen world, where 30-35 per cent of web traffic comes from a mobile device, responsive design and being mobile-friendly are becoming increasingly important.
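
The starting point for a mobile-friendly page is the viewport meta tag, which tells browsers (and Google’s usability checks) to scale the page to the device rather than render a zoomed-out desktop layout:

    <meta name="viewport" content="width=device-width, initial-scale=1" />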

YARGH, I’M A PIRATE

In further Google algorithm update news, Google launched Pirate towards the end of October. This was to combat illegal downloading links in search results, with torrent sites reporting big dips in search traffic.

November and December

Google rolls out more changes to Penguin

Despite Matt Cutts once stating Google would never roll out an algorithm change on a national holiday, signs of Penguin recoveries were seen by many website owners over the course of Thanksgiving. However, many of these were rolled back again a week later.

Many webmasters then reported further changes to Google visibility on the 5th December.

Google takes Penguin full time

Following continuous fluctuations during the Penguin roll out, a Google spokesperson issued the following statement:

“That last big update is still rolling out — though really there won’t be a particularly distinct end-point to the activity, since Penguin is shifting to more continuous updates. The idea is to keep optimizing as we go now.”

Although not set in stone just yet, this hints that Penguin will now be updated continuously rather than periodically.

What a year!

That’s what we love about SEO – it’s so incredibly fast-moving and never fails to keep us on our toes. Bring on 2015 – we’re ready for ya!


About the author

Dan Callis

Dan Callis is an SEO Campaign Delivery Manager at Vertical Leap, with a background specialising in content development, content promotion, off-page SEO and link building. In his spare time he can usually be found hanging out at a gig, riding around on a skateboard, cheering on Ipswich Town FC or sat on his sofa playing Sega Mega Drive / watching pro-wrestling.