There are a lot of posts out there about negative SEO: its impact areas, demystifying what it actually is, and ways of protecting your site. Surprisingly, though, little seems to have been written about practically helping people find out whether they have been hit by negative SEO or not – which is where I hope I can help out.
Have you knowingly looked to manipulate search results?
This seems like an obvious question to ask – especially if you are seeing a decline in performance. I would start with this question and detail everything meaningful that you have implemented with the direct intention of ‘improving organic search results’.
At this stage, if you are unsure about the difference between search manipulation and search marketing, check out a recent(ish) post of mine here.
Ideally, share this information with someone you work with (an objective party). If there is anything you are hesitating to share with a colleague or peer at this stage, I would treat that as a priority within the actions to review and address.
What does the data say?
If you haven’t knowingly ‘dabbled’ with spam SEO techniques, or (after full review) focused on manipulative search tactics for quick wins – unwittingly or otherwise – it is time to take a step back and look at the data.
At this point I would look at the data at a 'big data' level – for me this simply means all the relevant data. Don't limit your review to subjective metrics like 'the telephone ringing', but don't exclude them either.
There are two things you are looking to achieve at this point: 1) information to confirm or refute whether things have declined, and 2) signals of concern (here I am referring to areas that you would want to action regardless of negative SEO impact; a manual penalty or signals of algorithm impact from Penguin or Panda, for example).
What tools should you use to identify negative SEO?
The focus areas that I would think are applicable to most businesses include those detailed below.
Google Webmaster Tools (GWT)
Focus your time looking for tangible impact on search queries, feedback on any manual actions, security issues (looking specifically for evidence of your website being hacked) and the latest links to your site. Another area of note is internal linking, as this is often a key on-page signal for sites that have been hacked (massive spikes in exact match internal links to single pages).
As a tip, when looking at the search queries don’t just look at the top level. Apply filters to your core business areas and look at both ‘Top queries’ and ‘Top pages’, as both can provide valuable negative SEO indicators.
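To make that filtering concrete, here is a minimal sketch of the idea: flagging core-area queries whose click-through rate looks suspiciously low against their impressions. The export format, column names and figures are all invented for illustration – a real GWT 'Top queries' download will differ, so adjust the parsing accordingly.

```python
import csv
import io

# Hypothetical 'Top queries' export; the schema is an assumption,
# not the tool's exact download format.
EXPORT = """query,clicks,impressions
blue widgets,120,4000
buy blue widgets,80,2500
cheap viagra widgets,2,9000
widget repair,40,1100
"""

CORE_TERMS = ["widget"]  # filters matching your core business areas


def flag_low_ctr(csv_text, core_terms, ctr_threshold=0.02):
    """Return core-area queries whose CTR falls below the threshold -
    a possible sign of SERP disruption worth a closer look."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if not any(term in row["query"] for term in core_terms):
            continue
        ctr = int(row["clicks"]) / int(row["impressions"])
        if ctr < ctr_threshold:
            flagged.append((row["query"], round(ctr, 4)))
    return flagged


print(flag_low_ctr(EXPORT, CORE_TERMS))
```

A query earning thousands of impressions but almost no clicks for a term you never targeted (as in the third row above) is exactly the kind of indicator worth investigating further.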
Google Analytics (GA)
Pay specific attention to post-click metrics. Assuming you have integrated your GWT data with your GA account, you can also work with pre-click data here, although most of that will already be covered by the GWT actions above.
Spend the majority of your available time on non-paid search metrics and look at them in the wider context of changes to other segments for ‘unusual trends’.
Look at the data over the longer term (as far as 24 months plus) and simply look at the data without applying any comparative data sets or filters for an overview of the situation.
Sometimes this can be the step that people overlook as they are driven to find proof of negative SEO, or quite often evidence of a penalty or algorithm impact regardless of clear initial justification. Don’t forget to vary the overview timeframe as this can impact initial overview findings quite dramatically.
Varying the timeframe can also help explain data peaks and troughs. For example, offline activity like a nationwide radio advert can impact traffic; this can be tracked by region and more, then filtered out accordingly for a normalised data set.
Consider year-on-year comparisons (if and when you have identified a tangible performance change – most likely a severe decline) especially with regards to content drilldown sections of the site and core landing pages. This will help to define a prioritisation of attention required, based on the greatest negative impact dates.
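The prioritisation step above can be sketched very simply: rank landing pages by year-on-year percentage change, worst decline first. The page paths and session counts below are made up; in practice you would export two comparable periods from GA.

```python
# Hypothetical sessions per landing page for two comparable
# year-on-year periods (all figures invented for illustration).
this_year = {"/services/": 1200, "/blog/": 900, "/contact/": 300}
last_year = {"/services/": 4800, "/blog/": 950, "/contact/": 280}


def yoy_change(current, previous):
    """Rank landing pages by year-on-year percentage change,
    worst decline first, to prioritise investigation."""
    changes = []
    for page, prev_sessions in previous.items():
        cur_sessions = current.get(page, 0)
        pct = (cur_sessions - prev_sessions) / prev_sessions * 100
        changes.append((page, round(pct, 1)))
    return sorted(changes, key=lambda item: item[1])


print(yoy_change(this_year, last_year))
```

A page down 75% year on year while the rest of the site is broadly flat (as `/services/` is here) is where the greatest negative impact dates, and therefore your attention, should be focused first.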
Any way in which you can refine and prioritise your focus areas (after the high level overview) will only assist in narrowing down the resources required to take practical steps to penalty recovery.
If you have diligently added annotations to key events in your GA account, this will also be useful, as traffic peaks, substantial changes in conversions (goal and ecommerce tracking) and other shifts can often be correlated to known actions – e.g. a new website design or usability testing.
Link tools for penalty evaluation and negative SEO signs
I'm not in the habit of recommending external link monitoring and evaluation tools, as there are enough review sites out there and coverage on industry sites for you to make up your own mind. What I would say at this stage is: sign up to at least three or four link tools; collate the latest and historical data exports from all of them; combine and de-duplicate them; verify the integrity of the data; and use this master links list to identify the key link concerns present. Again, don't assume there must be something there, as this will skew your approach – simply let the data tell the story.
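The combine-and-de-duplicate step might look like the sketch below. The tool exports here are invented single-column URL lists; real exports carry many more fields with tool-specific names, so you would normalise those first. The de-dupe is deliberately naive (whole-URL, case-insensitive).

```python
# Invented backlink exports from two hypothetical link tools.
tool_a = [
    "http://example.com/page",
    "http://spammy-links.example/widgets",
    "http://Example.com/page",  # same link, different casing
]
tool_b = [
    "http://spammy-links.example/widgets",
    "http://partner.example/blog",
]


def build_master_list(*exports):
    """Combine exports and drop duplicates (naive case-insensitive
    match on the whole URL), preserving first-seen order."""
    seen, master = set(), []
    for export in exports:
        for url in export:
            key = url.lower()
            if key not in seen:
                seen.add(key)
                master.append(url)
    return master


print(build_master_list(tool_a, tool_b))
```

The point of the master list is coverage: no single tool sees every backlink, so overlapping exports give you a fuller picture before you start judging individual links.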
Most link analysis tools will provide a summary screen on which to focus your attention. Activity here will be best spent looking for anomalies. With links, this starts with peaks in both new and lost links, unaccountable link activity (i.e. not tied into marketing activities of which you are aware), top level referring domain relevancy, link anchor text percentages, link numbers (total and changes), top linked-to pages, sitewide links and dofollow links.
This might sound like a lot of data areas for link reviews in isolation. However, in most tools available this is either segmented for you by default or an easy option to select. It can then often be one of the clearest areas for evidence of negative SEO (or other penalties).
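One of the clearest anomalies named above, anchor text percentages, is easy to compute from your master links list. The anchor texts below are invented; a heavy skew towards a single exact-match commercial anchor (rather than brand and generic anchors) is a classic warning sign.

```python
from collections import Counter

# Invented anchor-text sample drawn from a backlink export.
anchors = (
    ["best blue widgets"] * 60  # exact-match commercial anchor
    + ["Example Ltd"] * 25      # brand anchor
    + ["click here"] * 15       # generic anchor
)


def anchor_percentages(anchor_list):
    """Percentage share of each anchor text across all backlinks,
    largest share first."""
    total = len(anchor_list)
    counts = Counter(anchor_list)
    return {a: round(n / total * 100, 1) for a, n in counts.most_common()}


print(anchor_percentages(anchors))
```

There is no universal 'safe' percentage, but a profile where one exact-match anchor dominates (60% here) would stand out against the natural brand-heavy mix most sites accumulate.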
At Vertical Leap we have the benefit of our proprietary technology Apollo, which provides us with insight not only into our online performance spanning core data, but also into our ever-changing competitor environment.
Identifying movers and shakers in your immediate (and wider) online competition zones can provide signals of industry changes, new entries into your market as well as short-term changes to the online environment.
An example of this is large-scale news coverage of an associated industry event, or a highly topical story, that has driven down all service providers for specific search topics. This would be due to the intermittent inclusion of highly authoritative news sites in the search engine results pages.
All too often (and I don't mean this to be derogatory at all), people overlook the obvious when they are performing technical reviews, simply because the data can seem overwhelming or the only thing that matters.
This is why chatting through your thought process with a colleague can be useful. Ideally this would be a peer who has the technical mindset to appreciate the logic but also the practical ability to step back and see the bigger picture.
Ask marketing teams about telephone activity or, even better, look at call tracking stats if present. Also chat to key departments about activity. This does not have to be limited to declines; in fact, I would always ask an open question and see what comes back. Lastly, don't forget to look at the integrity of the data as well.
- Is tracking code on all pages?
- Has traffic decline coincided with a new site launch?
- Are there key search engine updates and rollouts of algorithms that could be change triggers rather than negative SEO?
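The first integrity check in that list, tracking code on all pages, can be automated. The sketch below uses inline HTML strings so the check itself is the focus; in practice you would fetch live pages (e.g. from a crawl of your sitemap), and the marker string is an assumption you would adjust to your own analytics snippet.

```python
# Hypothetical page HTML keyed by path; in reality you would fetch
# these from the live site.
PAGES = {
    "/": "<html><head><script>ga('create', 'UA-12345-1');</script></head></html>",
    "/about/": "<html><head><title>About</title></head></html>",
}

# Substring that identifies your analytics snippet (assumption -
# adjust to whatever your tracking code actually contains).
TRACKING_MARKER = "ga('create'"


def pages_missing_tracking(pages, marker):
    """Return the paths whose HTML does not contain the tracking
    marker - these pages silently drop out of your GA data."""
    return [path for path, html in pages.items() if marker not in html]


print(pages_missing_tracking(PAGES, TRACKING_MARKER))
```

A page missing the snippet (like `/about/` here) looks like a traffic decline in GA even though nothing changed in search, which is exactly the kind of false negative-SEO signal this checklist is meant to rule out.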
Don’t rule out the things you know to be true.
Most SME owners and key influencers in larger organisations will be aware of company-specific hero terms and historical search queries for which you always appear first.
These are the ones you like to demonstrate in meetings and type into a search engine on a Monday morning to remind yourself how great online marketing can be.
Spend a few moments on incognito search typing in a few of these terms and take a mental note of any changes or concerns.
Don't spend more than five minutes on this, though. Think about brand, core online areas of historical dominance and specific niches in which you know you have been prolific to date. This can be a 'common sense' check and a great way to step away from the metric-based data for a five-minute logic break.