Negative SEO Is a Reality: Here's How to Protect Yourself Against It


1. What’s Negative SEO?

Now here’s something most SEO agencies don’t like to talk about – how real negative SEO actually is.

It sounds like one of those ‘reverse the polarity’ tropes from lazily written sci-fi, where the writer couldn’t be bothered to do the scientific research that would lend their fiction some realism. But it isn’t fiction.

Negative SEO is a set of shady, black hat tactics designed for a single purpose: to tank, rather than rank, a competitor’s website in the search results.

Now, the slightly more intermediate SEOs among you, or perhaps those of a more ‘puritanical white hat’ persuasion, may say Google is smart enough to detect a fake, artificially inflated build-up of irrelevant, spammy-looking links, and to disregard those links when evaluating where a website should rank for particular keywords.

But we’d like to ask the doubters whether that opinion is based on what they read from the cherubic Google choir boys on Moz, Search Engine Land or Search Engine Journal, or on hard, empirically obtained data.

Thing is, we have personally seen the effects of negative SEO. We have been on the receiving end of it, and we have unleashed it in retaliation on some nasty perps who fired first (as you can see, I am no pacifist when it comes to protecting my clients).

2. Types Of Negative SEO.

In our taxonomy, negative SEO falls into three types.

Self-Inflicted Negative SEO.

This is when you act on misguided or outdated SEO information and inadvertently do things that cause your site’s rankings to tank instead of climb, often disastrously.

These counterproductive, self-sabotaging actions need not be as obvious as mass automated link building. They can be mistakes as easy to miss as poor anchor text distribution, over-optimisation, or an unnatural, irregular link acquisition velocity.

The solution is to keep yourself updated on the latest in the SEO industry, take your knowledge from all bands of the black hat to white hat spectrum, and form and test your own hypotheses to reach your own conclusions.

Offensive Negative SEO.

This type of negative SEO is for Sith practitioners on the Dark Side. It’s aimed at bringing down competing websites’ rankings and involves any number of the methods outlined below.

If you’re on the receiving end of this, the best weapon against it is still knowledge. We’ll go through what you can do in response to offensive negative SEO in the later parts of this article.

Defensive Negative SEO.

This type of negative SEO is used either to bring down webpages that give false or compromising information about your brand (hence it falls under reputation management), or to retaliate against known perpetrators who have used offensive negative SEO against you.

The methods employed for this purpose may be the same as those used for offensive negative SEO, except for certain methods that actually cross the line into illegal territory. We, of course, strictly advise against engaging in illegal practices such as hacking and installation of malware, etc.

3. How Negative SEO is Done.

Warning: the following methods are outlined here for educational purposes only, so that you are aware of what unethical competitors may do to you and can preempt and respond accordingly.

DDoS Attacks.

A denial of service (DoS) attack is a form of cyber warfare in which the attacker aims to make a website or network resource unavailable to users by overloading the target server with more requests than its bandwidth can handle. In a distributed DoS attack (DDoS attack), the flood of requests comes from (or appears to come from) many different IP addresses, making it near impossible to stop by merely blocking a single source.

The result is a website that loads very slowly, times out or does not load at all. The real-life analogy is sending hundreds of bribed actors to crowd a store and make annoying requests of the staff, taking their time, energy and attention away from the real customers, who go unserved and no longer want to visit the store.

How does this affect the rankings of a website?

One of the factors Google takes into consideration when ranking your site is “user experience”. That’s just a fancy term for how enjoyable your users’ (aka visitors, aka your potential customers) experience of your website is. Does it load fast? Is the information presented in a neat, easy-to-read way?

If you followed the store analogy earlier: a slow or non-loading site annoys the heck out of your potential customers, so they hit ‘back’ or close the browser tab as soon as they land on your site. Google tracks this behaviour, which leads it to (sort of) think, “This site I’ve been sending people to doesn’t seem to satisfy them. Maybe I should look for other sites that will”.

If this happens often enough, and your website suffers significant downtime from attacks you haven’t prepared countermeasures for, your rankings will drop significantly to make way for other pages that serve up their content fast.

Fortunately, there’s a way to protect your website from this: put it behind a CDN (content delivery network) with DDoS mitigation. CDNs have other SEO and general security benefits, but those are for other articles. Just Google “CDNs to prevent DDoS attacks” or look at companies like Sucuri, Cloudflare, or Amazon Web Services.
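A CDN does the heavy lifting, but it also helps to know quickly when your site starts slowing down or timing out. Here is a minimal latency probe you could run on a schedule, sketched in Python with only the standard library (the URL is illustrative, and this is a monitoring aid, not DDoS protection):

```python
# Minimal uptime/latency probe. A sustained pattern of timeouts or
# latency spikes is an early warning sign worth investigating.
import time
import urllib.request
import urllib.error

def probe(url: str, timeout: float = 10.0) -> dict:
    """Time one GET request and report status, latency, or failure."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except urllib.error.URLError as exc:
        return {"url": url, "ok": False, "error": str(exc.reason)}
    latency = time.monotonic() - start
    return {"url": url, "ok": status == 200, "status": status,
            "latency_s": round(latency, 3)}

if __name__ == "__main__":
    # Hypothetical target; swap in your own site.
    print(probe("https://example.com"))
```

Logging these results every few minutes gives you a baseline, so an attack-induced slowdown stands out immediately.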

Mass Link Proliferation & Deletion.

If you’re on this page, you know it’s no secret that links are still an important part of Google’s ranking algorithm. The AI is improving and the algorithm evolving, but there are always loopholes and blind spots which nefarious black hatters can exploit to game the system and bring down your site’s rankings.

One of those ways is to trick the algorithm into thinking that your website is gaining, and then losing, a massive number of links. The attacker artificially inflates the number of links pointing to the target site, causing a temporary climb in rankings, then systematically removes all those links and runs the list of dropped links through a mass link indexing / pinging service.

How does this bring down your site rankings?

When a new link is first built, it takes some time for Google to update its records, register the new link to your site, count it as a vote for your site and push it up the rankings. A quicker way to get Google to index the new link is through one of those link indexing / pinging services. It basically alerts Google: “Hey, look! Here’s a new link to this site”.

This method of negative SEO exploits that mechanism by alerting, or ‘pinging’, Google after those artificially created links are deleted. It sends a signal saying, “Hey, Google! You might wanna update your index again. Those links you registered earlier? They’re gone now. Looks like those voters changed their minds about the site they were linking to”.

Done often enough, at a large enough scale, this ‘screws’ with the algorithm into thinking the target website’s popularity is in a downtrend. And so, like the frantic, panicking crowd that drives down the price of a stock in a bearish run, it creates an artificial drop in the rankings.
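A pattern like this, a spike in registered links followed by a mass drop, shows up clearly if you log your backlink count daily. Here is a minimal sketch, assuming you record a daily total from whatever backlink tool you use (the counts and threshold below are made up for illustration):

```python
# Sketch: flag suspicious day-over-day swings in backlink counts.
from statistics import mean, pstdev

def flag_anomalies(daily_counts, z_threshold=3.0):
    """Return day indices whose day-over-day change is a statistical
    outlier relative to the series' typical daily change."""
    deltas = [b - a for a, b in zip(daily_counts, daily_counts[1:])]
    mu, sigma = mean(deltas), pstdev(deltas)
    if sigma == 0:
        return []
    return [i + 1 for i, d in enumerate(deltas)
            if abs(d - mu) / sigma > z_threshold]

# Hypothetical log: steady growth, an artificial spike, then a mass drop.
counts = [1200, 1215, 1230, 1650, 1660, 1100, 1115]
print(flag_anomalies(counts, z_threshold=1.5))  # flags the spike and the drop
```

A flagged spike you didn’t build yourself is worth investigating before the matching drop arrives.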

How do you protect yourself against this?

Luckily, this method of negative SEO actually requires quite a bit of time, effort and monetary investment to pull off. It’s not exactly as straightforward as paying someone to push a button to send thousands of bots to attack a site. There is some tactical timing and resource allocation involved.

But if you rank for a keyword or play in a niche that is highly profitable, the likelihood of high-profile attackers doing this also increases. In that situation, it is best to do what all good big businesses do: get some insurance and diversify.

Don’t just go for the competitive, high-traffic ‘money’ keywords in your industry. Spread out your coverage and rank for hundreds or thousands of longtail keywords that collectively stream substantial traffic to your business. This negative SEO tactic usually works on only one or a few closely related keywords at a time. It would be counterproductive and a massive waste of resources for an enemy to go after every keyword you rank for.

Also, controlling your link velocity, link quality, link diversity and anchor text helps the Google algorithm differentiate between the good links and the bad. Stay patient: don’t react to rank drops with brash actions like splurging all your money on new links to replace the dropped ones. Be proactive instead, with regular, periodic disavowals of the bad links.
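Google Search Console accepts a plain-text disavow file: ‘#’ comment lines, `domain:example.com` lines, or bare URLs. Here is a minimal generator sketched in Python; the spammy domains and URL below are made up, and in practice the input would come from your own backlink audit:

```python
# Sketch: build a Google disavow file from audited spam sources.
def build_disavow(spam_domains, spam_urls=()):
    """Emit the disavow text format Search Console accepts:
    '#' comments, 'domain:example.com' lines, and bare URLs."""
    lines = ["# Disavow file generated from periodic backlink audit"]
    lines += [f"domain:{d.strip().lower()}" for d in sorted(set(spam_domains))]
    lines += [u.strip() for u in spam_urls]
    return "\n".join(lines) + "\n"

text = build_disavow(
    ["spammy-links.example", "bad-pbn.example"],          # made-up domains
    ["http://ugly.example/page-with-bad-link.html"],      # made-up URL
)
print(text)
```

Disavowing at the domain level catches every page on a spam site, which matters when an attacker sprays links across hundreds of pages per domain.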

Malware Attacks & Site Defacements.

This is pretty self-explanatory: a very common problem, yet an easily avoidable one.

Hackers and black hatters can exploit loopholes and blind spots in open source content management systems such as WordPress or Joomla to enter the backend of your site without authorisation and do whatever they want, since they gain the same administrative power over your site as you have.

The most common form of this attack I have seen is corruption of the site’s source code: defacing the homepage by replacing it with identifying flags, logos and slogans (usually evoking the hacker group responsible).

The worst I have seen is a hacker who gained backdoor access to a client’s site and inserted entire folders of malicious files, causing Google to flag the site as a potential phishing site, blacklist it, and remove it from the search results entirely.

Usually the easiest and fastest way to resolve this is through services like Sucuri.
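Alongside a cleanup service, you can catch injected files early yourself. Here is a minimal sketch in Python that hashes every file under your site’s document root and diffs it against a baseline taken from a known-clean copy (the paths and workflow are illustrative, not a substitute for a professional cleanup):

```python
# Sketch: detect injected or modified files via a SHA-256 manifest.
import hashlib
from pathlib import Path

def manifest(root: str) -> dict:
    """Map each file's relative path under `root` to its SHA-256 digest."""
    out = {}
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            out[str(p.relative_to(root))] = hashlib.sha256(
                p.read_bytes()).hexdigest()
    return out

def diff(baseline: dict, current: dict) -> dict:
    """Report files added, removed, or changed since the clean baseline."""
    return {
        "added": sorted(set(current) - set(baseline)),
        "removed": sorted(set(baseline) - set(current)),
        "changed": sorted(k for k in baseline.keys() & current.keys()
                          if baseline[k] != current[k]),
    }
```

Take the baseline right after a fresh install or verified cleanup, store it off-server, and run the diff on a schedule; files in `added` that you didn’t upload are the classic signature of a backdoor.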


Anchor Text Attacks.

The Google Penguin update changed how the search algorithm assesses the relevance of a page to its target keyword. Back in the day, it was very easy to rank for high-value keywords by creating hundreds or thousands of links with the target keyword as the anchor text.

People started abusing this easily manipulated tactic, and spammy, low-quality pages started ranking above better-quality ones just because they had more links with the target keyword in the anchor text pointing at them. So Google updated the algorithm to factor in an assessment of how natural it is for a site to have all or most of its inbound links carry target keywords in their anchor text.

To simplify the story a bit, the concept of ‘anchor text ratios’ was born. New ‘rules’ came into play, and SEOs started panicking over the percentage of links pointing to their money sites with keyword-optimised anchor text.

With this algorithm update, a new negative SEO tactic was also born. It won’t affect aged, high-authority domains or pages that have ranked at the top thanks to good old-fashioned organic (or editorial) link earning from relevant, high-authority sites.

The sweetest victim for this killer is the new, up-and-coming website trying to break into the top 10 spots of the SERPs through months of sustained grey hat SEO campaigns. The prime suspect is usually either a site that has held a top 10 spot for a long time and wants to defend its position from a usurper, or a competitor that started at the same time as the victim but is slightly behind.

The way it’s done is straightforward: sabotage the victim’s anchor text ratio before it attains the status of a ‘high authority’ (and hence effectively untouchable) site.

While the victim plays by the rule book, observing ‘healthy’ or supposedly ‘white hat’ anchor text ratios (never repeating an exact or close-match anchor more than once, or keeping them under some arbitrary, guru-sanctioned percentage), the attacker builds similar-looking links that differ only in the anchor text, often using the same guest posts, PBNs, UGC sites or press releases, to make the Google algorithm think it’s all being done by the same person.
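An attack like this shows up as one exact-match anchor suddenly dominating your profile. Here is a minimal sketch, assuming you can export your backlinks’ anchor texts from whatever tool you use (the sample anchors below are made up):

```python
# Sketch: tally each anchor text's share of the backlink profile so a
# sudden exact-match flood stands out.
from collections import Counter

def anchor_ratios(anchors):
    """Return (anchor, share-of-all-links) pairs, largest share first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return sorted(((a, n / total) for a, n in counts.items()),
                  key=lambda t: -t[1])

# Made-up profile: a natural mix plus an attacker's exact-match flood.
links = (["cheap blue widgets"] * 60
         + ["Acme Widgets"] * 25
         + ["https://acme.example"] * 10
         + ["click here"] * 5)
for anchor, share in anchor_ratios(links):
    print(f"{share:5.1%}  {anchor}")
```

Run this over periodic exports: a commercial anchor you never built jumping to the top of the list is the tell, and those are the links to disavow.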

4. Conclusion.

The best way to protect your website from such a sneaky attack is not to be reactive when it occurs. Continue building links, developing content and engaging on social media following best practices, as if you were a white hatter, and be patient.

It takes quite a bit of time, effort and resources on the attacker’s part to sustain this, and over time, he will stop.

If you continue to build links and refresh your content regularly, as you normally would before the attack, the rankings will eventually sort themselves out in a couple of weeks or so.
