
Don't Lose Customers Because Of Negative SEO

In the changing world of search engines, especially Google, getting to the top of the search rankings requires ever more time and investment. It is little wonder that more dubious webmasters (or digital consultants/agencies) are turning to negative SEO. Google's own search trends illustrate the boom in negative SEO since early 2017.

[Image: Google Trends showing negative SEO is on the increase]

Based on the trends, it would appear that "negative SEO" is searched for twice as often as "white hat SEO" and nearly four times as often as "ethical SEO". Has Google made it too difficult to perform good old-fashioned SEO?


Let’s look at what negative SEO is.

Quite simply, it is the deliberate lowering of a competitor's position in the search results.


How is negative SEO performed?

There are multiple ways of doing this:

  • Scraping the content from your competitor's website and reposting it on other websites – so it looks like mass content posting or plagiarism by the competitor themselves
  • Hacking a competitor's website to alter their content, install malware, or inject other technical changes which cause the website to fall foul of Google
  • Building unnatural links to your competitor's website so that it falls foul of Google's guidelines – this is the most common method
  • Using technology (website loaders and click-through bots) to fool Google – this is a lesser-known technique


Off-page Negative SEO Link Building

Google's latest guidelines on Link Schemes can be found here. At the time of writing, the link schemes which could negatively impact a website's ranking include:

  • Buying or selling links that pass PageRank
  • Excessive link exchanges
  • Large-scale article marketing or guest-posting campaigns with keyword-rich anchor text links
  • Using automated programs to create links

Additionally, Google outline the following types of links which are commonly known as unnatural links:

  • Text advertisements that pass PageRank
  • Advertorials where payment is received for articles with links that pass PageRank
  • Optimised links in articles or press releases
  • Low-quality directory or bookmark site links
  • Keyword-rich, hidden, or low-quality links embedded within widgets
  • Widely distributed links in the footers or templates of other sites
  • Forum comments with optimised links in the post or signature

As you can see, there are many opportunities to create a scenario where it looks like a competitor is performing these activities themselves – when, in actual fact, it is a negative SEO attack on them.

Google also list in the guidelines “You can prevent PageRank from passing in several ways, such as: Adding a rel="nofollow" attribute to the <a> tag” – I will come back to this point later on.
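
For illustration, this is what such a nofollowed link looks like in HTML (the URL and anchor text are placeholders):

<a href="https://www.example.com/" rel="nofollow">London plumber</a>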

If you believe the hype from Google's various spokespeople over the years, then having one or two spammy links which violate the guidelines above is not going to cause a website to tumble down the rankings overnight. It usually requires a lot more. Negative SEO attacks typically involve building huge volumes of links (1,000s, 10,000s, perhaps even 100,000s) from unrelated link farms or groups of websites.

For example, imagine you are a "London plumber" with a website offering your services, and over a short window of time 1,000s of links start coming in from websites in China, Russia, etc. If these links come from low-quality directories or forums with optimised comments unrelated to plumbing, this should wake up Google and trigger a decline in rankings.

Likewise, consider the scenario where highly optimised anchor links, say "London plumber", start being generated in large volumes over a short period of time to the plumbing website – to Google it may look like the plumber or his SEO consultant/agency were doing this themselves to aid SEO, when in fact it was a rival London plumber who had adopted unscrupulous negative SEO techniques to trounce his local opponent.


Proactive Guarding Against Negative SEO Link Building

Unfortunately, no one can stop a negative SEO attack on their website. If someone (a competitor) wants to embark on this activity, there is, to date, nothing to stop them. As the website owner, it falls on your shoulders to be aware of the potential for an attack and to proactively monitor for one occurring.

In the case above, where a campaign of backlink generation is the source of the negative SEO attack, monitoring and recording the number of backlinks pointing to your website each month in a simple Excel spreadsheet would be a good start. Any spike or massive increase from one month to the next, where you were not knowingly creating links yourself, should set alarm bells ringing.
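
As a rough illustration of that month-over-month check, here is a minimal Python sketch. It assumes you have exported your monthly backlink counts to a CSV with "month" and "backlinks" columns; the file name and the 50% threshold are illustrative choices, not recommendations:

import csv

SPIKE_THRESHOLD = 1.5  # flag a month-over-month increase of 50% or more

def flag_backlink_spikes(path):
    """Read monthly backlink counts and report suspicious jumps."""
    with open(path, newline="") as f:
        rows = [(r["month"], int(r["backlinks"])) for r in csv.DictReader(f)]
    for (prev_month, prev), (month, count) in zip(rows, rows[1:]):
        if prev > 0 and count / prev >= SPIKE_THRESHOLD:
            print(f"ALERT: {month} has {count} backlinks, "
                  f"up from {prev} in {prev_month} - investigate!")

flag_backlink_spikes("backlinks_by_month.csv")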

If you don't have the resources to monitor the links yourself in a spreadsheet, there are numerous third-party commercial software tools you can subscribe to that will monitor links for you. (Caveat: given the nature of the internet, no software tool is 100% accurate.)

However, performing some form of proactive monitoring is better than no monitoring at all. Hiring an SEO consultant with many years of experience in Google penalties is something I would recommend.

An experienced consultant will be able to determine the following (see the sketch after this list for how some of these checks can be scripted):

  • The volume of backlinks to your website
  • A reasonably accurate date/time of when these links were found
  • The type of link, e.g. forum spam, low-quality directory, large-scale article sites
  • Whether the link is a follow or nofollow link
  • The country of origin of the website link
  • IP address or C block of the back linking websites
  • The potential toxicity of the links
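
Assuming you have a raw CSV export of backlinks (I am assuming columns named "url" and "rel" here), a couple of those checks – follow vs nofollow counts and a crude country-of-origin grouping by top-level domain – could be scripted along these lines in Python:

import csv
from collections import Counter
from urllib.parse import urlparse

def summarise_backlinks(path):
    """Tally follow/nofollow links and the TLDs of the linking sites."""
    rel_counts, tld_counts = Counter(), Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rel_counts["nofollow" if "nofollow" in row["rel"] else "follow"] += 1
            host = urlparse(row["url"]).hostname or ""
            tld_counts[host.rsplit(".", 1)[-1]] += 1  # crude grouping by TLD
    print("Follow vs nofollow:", dict(rel_counts))
    print("Links by TLD:", tld_counts.most_common(10))

summarise_backlinks("backlinks_export.csv")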

Whilst no system is fool-proof, hiring someone who has dealt with Google penalties and knows the signature/signs of a penalty can proactively help guard against one.


Scraping Content – Duplicate Content

It is fairly easy to copy and paste. Anyone can highlight content on a website, right-click -> Copy, go to another website which accepts articles, press releases, blogs, or forum comments (or even build a quick WordPress site), and then right-click -> Paste.

In minutes, you can create a duplicate of a competitor’s website.

Google is quite clever and should know who originated the content first. As the Google bots crawl the web and find new pages on websites, the pages are stored in Google's index/cache and are date/time stamped. Based on the timestamps, Google can determine who had the content first and who must have copied it.

However, even Google is fallible. What if Google finds the copied/plagiarised version first (Version B), date/time stamps it, and then crawls the website which originally created the content (Version A) two days later?

In this scenario, it will appear that website A has stolen the content from website B. If this happens often enough, then Google will assume website B is the innocent party and rank this website, whilst it will assume website A is the guilty party and not rank them.

If someone is performing a negative SEO attack on you, they can use website scrapers to copy your content almost as soon as you put it online. If the scraper site is visited more frequently by the Google bots, then it may well be the one that gets ranked while the originator doesn't – a drop in rankings for you.


Proactive Checking For Duplicate Content

It is important that you check the web for anyone scraping content from you (without your permission) and potentially creating the scenario where you could fall foul of a Google Panda penalty.

Probably the best-known tool for checking for duplicate content is Copyscape.com, though a quick Google for “duplicate content checkers” will return a list of other good providers.

On a basic level, you can, of course, use Google themselves for a quick check. If you take a sentence from your own website and wrap it in quotes, then Google that sentence, you will find the instances that Google themselves have indexed.

For example, if I take the sentence above:

“Probably the best-known tool for checking for duplicate content is Copyscape.com”

and Google that, it will return no matching results (apart from this post – hopefully!), whereas if I Google:

"The software lets you detect duplicate content and check if your articles are original."

which is taken from Copyscape’s own website, then it returns 2,920 matching results.

This simple exercise proves that Google will ignore what it decides are duplicates. If you view the bottom of the SERPs you should see a message like this:

[Image: Google search results ignoring pages with duplicated content]

A quick calculation illustrates that of the 2,920 matching results, Google displayed only 99 and treated the remaining 2,821 (2,920 – 99) web pages as duplicates of the Copyscape page, so those sites will not rank highly and, as a result, will not gain traffic.
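
If you want to make this spot check a habit, here is a tiny Python sketch that takes a sentence from one of your pages and opens the exact-match (quoted) Google search for it in your browser; the sample sentence is simply the one used above:

import webbrowser
from urllib.parse import quote_plus

def check_for_copies(sentence):
    """Open a quoted (exact-match) Google search for the given sentence."""
    url = "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')
    webbrowser.open(url)

check_for_copies("Probably the best-known tool for checking for "
                 "duplicate content is Copyscape.com")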


Negative SEO - Denying Website Access

Google webmaster support clearly states that blocking Google from accessing your website “may lead to a loss of ranking in Google’s search results”.

[Image: Google guidelines showing webmasters not to block the Googlebot]

The webmaster support page also outlines common causes, such as crawl errors (where the Googlebot encountered issues whilst trying to crawl the web pages) and DNS server issues (where the bot couldn't reach the website each time it tried).

Given that Google points out that a website will drop down the rankings if these issues are experienced, negative SEO attacks have been known to exploit these failure modes to intentionally push a competitor's website down the search results.

Such techniques may include continually crawling a competitor's website and downloading large images to overload the website/server in an attempt to crash it. This means that when the Googlebot tries to crawl the site, it is too slow, or can't be crawled at all.

Clever negative SEO attacks of this nature will crawl the website aggressively from different IPs, and do it in the early hours of the morning when the website owner or users might not normally be using the site. This can go unnoticed for weeks or months, until search rankings start dropping.
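
A related defensive check: hostile crawlers sometimes spoof the Googlebot user agent so that they are not blocked. Google's documented way to verify a genuine Googlebot is a reverse DNS lookup on the requesting IP (it should resolve to a googlebot.com or google.com host), followed by a forward lookup confirming the name maps back to the same IP. A minimal Python sketch, with an illustrative IP:

import socket

def is_real_googlebot(ip):
    """Verify a claimed Googlebot IP via reverse, then forward, DNS lookup."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(host) == ip  # forward-confirm
    except socket.error:
        return False

print(is_real_googlebot("66.249.66.1"))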


Proactively Checking For Googlebot Denial

Google provides their Search Console (aka webmaster tools) for a reason, so it is essential that it is activated on each of your websites. Learn what each of the sections within Search Console is telling you and monitor them on a weekly basis.

[Image: Google Search Console checking for crawl and site errors]

If you don’t have the skill set to do this, hire an SEO consultant who fully understands Search Console and can read between the lines to decipher the health of your website.

[Image: Google Search Console watching for crawl rates]

Another area to look at closely is your website hosting provider. They should be able to provide server logs as well as traffic graphs, so these can be proactively monitored each week or month for unusual activity.
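
To illustrate what that log monitoring could look like, here is a minimal Python sketch that scans a standard combined-format Apache/Nginx access log and flags any IP making an unusually high number of requests in a single hour. The log path and the 1,000-request threshold are assumptions to adjust for your own traffic levels:

import re
from collections import Counter

LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[(\d+/\w+/\d+):(\d+):')
THRESHOLD = 1000  # requests per IP per hour worth investigating

def flag_heavy_crawlers(log_path):
    """Count requests per (IP, day, hour) and report suspicious spikes."""
    hits = Counter()
    with open(log_path) as f:
        for line in f:
            m = LOG_PATTERN.match(line)
            if m:
                ip, day, hour = m.groups()
                hits[(ip, day, hour)] += 1
    for (ip, day, hour), count in hits.most_common():
        if count < THRESHOLD:
            break
        print(f"{ip} made {count} requests on {day} at {hour}:00")

flag_heavy_crawlers("/var/log/nginx/access.log")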


Negative SEO - Faking Click Through Rates

This is a highly controversial area in the SEO industry. Some SEO experts claim to have proven that click-through rates (CTR) on organic search results do not affect rankings, whilst others claim to have proven that they do. One thing is for certain: only Google know whether they do or not.

Back in 2016, a Google engineer appeared to hint that user clicks did have some influence in the ranking algorithm (see his slide below).

[Image: Google ranking algorithm is using CTR to rank websites higher]


Whilst Google is changing their algorithm all the time, if we assume that user experience, brand loyalty, brand voice, etc. are all relevant to ranking positions, it follows that the CTR of search results should have an influence on ranking positions.

It also follows common sense. Suppose users search for the phrase "cheap car insurance" and 10 results are shown in the SERPs, but a higher ratio of users click on the website in position 3 and continue to browse that website for a lengthy period of time (a site which will undoubtedly have Google Analytics installed, informing Google of this). You would then expect Google to notice and to question whether the sites in positions 1 and 2 are relevant and worthy of those higher rankings.

In this case, perhaps Google will increase the ranking of the site in position 3 (and in doing so demote those in positions 1 and 2). I would then expect Google to monitor the change to see if the site previously in position 3, now position 1, performs as well as before, and likewise for the sites demoted down.

Back to the point of faking click-through rates. If we assume that CTR does affect rankings, then it is possible to build a CTR bot which mimics human search behaviour and clicks on all the search results except the website under the negative SEO attack.

By employing this technique, the website under attack will slowly slip down the rankings and lose traffic, and therefore sales.


Guarding Against Fake CTR

I will repeat that this is a highly controversial area, and there appears to be no cast-iron statement from Google on this subject, nor a Google webmaster guideline.

However, if your website is performing poorly and you have checked and double checked all other areas of ethical SEO and they are clean, then this is an area to look closely at.


On-page Negative SEO

On-page negative SEO is more difficult to perform as those undertaking the attack need to hack your website. On-page techniques include:

  • Technical changes, such as modifying your robots.txt file
  • Changing your website content
  • Simply hacking your website

Very few website owners check the content of all the pages on their website regularly. This leaves content open to subtle changes which can see the site contravene Google's quality guidelines, resulting in it dropping down the rankings.

Again, Google explicitly state that those who provide a better experience will enjoy better rankings.

[Image: Extract from Google Quality guidelines for webmasters]

Google outline the specific guidelines which could see your website expelled from the rankings following a manual penalty. Interestingly, of the first 13 points below, 12 relate to activities that a website owner (or a negative SEO attacker) can undertake on the website itself, and only one relates to backlinks. The last two points relate to what others can do to your website.

[Image: Specific Google guidelines for back links]


Given the plethora of negative SEO opportunities, it is important to keep a close eye on your website.


Content Changes

Generated spammy content would certainly be noticed by someone; however, unsavoury content or spammy links can easily be hidden using the CSS display:none property. This will not be discovered unless you start looking at your source code or perform an outbound link check.
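
To give a feel for the kind of check involved, here is a minimal Python sketch (using the third-party requests and BeautifulSoup libraries) that fetches a page and lists links sitting inside elements hidden with an inline display:none style. Note it only catches inline styles, not stylesheet rules, and the URL is a placeholder:

import requests
from bs4 import BeautifulSoup

def find_hidden_links(url):
    """Report links nested inside elements hidden with inline display:none."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    hidden = soup.find_all(style=lambda s: s and "display:none" in s.replace(" ", ""))
    for tag in hidden:
        for link in tag.find_all("a", href=True):
            print("Hidden link found:", link["href"])

find_hidden_links("https://www.example.com/")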

Other techniques commonly used are sneaky redirects and cloaking. Redirects simply take a user from your website over to another website (normally not owned by you). Where this becomes dangerous is when your website or web pages are redirected to unsavoury or other hacked websites. Google may penalise you for redirecting to a malicious website.

Cloaking is when your website displays different content to the search bots than to your human visitors. As the website owner, when you view your own website you will see the content you expect to be there. But the hacker has created different pages/code that gets presented to the search bots when they view the website.

Google sees this as a blatant attempt to try and hoodwink their Googlebot and will penalise your website.

Another technique may be to add the content="noindex" attribute to a robots META tag on key pages.

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

This informs the search bots not to index the page in their SERPs. Over time, the bots revisit the pages, acknowledge the tag, and gradually remove the pages from the search results. You will gradually see a loss in rankings and traffic. Often webmasters or inexperienced SEO consultants will miss something this simple because they are looking for a bigger problem.

Of course, as I am telling you this, we know the bigger problem is that the website was hacked!

There are numerous software tools, such as Screaming Frog and Link-Assistant's Website Auditor, that can help pinpoint redirects and highlight nofollow and noindex tags.
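
If you prefer a quick scripted spot check, this Python sketch (again assuming the requests and BeautifulSoup libraries) fetches a list of your key pages and flags any carrying a robots noindex directive; the URLs are placeholders for your own pages:

import requests
from bs4 import BeautifulSoup

def check_noindex(urls):
    """Flag pages whose robots META tag contains a noindex directive."""
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        robots = soup.find_all("meta", attrs={"name": lambda n: n and n.lower() == "robots"})
        if any("noindex" in meta.get("content", "").lower() for meta in robots):
            print(f"WARNING: {url} is set to noindex!")

check_noindex(["https://www.example.com/", "https://www.example.com/services/"])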


Robots.txt File

Perhaps the simplest and quickest way to see your website fall out of the rankings as part of a negative SEO attack is for the hacker to change your robots.txt file (or create one if it doesn't exist). Changing the file to include the line Disallow: / beneath a User-agent: * line informs all the search engine bots to ignore the site and not index it in the SERPs.

You will notice a dramatic fall in traffic quickly afterwards.
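
This one is trivial to monitor yourself. A minimal Python sketch that fetches your robots.txt (the domain is a placeholder) and alerts if any rule disallows the whole site:

import requests

def robots_blocks_site(domain):
    """Return True if robots.txt contains a site-wide Disallow rule."""
    text = requests.get(f"https://{domain}/robots.txt", timeout=10).text
    return any(line.strip().lower() == "disallow: /" for line in text.splitlines())

if robots_blocks_site("www.example.com"):
    print("ALERT: robots.txt is blocking the entire site!")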


Hacking The Website

Simply hacking the website which is the target of the attack is enough to see it fall down the rankings. If Google finds a hacked website, they can apply a manual penalty, like the example below, which affects the entire website.

[Image: Google manual action applied to a hacked website]

In other cases, Google can display a message on the SERPs underneath your website listing:

[Image: Google SERPs showing Your Website May be Hacked]

This seriously puts anyone off clicking on the SERP link, reducing the click-through rate to your website. And with no visitors arriving at your website, your leads and sales will decline.


NoFollow Links Revisited

I said earlier that I would come back to rel=nofollow links. Here is a hypothetical scenario where flipping masses of nofollow links into followed links could create a negative SEO attack at will:

Situation 1 – The Sleeping Network

It may be the case that a network of websites/link farms is built up over time to create a spider's web of backlinks. Each of these links looks innocent, in that the keyword-optimised anchor text hyperlink carries a rel=nofollow attribute. To Google, the webmaster, and any SEO consultant, these links pass no PageRank and comply with Google's guidelines (as above), so they are ignored.


Situation 2 – The Hacked Network

It may well be that genuine article websites, press release websites, and blog websites – all with optimised keyword anchor text – link back to a competitor's website with rel=nofollow attributes. These may have been earned or built by the website owner or SEO agency through ethical means, and are of value in helping promote the website on the internet.

In the event that either or both of these situations saw the rel=nofollow attribute stripped out (there is no real rel=follow value – a link passes PageRank by default once nofollow is removed), this would create a mass influx of PageRank-passing links to the attacked website, which would appear unnatural to Google.

As for the website under the negative SEO attack, they are unlikely to notice that the websites previously linking back to them have suddenly become silent attackers, since the links still exist but are now passing link juice.

Under situation 2, many websites of this style are built using WordPress, which is known to be prone to hacking if not properly managed. Usually, when they are hacked, it is to push out malware, provide secret hiding places for cracked passwords or serial keys, or to promote obvious backlinks to porn websites and the like. It would not be at all obvious to the WordPress owners if their sites were hacked and all that was deleted was the word "no".
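
One way to watch for this flip: keep a list of the backlinks you know about (the page containing the link and the href pointing at your site) and periodically re-fetch each page, alerting when a link you recorded as nofollow no longer carries the attribute. A minimal Python sketch, with illustrative sample data and the requests/BeautifulSoup libraries assumed:

import requests
from bs4 import BeautifulSoup

# Backlinks previously recorded as nofollow:
# (page containing the link, href of the link pointing at your site)
KNOWN_NOFOLLOW = [
    ("https://example-blog.com/post-about-plumbing/", "https://www.yoursite.com/"),
]

def check_nofollow_flips(known_links):
    """Alert when a previously nofollow backlink starts passing PageRank."""
    for page, target in known_links:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
        for link in soup.find_all("a", href=target):
            if "nofollow" not in link.get("rel", []):
                print(f"ALERT: the link to {target} on {page} is now followed!")

check_nofollow_flips(KNOWN_NOFOLLOW)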


Don't Leave It Too Late

Proactively guarding against negative SEO attacks is the best approach to safeguard your online business.

If you need help in discovering if your website is subject to a negative SEO attack, need a strategic plan to recover, or want proactive SEO services, then please get in touch.

I would love you to join the conversation below and tell me and the other visitors your views on negative SEO. If you have first-hand experience of a negative SEO attack, please share!

About the author

David Reid

David previously worked with Europe’s No.1 Marketing Agency & the World’s No.6 biggest Integrated Marketing Agency, seeing first-hand how £150 million+ campaigns are executed to create household brands. Now he is the CEO of a specialist marketing agency helping companies of all sizes.
