The threat of negative SEO is remote but daunting. How easy is it for a competitor to ruin your rankings, and how do you protect your site? But before we start, let’s make sure we’re clear on what negative SEO is, and what it definitely isn’t.
Negative SEO is a set of activities aimed at lowering a competitor’s rankings in search results. These activities are most often off-page (e.g., building unnatural links to the site or scraping and reposting its content), but in some cases they may also involve hacking the site and modifying its content.
Negative SEO isn’t the most likely explanation for a sudden ranking drop. Before you decide someone may be deliberately hurting your rankings, factor out the more common reasons for ranking drops. You’ll find a comprehensive list here.
This kind of negative SEO targets the site without internally interfering with it. Here are the most common shapes negative off-page SEO can take.
One or two spammy links likely won’t hurt a site’s rankings. That’s why negative SEO attacks usually involve building links from a group of interconnected sites, or link farms. Typically, most of these links use the same anchor text. These exact-match anchors may be completely unrelated to the site under attack, or they may include a niche keyword to make it look as if the site owner is manipulating their own link profile.
A while ago, this happened to WP Bacon, a WordPress podcast site. Over a short period of time, the site acquired thousands of links with the anchor text “porn movie.” Over the course of 10 days, WP Bacon fell 50+ spots in Google for the majority of keywords it ranked for. This story has a happy ending though: the webmaster disavowed the spammy domains, and eventually, WP Bacon recovered most of its rankings.
How to stay safe: Preventing a negative SEO attack isn’t something in your power, but spotting the attempt early enough to reverse the damage is possible. To do that, you need to regularly monitor link profile growth. SEO SpyGlass, for example, gives you progress graphs for both the number of links in your profile, and the number of referring domains. An unusual spike in either of those graphs is reason enough to look into the links you suddenly acquired.
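The spike check behind those graphs can be sketched in a few lines of code: given daily counts of newly found backlinks, flag any day that sits far above the recent baseline. This is a generic illustration (the data and the 3-sigma threshold are assumptions, not SEO SpyGlass internals):

```python
# Flag days whose new-backlink count is far above the recent baseline.
# The sample data and the 3-sigma threshold are illustrative assumptions.
from statistics import mean, stdev

def find_spikes(daily_new_links, window=7, sigmas=3.0):
    """Return indices of days whose count exceeds
    mean + sigmas * stdev of the preceding `window` days."""
    spikes = []
    for i in range(window, len(daily_new_links)):
        baseline = daily_new_links[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if daily_new_links[i] > mu + sigmas * max(sd, 1.0):
            spikes.append(i)
    return spikes

counts = [12, 9, 14, 11, 10, 13, 12, 480, 15, 11]  # day 7: suspicious burst
print(find_spikes(counts))  # -> [7]
```

A burst of hundreds of new links in a single day against a baseline of a dozen is exactly the pattern worth investigating.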
To actually see the links that made up the spike, go to the Linking Domains (or Backlinks) dashboard in SEO SpyGlass and sort the links by Last Found Date by clicking the column header twice. Look for the links that were found around the same time the spike appeared on the graph.
If you’ve no idea where the links are coming from, it’s useful to look at their Penalty Risk. Switch to the Link penalty risk tab, select those suspicious backlinks you just discovered, and click Update Link Penalty Risk. In a few minutes, the column should be populated with values on a scale from 0 to 100. It’s a pretty accurate metric to tell if the links are coming from link farms, as, among other things, it looks at the number of linking domains that come from the same IP address or C block.
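The C-block signal that metric relies on is easy to illustrate: linking domains whose IP addresses share the same first three octets (the same /24 block) likely sit on the same network, which is typical of link farms. A minimal sketch, with made-up domains and IPs:

```python
# Group linking domains by the /24 (C-block) of their IP address.
# Domains and IPs below are fabricated for illustration.
from collections import defaultdict

def group_by_c_block(domain_ips):
    """Map each C-block to the list of linking domains hosted on it."""
    blocks = defaultdict(list)
    for domain, ip in domain_ips.items():
        c_block = ".".join(ip.split(".")[:3])  # e.g. "203.0.113"
        blocks[c_block].append(domain)
    return blocks

links = {
    "spam-one.example": "203.0.113.10",
    "spam-two.example": "203.0.113.25",
    "spam-three.example": "203.0.113.77",
    "legit.example": "198.51.100.4",
}
for block, domains in group_by_c_block(links).items():
    if len(domains) > 1:
        print(block, "->", domains)  # many domains on one C-block: link-farm smell
```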
Lastly, once you’ve identified the spammy links, you can create a disavow file right in SEO SpyGlass. To do that, right-click the backlink/linking domain and select Disavow (make sure to select Entire domain under Disavow mode). Do the same for all unnatural links you spotted. Finally, go to Preferences > Disavow/Blacklist backlinks, review your disavow file, and export it once you’re happy with it.
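For reference, the exported file follows Google’s plain-text disavow format: one entry per line, a domain: prefix to disavow an entire domain, a full URL to disavow a single page, and # for comments. The entries below are placeholders:

```
# Link-farm domains found after the spike
domain:spam-one.example
domain:spam-two.example
# A single unnatural page rather than a whole domain
http://another-site.example/spammy-page.html
```

This is the file you then upload via Google’s disavow links tool in Search Console.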
Scraping your content and copying it across other sites is another way a competitor can ruin your rankings. When Google finds content that is duplicated across multiple sites, it will usually pick only one version to rank. In most cases, Google is clever enough to identify the original piece… unless they find the “stolen” version first. That’s why scrapers often automatically copy new content and repost it straightaway.
How to stay safe: Copyscape is an essential tool if you’re determined to find instances of content duplication. If you do find scraped copies of your content, it’s a good idea to first contact the webmaster asking them to remove the piece. If that’s not effective, you may want to report the scraper using Google’s copyright infringement report.
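If you’d rather script a rough duplicate check of your own, comparing overlapping word shingles is a common technique. This is a generic sketch of the idea, not how Copyscape actually works:

```python
# Rough near-duplicate detection via word-shingle overlap (Jaccard similarity).
def shingles(text, n=5):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets (0.0 .. 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "negative seo is a set of activities aimed at lowering a competitor's rankings in search results"
scraped = "negative seo is a set of activities aimed at lowering a competitor's rankings in google"
print(similarity(original, scraped))  # high overlap suggests a scraped copy
```

A score close to 1.0 means near-verbatim copying; scores near 0.0 are normal for unrelated pages.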
There are examples of desperate site owners trying to crash a competitor’s site by forcefully crawling it and causing heavy server load. If Googlebot can’t access your site a few times in a row… you guessed it — you might get de-ranked.
How to stay safe: If you notice that your site has become slow, or, worse, unavailable, a wise thing to do is contact your hosting company or webmaster — they should be able to tell you where the load is coming from. If you know a thing or two about server logs, here are some detailed instructions on finding the villain crawlers and blocking them with robots.txt and .htaccess.
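If you do have raw access logs at hand, even a short script can surface the worst offenders. This sketch counts requests per client IP in common/combined log format (the sample lines are fabricated):

```python
# Count requests per client IP in a common/combined-format access log.
import re
from collections import Counter

# Matches the leading client IP of an access-log line
LINE_RE = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3}) ")

def top_clients(log_lines, n=3):
    """Return the n busiest client IPs as (ip, request_count) pairs."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m:
            hits[m.group(1)] += 1
    return hits.most_common(n)

sample = [
    '203.0.113.9 - - [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" 200 512',
    '203.0.113.9 - - [10/Oct/2023:13:55:37 +0000] "GET /page HTTP/1.1" 200 512',
    '198.51.100.4 - - [10/Oct/2023:13:55:38 +0000] "GET / HTTP/1.1" 200 1024',
]
print(top_clients(sample))  # the IP hammering the site tops the list
```

Once you’ve identified an abusive client, block it at the server level (e.g., with Apache’s Require not ip directive) rather than relying on robots.txt alone — hostile crawlers rarely obey robots rules.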
Negative on-page SEO attacks are way more difficult to implement. These involve hacking into your site and changing things around. Here are the main SEO threats a hacker attack can pose.
You’d think you’d notice if someone changed your content, but this tactic can also be very subtle and difficult to spot. As the attacker adds spammy content (usually links) to a site, they often hide it (e.g., under “display:none” in the HTML), so you won’t see it unless you look in the code.
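In the page source, such an injected link looks something like this (the URL is a placeholder):

```html
<!-- Invisible to visitors, but crawlers still see and count the link -->
<div style="display:none">
  <a href="http://spammy-casino.example/">best online casino</a>
</div>
```

Because the block is hidden with CSS, the page looks untouched in the browser — only a look at the HTML (or a crawl-based audit) reveals it.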
Another possible negative SEO scenario is someone modifying your pages to redirect to theirs. This isn’t a threat for most small businesses, but if your site enjoys high authority and link popularity, it could be someone’s sneaky way to increase their own site’s PageRank, or to simply redirect visitors to their site when they try to access yours. For the site under attack, such redirects aren’t just a temporary inconvenience. If Google finds out about the redirect before you do, they can penalize the site for “redirecting to a malicious website.”
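On an Apache server, such a hijack can be as small as a single line injected into .htaccess (the target domain is a placeholder), which is why that file is worth inspecting whenever redirects appear out of nowhere:

```
# Injected by an attacker: sends every visitor to their site
Redirect 301 / http://attacker-site.example/
```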
How to stay safe: Regular site audits with a tool like WebSite Auditor are the best way to spot such subtle attacks. To start your first audit, just launch WebSite Auditor and create a project for your site. Whenever you need to re-run the audit, use the Rebuild Project button. As long as you do this regularly, you should be able to spot changes that could otherwise go unnoticed, such as the number of outgoing links on the site or pages with redirects.
To look into those links or redirects in detail, switch to the All Resources dashboard and go through the External Resources section. If you see an unexpected increase in the count of these, look through the list on the right to see where those links point to, and the lower part of the screen for the pages they were found on.
A small change in robots.txt is one alteration that could wreak havoc on your entire SEO strategy. A disallow rule is all it takes to tell Google to completely ignore your website.
There are multiple examples of this online, including this story. A client fired an SEO agency he wasn’t happy with, and their revenge was adding a Disallow: / rule to the client’s robots.txt.
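The destructive rule is just two lines. If your robots.txt ever looks like this, every compliant crawler — Googlebot included — will stop crawling the site:

```
User-agent: *
Disallow: /
```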
How to stay safe: Regular ranking checks will help you be the first to know should your site get de-indexed. With Rank Tracker, you can schedule automatic checks to run daily or weekly. If your site suddenly drops from the search results, you’ll see a Dropped note in the Difference column.
When this happens across a big number of keywords, it usually implies a penalty or de-indexation. If you suspect the latter, check the crawl stats in your Google Search Console account and take a look at your robots.txt.
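You can also verify programmatically that your robots.txt doesn’t block Googlebot, using Python’s standard-library parser. For the sake of the example, the rules below are parsed from a string rather than fetched from a live site:

```python
# Check robots.txt rules against Googlebot with the stdlib parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A healthy file still lets Googlebot fetch regular pages
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

A scheduled check like this would catch a malicious Disallow: / within hours instead of weeks.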
Even if the hacker has no negative SEO in mind, the attack itself can hurt your SEO. Google wants to protect its users, which is why, if they suspect a site has been hacked, they may de-rank it, or at the very least add a “this site may be hacked” line to its search listings.
Would you click on a result like that?
How to stay safe: Negative SEO aside, stepping up your site’s security should be high on your list of priorities for obvious reasons. This topic deserves a post of its own, but you can find some great tips here and here.