It's a story we hear too often: someone hires a bad SEO, that SEO builds a bunch of spammy links, cashes the check, and then bam – penalty! Whether you got bad advice, "your friend" built those links, or you've got the guts to admit you did it yourself, undoing the damage isn't easy. If you've sincerely repented, I'd like to offer you 6 ways to recover and hopefully get back on Google's Nice list in time for the holidays.
This is a diagram of a theoretical situation that I'll use throughout the post. Here's a page that has tipped the balance and has too many bad (B) links – of course, each (B) and (G) could represent 100s or 1000s of links, and the 50/50 split is just for the visual:
Before you do anything radical (one of these solutions is last-ditch), make sure it's bad links that got you into trouble. Separating out a link-based penalty from a devaluation, technical issue, Panda "penalty", etc. isn't easy. I created a 10-minute audit a while back, but that's only the tip of the iceberg. In most cases, Google will only devalue bad links, essentially turning down the volume knob on their ability to pass link-juice. Here are some other potential culprits:
- You've got severe down-time or latency issues.
- You're blocking your site (Robots.txt, Meta Robots, etc.).
- You've set up bad canonicals or redirects.
- Your site has massive duplicate content.
- You've been hacked or hit with malware.
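As a quick illustration of the second culprit: accidental blocking is often just one stray rule, frequently left over from a staging environment. A Robots.txt file like this will keep every crawler out of the entire site:

```
# robots.txt – a leftover staging rule like this blocks ALL crawlers from ALL pages
User-agent: *
Disallow: /
```

If rankings vanished suddenly, checking for something like this takes thirty seconds and should come before any link surgery.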
Diagnosing these issues is beyond the scope of this post, but just make sure the links are the problem before you start taking a machete to your site. Let's assume you've done your homework, though, and you know you've got link problems…
In some cases, you could just wait it out. Let's say, for example, that someone launched an SQL injection attack on multiple sites, pointing 1000s of spammy links at you. In many cases, those links will be quickly removed by webmasters, and/or Google will spot the problem. If it's obvious the links aren't your fault, Google will often resolve it (if not, see #5).
Even if the links are your responsibility (whether you built them or hired someone who did), links tend to devalue over time. If the problem isn't too severe and if the penalty is algorithmic, a small percentage of bad links falling off the link graph could tip the balance back in your favor:
That's not to say that old links have no power, but just that low-value links naturally fall off the link-graph over time. For example, if someone builds a ton of spammy blog comment links to your site, those blog posts will eventually be archived and may even drop out of the index. That cuts both ways – if those links are harming you, their ability to harm will fade over time, too.
Unfortunately, you can't usually afford to wait. So, why not just remove the bad links?
Well, that's the obvious solution, but there are two major, practical issues:
(a) What if you can't?
This is the usual problem. In many cases, you won't have control over the sites in question or won't have login credentials (because your SEO didn't give them to you). You could contact the webmasters, but if you're talking about 100s of bad links, that's just not practical. The kind of site that's easy to spam isn't typically the kind of site that's going to hand-remove a link, either.
(b) Which links do you cut?
If you thought (a) was annoying, there's an even bigger problem. What if some of those bad links are actually helping you? Google penalizes links based on patterns, in most cases, and it's the behavior as a whole that got you into trouble. That doesn't mean that every spammy link is hurting you. Unfortunately, separating the bad from the merely suspicious is incredibly tough.
For the rest of this post, let's assume that you're primarily dealing with (a) – you have a pretty good idea which links are the worst offenders, but you just can't get access to remove them. Sadly, there's no way to surgically remove the link from the receiving end (this is actually a bit of an obsession of mine), but you do have a couple of options.
If the links are all (or mostly) targeted at deep, low-value pages, you could pull a disappearing act:
In most cases, you'll need to remove the page completely (and return a 404). This can neuter the links at the target. In some cases, if the penalty isn't too severe, you may be able to 301-redirect the page to another, relevant page and shake the bad links loose.
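On an Apache server, both variations can be done in a couple of lines of .htaccess (a minimal sketch using mod_alias – the page paths here are hypothetical, so substitute your own):

```
# Option 1: kill the penalized deep page outright and return a 404
Redirect 404 /spammy-landing-page.html

# Option 2: 301 the page to a relevant replacement instead
Redirect permanent /spammy-landing-page.html /relevant-page.html
```

Pick one or the other for a given URL, not both; the first rule that matches wins.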
If all of your bad links are hitting a deep page, count yourself lucky. In most cases, the majority of bad links are targeted at a site’s home-page (like the majority of any links), so the situation gets a bit uglier.
In some sense, this is the active version of #2. Instead of waiting for bad links to fade, build up more good links to tip the balance back in your favor:
By "good", I mean relevant, high-authority links – if your link profile is borderline, focus on quality over quantity for a while. Rand has a great post on link valuation that I highly recommend – it's not nearly as simple as we sometimes try to make it.
This approach is for cases where you may be on the border of a penalty or the penalty isn't very severe. Fair warning: it will take time. If you can't afford that time, have been hit hard, or suspect a manual penalty, you may have to resort to one of the next two options…
If you've done your best to address the bad links, but either hit a wall or don't see your rankings improve, you may have to appeal to Google directly. Specifically, this means filing a reconsideration request through Google Webmaster Tools. Rhea at Outspoken had an excellent post recently on how to file for reconsideration, but a couple of key points:
- Be honest, specific and detailed.
- Show that you've made an effort.
- Act like you mean it (better yet: mean it).
If Google determines that your situation is relevant for reconsideration (a process which is probably semi-automated), then it's going to fall into the hands of a Google employee. They have to review 1000s of these requests, so if you rant, provide no details, or don't do your homework, they'll toss your request and move on. No matter how wronged you may feel, suck it up and play nice.
If all else fails, and you've really burned your home to the ground and salted the earth around it, you may have to move:
Of course, you could just buy a new domain, move the site, and start over, but then you'll lose all of your inbound links and off-page ranking factors, at least until you can rebuild some of them. The other option is to 301-redirect to a new domain. It's not risk-free, but in many cases a site-to-site redirect does seem to neuter bad links. Of course, it will very likely also devalue some of your good links.
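A site-to-site 301 on Apache is a short mod_rewrite block (a sketch only – the domain names are hypothetical placeholders):

```
# .htaccess on the old domain: 301 every URL to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?penalized-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.fresh-start-site.com/$1 [R=301,L]
```

Redirecting path-to-path (rather than dumping everything on the new home-page) preserves as much of the good link equity as possible.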
I'd recommend the 301-redirect if the bad links are old and spammy. In other words, if you engaged in low-value tactics in the past but have moved on, a 301 to a new domain may very well lift the penalty. If you've got a ton of paid links or you've obviously built an active link farm (that's still in play), you may find the penalty comes back and all your efforts were pointless.
I'd like to end this by making a suggestion to Google. Sometimes, people inherit a bad situation (like a former SEO's black-hat tactics) or are targeted with bad links maliciously. Currently, there is no mechanism to remove a link from the target side. If you point a link at me, I can't say: "No, I don't want it." Search engines understand this and adjust for it to a point, but I really believe that there should be an equivalent of nofollow for the receiving end of a link.
Of course, a link-based attribute is impossible from the receiving end, and a page-based directive (like Meta Robots) is probably impractical. My proposal is to create a new Robots.txt directive called "Disconnect". I imagine it looking something like this:
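To be clear, no search engine supports this – it's purely a proposal, and the exact syntax is just my sketch. In your Robots.txt, it could look like:

```
User-agent: *
Disconnect: www.badsite.com
```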
Essentially, this would tell search engines to block any links to the target site coming from "www.badsite.com" and not consider them as part of the link-graph. I'd also recommend a wild-card version to cover all sub-domains:
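Again a hypothetical sketch, the wild-card form would simply swap in an asterisk for the sub-domain:

```
User-agent: *
Disconnect: *.badsite.com
```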
Is this computationally possible, given the way Google and Bing process the link-graph? I honestly don't know. I believe, though, that the Robots.txt level would probably be the easiest to implement and would cover most cases I've encountered.
While I recognize that Google and Bing treat bad links with wide latitude and recognize that site owners can't fully control incoming links, I've seen too many cases at this point of people who have been harmed by links they don't have control over (sometimes, through no fault of their own). If links are going to continue to be the primary currency of ranking (and that is debatable), then I think it's time the search engines gave us a way to cut links from both ends.
Update (December 15th)
From the comments, I wanted to clarify a couple of things regarding the "Disconnect" directive. First off, this is NOT an existing Robots.txt option. This is just my suggestion (apparently, a few people got the wrong idea). Second, I really did intend this as more of a platform for discussion. I don't believe Google or Bing are likely to support the change.
One common argument in the comments was that adding a “Disconnect” option would allow black-hats to game the system by placing risky links, knowing they could be easily cut. While this is a good point, theoretically, I don’t think it’s a big practical concern. The reality is that black-hats can already do this. It’s easy to create paid links, link farms, etc. that you control, and then cut them if you run into trouble. Some SEO firms have even built up spammy links to get a short-term boost, and then cut them before Google catches on (I think that was part of the JC Penney scheme, actually).
Almost by definition, the “Disconnect” directive (or any similar tool) would be more for people who can’t control the links. In some cases, these may be malicious links, but most of the time, it would be links that other people created on their behalf that they no longer have control over.
Article source: http://www.seomoz.org/blog/6-ways-to-recover-from-bad-links