Tiered linkbuilding works. In short: there are links to your website (either self-created or organic), and using automated means you build links to these links to increase their power, hedge risk to your own domain, and rank. It's easy to do, and very scalable (and Google hates it). Unfortunately it tends not to be too healthy for the domains linking to you. The domains you are using to launder your links take the brunt of the risk, without their knowledge. Last year I produced some recommendations for a Web 2.0 property being used as a Tier 1 site in tiered linkbuilding.
Many people would be happy with this level of incoming links (~30M linking pages). Unfortunately, the majority were pointing at user-generated subdomains, and coming from the sort of sites found on auto-approve lists for Scrapebox. The domain was part of way too many fiverr gigs in the SEO section. Skirting around the NDA, we'll pretend it was Squidoo.
As a side note: the domain wasn't passing much (if any) value to the spammers, since the user-generated pages were all set to meta robots "noindex,nofollow". But "noindex,nofollow" isn't enough. The problem is that most automated methods (and the squishy things directing them) don't bother checking for meta directives, and report that all is well so long as the <a> tag points to the right destination. So, while your directive means they get no value from your domain, it doesn't actually stop them from producing pages of spun content and blasting comment and wiki links at them for hours.
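To illustrate with a simplified sketch (the destination and anchor text are placeholders standing in for the spammer's money site): the page-level directive sits in the <head>, but the plain <a> tag in the body is all most of these tools bother to look at.

    <!-- In the <head> of the user-generated page: the directive Google respects -->
    <meta name="robots" content="noindex,nofollow">

    <!-- In the body: the only thing the average automated tool checks for -->
    <a href="http://spammers-money-site.example/">buy cheap widgets</a>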
One part many people seem to forget when discussing Negative SEO is that even when the links themselves are ineffective, it still hampers your SEO efforts, because it adds noise to any analysis. For this reason alone, getting people to stop pointing tons of awful links at your website is a good thing.
Solutions to Consider
Stop people linking externally:
I wouldn’t even float this idea unless your users would have absolutely no reason to want hyperlinks. I think it’s a great way to kill a user-generated section as quickly as possible, but it will work to stop people building tiered links (because by definition, they can’t).
Explicit nofollow for all user-generated external links:
This is better than a meta "noindex,nofollow" for a few reasons. Firstly, more linkbuilding tools pick it up, which means fewer people will direct resources towards building links to pages on your site. It's also less likely to find its way into fiverr gigs.
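In markup terms it's a one-attribute change (destination and anchor text are placeholders again), but it lives on the link element itself, which is where the tools actually look:

    <a href="http://spammers-money-site.example/" rel="nofollow">buy cheap widgets</a>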
Unfortunately, there is the view that you want a mixture of followed and nofollowed links in order to appear "natural". As a result, an explicit nofollow on the link element will discourage most, but not all, of the people who might direct a high volume of unsavory links at your domain.
Break their Spirits: Interstitial holding pages on a blocked subdomain for all external links:
This may unfortunately be the best technical solution for discouraging linkspam to user-generated pages. Similar to how some people handle their affiliate links, user-generated external links are automatically rewritten to point at a dynamic page on a dedicated subdomain (blocked in robots.txt), where the user is redirected through a Google-unfriendly method (e.g. a meta refresh).
I link to ohgm.co.uk from my Squidoo page "http://squidoo.com/ohgm". The link is rewritten to http://external.squidoo.com/?https://ohgm.co.uk, with an explicit rel="nofollow" in the source code. That interstitial page uses a meta refresh to deliver me to ohgm.co.uk after a short delay. External.Squidoo.com is blocked in robots.txt, and its pages are set to "noindex,nofollow".
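Sketched out, that looks something like the following (the markup is simplified, and the delay value and anchor text are illustrative):

    <!-- On http://squidoo.com/ohgm: the user-submitted link, rewritten and nofollowed -->
    <a href="http://external.squidoo.com/?https://ohgm.co.uk" rel="nofollow">ohgm</a>

    <!-- In the <head> of the interstitial page at http://external.squidoo.com/?https://ohgm.co.uk -->
    <meta name="robots" content="noindex,nofollow">
    <meta http-equiv="refresh" content="3;url=https://ohgm.co.uk">

    # robots.txt on external.squidoo.com
    User-agent: *
    Disallow: /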
If I click on the link, I end up on ohgm.co.uk. If I properly check whether my link to ohgm.co.uk is still live, I'll find that it isn't. If the tool I use to do this just looks for the string "https://ohgm.co.uk" in the source code, hopefully it's at least going to notice the nofollow attribute. If I'm thinking about doing some tiered linkbuilding to ohgm.co.uk, I won't be using that Squidoo page to do it, since there isn't actually a link to ohgm.co.uk on it.