
Fun With Unnatural Outbound Links Penalties

On March 20th, 2014 (hint), I received the following message for one of my sites: Unnatural outbound links. Excellent. To be clear, this domain does have outbound links that were probably placed with the intent of manipulating PageRank. It's irredeemably bad, a running joke. Google did the right thing penalising it. I've been waiting for this. Why? Well, I've never seen an outbound link penalty before, and I'd like to see if I can shift the penalty using a technique that probably shouldn't work. My new robots.txt: "User-agent: *" followed by "Disallow: *". This took a few minutes to implement. I think it's within the letter of the law, but not quite the spirit - I'm skipping redirecting external links to blocked intermediate pages and am just blocking them for all user agents at source. As Googlebot [...]
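A side note on that robots.txt: "Disallow: *" leans on Google's wildcard extension, which isn't part of the original Robots Exclusion Protocol; the standard way to block every path is "Disallow: /". A minimal sketch with Python's standard-library parser (which follows the original protocol, no wildcards) shows the standard form doing the blocking - the URL here is just a placeholder:

```python
from urllib import robotparser

# The standard block-everything directive is "Disallow: /".
# "Disallow: *" (as in the post) relies on Google's wildcard extension
# and is not guaranteed to work with spec-following parsers.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Any path is now disallowed for any user agent.
print(rp.can_fetch("Googlebot", "https://example.com/page-with-paid-links"))
# prints False
```

Googlebot itself honours both forms, which is presumably why the author's version worked as intended.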

Polls for SEOs

We don't test as much as we think we should, but we hold opinions pretty strongly all the same. I'm not interested so much in the truth of these questions as I am in any consensus that might emerge in the answers. I plan to keep adding to this post whenever I think of more questions I find interesting - drop me a line if you have any suggestions and I'll put them up... The polls so far cover: Preserving Link Equity, rel="nofollow", Anchor Text, and Content Quality. Those are the opinions of your contemporaries. Let them know how wrong they are below. [...]

Ecommerce Linkbuilding Through Product Mentions

This method works best for anyone who sells uniquely named products. It can also work for drop shippers, and for affiliates with decent deals. What we aim to do with this method is fairly simple. In an ideal world, we would be able to find pages where people mention having purchased, from your client, a product your client sells. As it's not an ideal world, we're going to settle for "web pages that mention both a product your client sells and your client". Method: Get a complete list of products your client sells. You should be able to get this either from your client or by crawling the site with something like Screaming Frog. Once you have these product names, fire up Scrapebox. Input [client name variations] as your footprint, and [product names] as your keywords to scrape, limiting the scope [...]
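The footprint + keyword setup above is just a cross product of the two lists: one search query per (client variation, product name) pair. A small sketch, using made-up client and product names purely for illustration:

```python
from itertools import product

# Hypothetical inputs - in practice the client variations come from you,
# and the product names come from the client's feed or a site crawl.
client_variations = ['"Acme Widgets"', '"acmewidgets.com"']
product_names = ['"Widget Deluxe 3000"', '"Widget Mini"']

# One search query per (footprint, keyword) pair, mirroring what
# Scrapebox does with a footprint and a keyword list.
queries = [
    f"{client} {item}"
    for client, item in product(client_variations, product_names)
]

for query in queries:
    print(query)
# e.g. "Acme Widgets" "Widget Deluxe 3000"
```

Two client variations against a few hundred product names produces a query list that grows multiplicatively, which is exactly why handing it to a scraper rather than searching by hand makes sense.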

Preventing Tiered Link Spam

Tiered linkbuilding works. In short - there are links to your website (either self-created or organic), and using automated means you build links to these links to increase their power, hedge risk to your own domain, and rank. It's easy to do, and very scalable (and Google hates it). Unfortunately it tends not to be too healthy for the domains linking to you. The domains you are using to launder your links take the brunt of the risk, without their knowledge. Last year I produced some recommendations for a Web 2.0 property being used as a Tier 1 site in tiered linkbuilding. Many people would be happy with this level of incoming links (~30M linking pages). Unfortunately the majority were pointing at user-generated subdomains, and coming from the sort of sites found in auto-approve lists for Scrapebox. [...]

Link Audits with Rank Cracker

Matthew Woodward recently released his free tool "Rank Cracker". Instead of using the software for its intended purpose of making it easier to replicate competitor link profiles, I'm suggesting you consider trying it for link audits. Let's say you're doing a link audit for someone who's been hit for less-than-clean link building. Automated tools are frowned upon. Rank Cracker is supposed to identify links that can be built with automated tools. This is good. How-to: First, lie and say you own all the software mentioned. Then, import all your links and let it run. It can take a while, depending on the size of your list. Results should look a little like this, with expandable sections grouped by suspected tool. If you're doing removals, it handily collates email and [...]
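The grouping step Rank Cracker performs - bucketing backlinks by which automated tool probably produced them - can be sketched as URL-pattern matching. Rank Cracker's actual signature list isn't public, so the footprint regexes and the URLs below are illustrative stand-ins, not the tool's real logic:

```python
import re
from collections import defaultdict

# Hypothetical footprints - illustrative stand-ins for the URL patterns
# that common automated linkbuilding tools tend to leave behind.
TOOL_FOOTPRINTS = {
    "bookmarking": re.compile(r"/(story|bookmark)/", re.I),
    "article-directory": re.compile(r"articles?|ezine", re.I),
    "web-2.0": re.compile(r"\.(wordpress|blogspot|tumblr)\.com", re.I),
}

def group_by_tool(backlinks):
    """Bucket backlink URLs under the first footprint they match."""
    groups = defaultdict(list)
    for url in backlinks:
        for tool, pattern in TOOL_FOOTPRINTS.items():
            if pattern.search(url):
                groups[tool].append(url)
                break
        else:
            # No footprint matched - probably needs a manual look.
            groups["unmatched"].append(url)
    return dict(groups)

links = [
    "http://example-bookmarks.com/story/cheap-widgets",
    "http://seospam.wordpress.com/2013/01/widgets",
    "http://example.org/legit-mention",
]
print(group_by_tool(links))
```

The useful property for an audit is the same one the post points out: anything that lands in a tool bucket is a strong candidate for the disavow/removal pile, while the unmatched remainder is where the manual review time should go.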