I’m a big fan of content:
I’ll do a full writeup soon – I’ll be at #mozcon if anyone wants to talk to me about this post in person.
This is a case I encountered recently, and struggled with for a while. To keep things brief: I was working with a fairly baroque faceted navigation that had attracted a substantial number of external links to category URLs containing tracking parameters.
The canonical tag was holding everything together link-equity-wise, but the crawl inefficiency was staggeringly bad. While we could use robots.txt directives, this would likely kill the site's organic performance: no crawl means no canonical, and no value passed from those external links.
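For context, the pattern holding things together looked something like this (URLs hypothetical, not from the site in question): each parameterised category URL pointed at its clean equivalent via rel=canonical.

```
<!-- Hypothetical sketch. Served on a URL like:
     https://example.com/category/widgets/?utm_source=partner -->
<link rel="canonical" href="https://example.com/category/widgets/" />
```

Blocking those parameterised URLs in robots.txt would stop Googlebot ever seeing that tag, which is the problem described above.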
One final-resort option (and I mean final) is Google's Webmaster Tools parameter configuration. You make Googlebot explicitly aware of what each parameter does (or does not do), so that it can crawl your domain more efficiently. As these were tracking parameters, the [...]
On March 20th, 2014 (hint), I received the following message for one of my sites:
Unnatural outbound links. Excellent. To be clear, this domain does have outbound links that were probably placed with the intent of manipulating PageRank. It's irredeemably bad, a running joke. Google did the right thing penalising it. I've been waiting for this.
Well, I've never seen an outbound link penalty before, and I'd like to see if I can shift the penalty using a technique that probably shouldn't work.
My New Robots.txt
This took a few minutes to implement. I think it's to the letter of the law, but not quite the spirit - I'm skipping redirecting external links to blocked intermediate pages and am just blocking them for all useragents at source. As Googlebot [...]
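One reading of the above, as a sketch (the path is hypothetical; assume the site already routes its outbound links through an intermediate directory):

```
# Hypothetical robots.txt sketch: disallow the outbound-link
# directory for every user agent, not just Googlebot.
User-agent: *
Disallow: /out/
```

Blocking for all user agents "at source" is what I mean by letter rather than spirit: no crawler is singled out, but the effect on Googlebot is the same.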
We don't test as much as we think we should, but we hold opinions pretty strongly all the same. I'm not so much interested in the truth of these questions as in any consensus that might arise in the answers. I plan to keep adding to this post whenever I think of more questions I find interesting - drop me a line if you have any suggestions and I'll put them up...
Preserving Link Equity
Those are the opinions of your contemporaries. Let them know how wrong they are below. [...]
This method works best for anyone who sells uniquely named products. It can also work for drop shippers, and for affiliates with decent deals. What we aim to do with this method is fairly simple. In an ideal world, we would be able to find people talking about having purchased a product your client sells from your client. As it's not an ideal world, we're going to settle for "web pages that have mentioned both a product your client sells and your client".
Get a complete list of products your client sells. You should be able to get this either from your client directly or by crawling the site with something like Screaming Frog.
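If you go the crawl route, cleaning the export down to a product list can be sketched like this (the file name and column header are assumptions for illustration, not Screaming Frog's actual export format):

```python
# Hypothetical sketch: pull unique product names out of a crawl
# export CSV. Column name "Title 1" is an assumption.
import csv


def product_names(path, column="Title 1"):
    """Return a de-duplicated, order-preserving list of product names."""
    seen, names = set(), []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            name = row.get(column, "").strip()
            if name and name not in seen:
                seen.add(name)
                names.append(name)
    return names
```

The de-duplication matters because a crawl will usually surface the same product on several URLs (pagination, facets, and so on).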
Once you have these product names, fire up Scrapebox. Input [client name variations] as your footprint, and [product names] as your keywords to scrape, limiting the scope [...]
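The footprint-plus-keyword combination Scrapebox performs here can be sketched as follows (function and example names are mine, for illustration): one search query per pair of client-name variation and product name.

```python
# Hypothetical sketch of the footprint x keyword combination:
# emit one exact-match search query per (footprint, keyword) pair.
from itertools import product


def build_queries(footprints, keywords):
    """Combine client-name variations with product names into queries."""
    return [f'"{fp}" "{kw}"' for fp, kw in product(footprints, keywords)]
```

With two name variations and five hundred products you get a thousand queries, which is why you want a tool doing the scraping rather than doing it by hand.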