Your website is only as good as Google’s picture of it. So, if you’re working on a website that’s under penalty and actively trying to lift it (or just trying to preempt future updates), you should do everything you can to make sure Google’s view of the link profile is up to date, so that its picture reflects reality.
I’ve used the following method for just over two years, but I haven’t seen it get any serious coverage (though I’m sure it’s quite widely used). In short: get Googlebot to crawl the links that you’ve removed or disavowed.
Penalties and Disavow
The problem is that terrible links aren’t crawled as often as you’d think. Google have been hinting at this in most Webmaster communications since the disavow tool was introduced:
“For a disavowed link to be ‘counted’ it must be crawled.”
I’ve seen multiple reconsideration requests fail and then succeed on mere resubmission. The best explanation I can find is that, between the first and second submissions, enough of the disavowed links were crawled.
Along the same lines, I’ve never seen a disavowed link that was first put through this method show up as an example in a rejected reconsideration request.
Link Removal
Links you have successfully removed only count once Google has revisited them since the removal. This seems obvious, but it’s worth hammering in: until Googlebot has recrawled a page, the link on it might as well still exist. If you’re trying to tighten up the link profile, you should encourage Googlebot to recrawl the pages where those links used to be, so it can update its picture of your site accordingly.
How to Do This:
Right now I can recommend Linklicious, a ‘link indexing service’. Ignore the rhetoric (it’s not aimed at you) and grab the basic plan, a $17/month recurring expense. In my experience this is more than enough for the method outlined in this article, even for a large agency (it also has an API).
Use this link for a staggering $0 off – Linklicious.
Linklicious tracks Googlebot requests to URLs it controls. These URLs then redirect to the ones you submit, which is about as much assurance of a crawl as it’s possible for a tool to give you. Linklicious resubmits these URLs to Google and other services until they are crawled.
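To make the mechanism concrete, here’s a minimal sketch of the general idea (not Linklicious’s actual code, and the slug-to-target mapping is made up) showing why a redirect URL the service controls can confirm a crawl:

```python
# Minimal sketch of a crawl-tracking redirect (illustrative only).
# The tracking URL sits on a domain you control; Googlebot hits it,
# gets logged, and is redirected on to the junk URL you submitted.
from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Hypothetical mapping of tracking slugs to submitted URLs.
TARGETS = {
    "abc123": "http://spammy-directory.example/removed-link-page.html",
}

@app.route("/t/<slug>")
def track(slug):
    target = TARGETS.get(slug)
    if target is None:
        abort(404)
    if "Googlebot" in request.headers.get("User-Agent", ""):
        # A real service would also verify the IP via reverse DNS.
        print(f"Googlebot requested {target}")
    return redirect(target, code=301)
```

Because the request passes through a URL the service owns, it sees Googlebot arrive before handing it on to the target – that’s the assurance of crawl.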
You’ll want to submit:
Each URL featuring a link you have successfully removed.
At least one live (this is important) URL from each subdomain you have disavowed. You only need one for this to work.
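If you’re assembling that list by hand from a removed-links file and your disavow file, a rough sketch like this can save some time – the file names and the naive ‘is it live’ check are my assumptions, not part of any tool:

```python
# Rough sketch: build the submission list from two assumed inputs:
# removed_links.txt (URLs where links were removed, one per line) and
# disavow.txt (your disavow file; only 'domain:' lines are handled here).
import requests

def disavowed_domains(path):
    """Pull domains out of a disavow file (lines like 'domain:example.com')."""
    with open(path) as f:
        return [line.strip().split(":", 1)[1]
                for line in f if line.strip().startswith("domain:")]

def live_url(domain):
    """Return the domain's homepage if it responds, else None (naive check)."""
    url = f"http://{domain}/"
    try:
        r = requests.head(url, timeout=10, allow_redirects=True)
        return url if r.status_code < 400 else None
    except requests.RequestException:
        return None

with open("removed_links.txt") as f:
    to_submit = [line.strip() for line in f if line.strip()]

# One live URL per disavowed domain; dead URLs can't be crawled,
# which is why the 'live' part matters.
for domain in disavowed_domains("disavow.txt"):
    url = live_url(domain)
    if url:
        to_submit.append(url)

print("\n".join(to_submit))
```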
Risks
Junk in, Junk out.
Some people will have reservations about this method. If you’ve been overzealous in putting together a disavow, or you’ve removed the wrong links, then you’ll see the wrong kind of results all too quickly. But if you’re confident in your work, there’s no reason to worry about letting Googlebot crawl these junk URLs faster than it would naturally.
Alternatives:
There are alternative ways of doing this yourself, especially if you can put your own scripts together. Unfortunately, you won’t be able to tell when Googlebot has actually requested the resources unless you build something a little more robust, so I’d recommend going with a link indexation service that tracks crawls. I’ve had some luck recently with Link Centaur, which is $9/month for 15,000 links per day.
Any questions? Leave a comment – this method was something of a lightbulb idea for me so I hope it’ll make your life easier.
What about just creating an HTML page of the links? Host it anywhere that’s not your site – blogspot, tumblr, anything. Doesn’t matter if they’re nofollow, just need to get them on Google’s radar.
Definitely, as long as it’s a host that gets crawled enough – you can use sitemaps or rss feeds for this purpose and ‘notify’ the various services.
It’s less instant but I’d be pretty confident over a long weekend so long as the list was short enough. Might need to break it up otherwise.
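If you do go the HTML-page route, a throwaway sketch like this would generate the page – the URLs and file name are placeholders:

```python
# Quick sketch: dump the removed/disavowed URLs into a bare HTML page
# to paste onto a host that gets crawled. URLs and file name are placeholders.
urls = [
    "http://spammy-directory.example/removed-link-page.html",
    "http://another-junk-domain.example/",
]

links = "\n".join(f'<a href="{u}" rel="nofollow">{u}</a><br>' for u in urls)

with open("links.html", "w") as f:
    f.write(f"<!DOCTYPE html>\n<html><body>\n{links}\n</body></html>\n")
```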