This post is the spiritual successor to How to Fetch and Render (almost) Any Site, a blog post I did not write but wish I had.
The new version of the URL Inspection tool is very useful for diagnosing Googlebot’s rendering peculiarities:
Crucially, it includes the result of JS execution:
However, if you don’t have access to a Search Console property, you cannot inspect the URL:
I’ll keep this brief:
You can bypass this restriction with cross-domain redirects.
Paying homage to the progenitor:
This is the Search Console account for ohgm.co.uk, accessing data that’s intentionally limited to the screamingfrog.co.uk Search Console account. Here’s another:
The method is simple:
- Have a Search Console account for a website you control.
- On that website, set up a redirect to the desired URL on the property you don’t control.
- Inspect the redirected URL in Search Console.
- Test Live URL.
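The redirect step above can be sketched with nothing but the standard library. This is a minimal illustration, not a deployment recommendation: `TARGET` and the port are placeholders, and in practice a one-line `.htaccess` or nginx rule on your own domain does the same job.

```python
# Minimal sketch: a 302 redirect on a property you control, pointing
# at the third-party URL you want Googlebot to fetch and render.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://example.com/page-you-cannot-inspect"  # placeholder

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Googlebot follows this redirect during "Test Live URL", so
        # the destination gets fetched and rendered via your property.
        self.send_response(302)
        self.send_header("Location", TARGET)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To run it: HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

You would then inspect the redirecting path in your own Search Console property and hit Test Live URL. The same handler could map distinct paths to different targets if you want to retest several URLs.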
You don’t get to choose the User-Agent as you could with Fetch and Render, but the old Fetch and Render feature never allowed redirects to be rendered anyway:
URL Inspection uses the User-Agent that indexes the website, and in most cases this will be Googlebot Smartphone. The important takeaway is that we can render what we want using genuine Googlebot.
How To View More Than a Preview
The Rendered Source provided can be copied and pasted out (unlike in some other Google tools), so even though the preview only provides a limited above-the-fold snapshot, you can still make use of the full render. To do this:
- Copy the HTML from the tested page.
- Inspect Element in Chrome.
- Edit as HTML.
- Overwrite it by pasting in what Googlebot has rendered.
- Hit enter to view the full rendered webpage.
I’m doing this with Chrome 41 for that musty-smelling authenticity (and to head off some comments). This technique doesn’t just apply to other people’s websites; it can be used to note the rendering differences on your own (especially below the fold). It can be really useful.
If the destination is blocked in robots.txt, this won’t work. There are probably other restrictions (especially if they decide to fix this), but I’m too excited by the possibilities to investigate them now.
Thankfully there don’t seem to be limits on the frequency with which URLs can be tested and retested (yet), so this can be automated. Perhaps more worryingly:
This can be used to submit other people’s content for indexing
Nothing exploitable there.
This technique is potentially useful for sales, as it can give you information you shouldn’t yet have access to (e.g. from the User-Agent you can see whether or not a site is under mobile-first indexing). It’s a great way to quickly see if there are any blocked resource problems and to get started while waiting to be granted Search Console access.
Since it’s genuine Googlebot making the request from a genuine Googlebot IP address, you can use it to bypass most blocking. I can confirm this bypasses paywalls which allow Googlebot to crawl (e.g. most news websites).
- If you suspect a competitor is cloaking, this will show you.
- If you want to check your cloak is working properly, this will show you.
I’m sure you’ll have some ideas for how to use this. Please share any devilish ones in the comments.
This only has slim advantages over using the Mobile-Friendly Test, but I’ll take whatever I can get.