
Private Indexing Service


🕯️RIP Robots.txt Noindex🕯️


This is a tip from my talk at the last ohgmcon. It lets you roll your own private indexing service on a domain you don’t care about, so you can use the Indexing API for any page on the internet without getting burned.

Method:

1. Register a new domain

2. Register this new domain for the Indexing API.

3. Implement an open redirect on the domain:

<?php
// Deliberately an open redirect: 301 to whatever URL arrives in the
// really_secure_indexer parameter. No validation whatsoever.
$url_to_redirect = $_GET['really_secure_indexer'] ?? '/';
header('Location: ' . $url_to_redirect, true, 301);
die();
?>

I am not good at computer.
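
If you want a quick sanity check that the redirect behaves before involving Google, something like this works (a sketch; the domain is the illustrative one from the next step, so swap in your own):

<?php
// Sketch: confirm the redirect script answers with a 301.
$check = 'https://please-do-not-do-this-ohgm.co.uk/this-is-secure.php'
       . '?really_secure_indexer=' . urlencode('https://ohgm.co.uk/new-post');
$headers = get_headers($check);
echo $headers[0], PHP_EOL; // expect "HTTP/1.1 301 Moved Permanently"
?>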

4. Generate a redirect URL for each URL you would like to submit to the Indexing API, e.g. https://please-do-not-do-this-ohgm.co.uk/this-is-secure.php?really_secure_indexer=https://ohgm.co.uk/new-post
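
If you have more than a handful of target URLs, building these programmatically is less painful. A minimal sketch, reusing the domain and parameter name from the snippet above (the target URLs are placeholders):

<?php
// Sketch: wrap each target URL in a redirect URL on the burner domain.
$targets = [
    'https://ohgm.co.uk/new-post',
    'https://ohgm.co.uk/another-post',
];

$redirect_urls = array_map(function ($url) {
    return 'https://please-do-not-do-this-ohgm.co.uk/this-is-secure.php'
         . '?really_secure_indexer=' . urlencode($url);
}, $targets);

print_r($redirect_urls);
?>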

5. Submit these in bulk to the Indexing API. Googlebot will usually come to visit within a few seconds. If you are blessed with great content, the page will be indexed shortly.
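
The submission itself is one POST per URL to the Indexing API’s urlNotifications:publish endpoint. A rough sketch, assuming $access_token already holds an OAuth 2.0 token for a service account with the https://www.googleapis.com/auth/indexing scope, and $redirect_urls is the array from the previous step:

<?php
// Sketch: notify the Indexing API about each redirect URL in turn.
$endpoint = 'https://indexing.googleapis.com/v3/urlNotifications:publish';

foreach ($redirect_urls as $url) {
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $access_token,
        ],
        CURLOPT_POSTFIELDS     => json_encode([
            'url'  => $url,
            'type' => 'URL_UPDATED',
        ]),
    ]);
    echo curl_exec($ch), PHP_EOL;
    curl_close($ch);
}
?>

Google also documents a batch endpoint that takes up to 100 notifications per request, if one call per URL is too slow.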

Benefits:

The Indexing API honours cross-domain redirects, so the URL gets into the indexing queue sooner than it otherwise would.

This can be used to encourage indexing of any URL on the internet, regardless of who owns it.

Because any domain can redirect to any domain, this method gives things a level of plausible deniability (since you could equally use it for negative SEO).

The worst thing that can happen is that the domain you don’t care about loses access to the Indexing API. If so, roll up another. You don’t risk your own website.

The advantage here (I think) is that the Indexing API applies a very high priority modifier to the individual URL compared to things like XML sitemap submission or HTML sitemaps, AND this setup is disposable and replicable. You can still do those things too.

This can be valuable for things like:

  • Ensuring external links to your website are counted as soon as possible.
  • Getting new/updated content indexed faster.
  • Speed: submission in bulk vs one-at-a-time URL inspections and index requests in the Search Console interface.
  • Getting disavowed links crawled so you can submit a reconsideration request sooner, and get the penalty removed faster.
  • Getting a competitor’s misfortune picked up faster.

I also believe this is more efficient than paid indexing services, but there is zero chance of me testing this. I imagine some services are using this technique already.

Top Tips

Open redirects are a really bad idea. Sorry.

Have fun, and don’t bend or break any guidelines: make sure the destinations are either job postings or livestream events. Please.

( ° ͜ʖ °)



