Robots.txt cheat sheet

I’ve recently created my very own robots.txt cheat sheet. Because who doesn’t need one in their SEO toolbox, right? After all, I like to be prepared. Who knows what the next log analysis audit might reveal? Some weird URLs that need removing from the index after a website migration? A client emailing you about some pages that shouldn’t appear in Search? No problem. I’m ready.

The Google Sheet includes an extensive list of user-agents, allow/disallow directive examples and sitemap rules.
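To give a flavour of what’s inside, here’s a small, hypothetical example combining those three pieces (the paths and sitemap URL are placeholders, not from the sheet itself):

```txt
# Rules for Google's main crawler
User-agent: Googlebot
Disallow: /staging/
Allow: /staging/public-page.html

# Rules for all other crawlers
User-agent: *
Disallow: /admin/

# Sitemap location (must be an absolute URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that more specific user-agent groups (like `Googlebot` above) replace the `*` group for that crawler rather than adding to it, which is a common gotcha the cheat sheet covers.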

And, because I like to give resources away to my budding SEOs, you can find the Google Sheet link below. All ready for you to copy and do with as you wish.

Make a copy of the Google Sheet document by selecting File > Make a Copy.