
Robots.txt Generator

Generate a valid robots.txt file in seconds. Choose a preset or build custom rules for any platform.

robots.txt
# robots.txt generated by SerpNap Robots.txt Generator
# Generated at serpnap.com/tools/robots-txt-generator

User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Disallow: /*.json$
Allow: /

Sitemap: https://example.com/sitemap.xml

Upload this file to the root of your website (e.g., https://example.com/robots.txt).

Google typically picks up changes within 24-48 hours. You can also submit it via Google Search Console → Settings → robots.txt.

What is a robots.txt file?

Purpose

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It's not a security mechanism — it's a crawl management tool that helps bots use your crawl budget efficiently.
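You can check what a given set of rules means for a crawler using Python's standard-library parser. A minimal sketch, using the rules generated above (note: `urllib.robotparser` handles prefix rules only; wildcard patterns like `/*.json$` are a de-facto extension it does not interpret, so that line is omitted here):

```python
from urllib.robotparser import RobotFileParser

# Feed the generated rules to the parser and check which URLs
# a "*" (any) crawler is allowed to fetch.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

The same parser can also fetch a live file with `parser.set_url(...)` followed by `parser.read()`, which is handy for verifying your deployed robots.txt behaves as intended.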

Where to place it

Always place robots.txt at the root of your domain: https://example.com/robots.txt. Subdomain robots.txt files only apply to that subdomain.

Common mistakes

Don't use robots.txt to hide sensitive pages: search engines can still index a blocked URL if other sites link to it. Use a noindex meta tag or an X-Robots-Tag HTTP header instead, and remember that Google must be able to crawl a page to see its noindex directive, so don't block that page in robots.txt at the same time. Also avoid blocking the CSS and JavaScript files Google needs to render your pages.
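For reference, the noindex alternatives mentioned above look like this (illustrative snippets, not generator output):

```html
<!-- In the page's <head>: ask search engines not to index this page -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, send the equivalent HTTP response header instead: `X-Robots-Tag: noindex`.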

AI crawlers in 2026

AI crawlers such as OpenAI's GPTBot and Anthropic's ClaudeBot honor robots.txt, and Google-Extended is a robots.txt token that controls whether Google may use your content for AI training. Use the "Block AI Crawlers" preset above if you want to prevent your content from being used for AI training while still allowing search indexing.
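As a sketch, a preset like that could emit rules along these lines (the exact output of the generator may differ):

```
# Block common AI training crawlers while leaving search bots untouched
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Google-Extended is a control token, not a separate crawler; this opts
# content out of Google AI training without affecting Googlebot.
User-agent: Google-Extended
Disallow: /
```

Because these groups name specific user agents, the `User-agent: *` group above continues to apply to ordinary search crawlers unchanged.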
