
Robots.txt Generator

Create a search-engine-friendly robots.txt file for your site.


Save this as robots.txt and upload it to your site’s root folder.

What is a Robots.txt File?

A robots.txt file is a simple text file placed in your website’s root directory. It acts as a gatekeeper for search engine crawlers (like Googlebot or Bingbot), telling them which pages they are allowed to visit and which they should ignore.
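
For example, a minimal robots.txt that lets every crawler visit everything except one folder could look like this (the /private/ folder name is just a placeholder):

    # The rules below apply to every crawler
    User-agent: *
    # Keep crawlers out of this one folder; everything else stays crawlable
    Disallow: /private/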

Why should you use a Robots.txt Generator?

  • Save Crawl Budget: Don’t let Google waste time crawling your login pages or backend scripts.
  • Protect Privacy: Keep crawlers out of sensitive directories (like /cgi-bin/ or /tmp/) so they are less likely to surface in search results.
  • Prevent Indexing Issues: Keep crawlers out of folders that only hold duplicate or low-value copies of your content (see the sample rules below).
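
As a rough sketch of the ideas above (the folder names are only examples, not a recommendation for every site), the matching rules would look like:

    User-agent: *
    # Backend scripts and temporary files: not worth crawl budget
    Disallow: /cgi-bin/
    Disallow: /tmp/
    # Example folder holding duplicate, printer-friendly copies of pages
    Disallow: /print/

Keep in mind that robots.txt is itself publicly readable, so it hides directories from crawlers, not from people.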

How to use the Bonnots Robots.txt Generator

  1. User-agent: By default, we use “*”, which applies the rules to ALL search engines.
  2. Disallow Rules: Check the boxes for directories like /wp-admin/. This keeps search engine crawlers out of your login screen.
  3. Sitemap: Always include your full sitemap URL (e.g., https://yourdomain.com/sitemap.xml). This helps Google find your new pages faster.
  4. Upload: Copy the code, save it as a file named robots.txt, and upload it via FTP or cPanel to your public_html folder.
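
Putting those steps together, the generated file might look like this (replace yourdomain.com with your own domain):

    # Apply the rules to all search engine crawlers
    User-agent: *
    # Keep crawlers out of the WordPress admin area
    Disallow: /wp-admin/
    # Point crawlers at the sitemap so new pages are discovered faster
    Sitemap: https://yourdomain.com/sitemap.xml

Once uploaded to the public_html folder, it should be reachable at https://yourdomain.com/robots.txt.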

Pro Tip: Don’t Block Everything!

Be careful when using the “Refuse All” option. If you disallow /, crawlers will stop visiting your site and your pages will gradually drop out of Google search results. Only use this if the site is still under development.
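
For reference, this is what a full block looks like (roughly what a “Refuse All” option would generate); never upload it to a live site:

    # Tells every compliant crawler to skip the entire site
    User-agent: *
    Disallow: /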