Robots.txt Overview

The SEO > Robots.txt section lets you edit crawler access rules directly from the CMS panel.

From this screen, you can:

  • Update robots.txt directives in the editor
  • Save draft changes
  • Publish robots.txt changes to the live site

Current Interface Overview

The Robots.txt page contains:

  • Text editor for robots.txt directives (User-agent, Allow, Disallow, etc.)
  • Save button
  • Publish button

[Screenshot: Current Robots.txt Editor]

Step-by-Step: Update Robots.txt

  1. Open SEO > Robots.txt.
  2. Edit the directives in the text area.
  3. Click Save to store your changes as a draft.
  4. Click Publish to apply the updated robots.txt file to the live site (see the verification sketch below).
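
After publishing, it helps to confirm that the live file behaves as expected. Below is a minimal sketch using Python's standard urllib.robotparser; the https://example.com URLs and the Googlebot user agent are illustrative placeholders, not values taken from the CMS.

  from urllib.robotparser import RobotFileParser

  # Point the parser at the live robots.txt (replace with your own domain).
  parser = RobotFileParser()
  parser.set_url("https://example.com/robots.txt")
  parser.read()  # fetches and parses the published file

  # Confirm key pages are still crawlable after publishing.
  for url in ["https://example.com/", "https://example.com/products/"]:
      allowed = parser.can_fetch("Googlebot", url)
      print(url, "->", "allowed" if allowed else "BLOCKED")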

The screenshot above shows practical examples such as the following (assembled into a complete file below):

  • User-agent: *
  • User-agent: Googlebot
  • User-agent: Googlebot-Image
  • User-agent: Googlebot-Video
  • User-agent: Googlebot-Mobile
  • Allow: /
  • Disallow: /admin/
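
Assembled into a single file, those directives might look like the sketch below. The grouping and the /admin/ path are illustrative assumptions, not the exact contents of the screenshot:

  User-agent: *
  Allow: /
  Disallow: /admin/

  User-agent: Googlebot-Image
  Disallow: /admin/

Note that major crawlers obey only the most specific matching User-agent group, so a group such as Googlebot-Image must restate any rules from the * group that should still apply to it.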

Edit this file carefully, especially Disallow rules, to avoid blocking important pages from crawlers.

Best Practices

  1. Validate syntax before publishing (see the sketch after this list).
  2. Keep a backup of the previous robots.txt directives.
  3. Avoid a broad Disallow: / unless it is intentional.
  4. Save first, then publish.
  5. Re-check the live robots.txt output after publishing.
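
Before publishing, the draft can be run through a parser to test the URLs you care about. The sketch below uses Python's standard urllib.robotparser; the draft directives and the URL list are illustrative assumptions, not values read from the CMS:

  from urllib.robotparser import RobotFileParser

  # Draft directives as typed in the CMS editor (illustrative).
  # urllib.robotparser applies the first matching rule, so list the
  # specific Disallow before the catch-all Allow.
  draft_lines = [
      "User-agent: *",
      "Disallow: /admin/",
      "Allow: /",
  ]

  parser = RobotFileParser()
  parser.parse(draft_lines)

  # URLs that must stay crawlable (or blocked); adjust for your site.
  checks = [
      ("https://example.com/", True),
      ("https://example.com/products/", True),
      ("https://example.com/admin/", False),  # should stay blocked
  ]

  for url, expected in checks:
      allowed = parser.can_fetch("Googlebot", url)
      status = "OK" if allowed == expected else "UNEXPECTED"
      print(status, url, "->", "allowed" if allowed else "blocked")

Keep in mind that urllib.robotparser evaluates rules in file order, while Google applies the most specific (longest) matching path, so treat this as a rough pre-flight check rather than a definitive validator.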

Notes

  • Incorrect robots.txt rules can hurt SEO indexability and traffic.
  • Prefer minimal, targeted Disallow rules.