---
title: Robots.txt Genel Bakış
title_en: Robots.txt Overview
description_tr: Güncel CMS editörü ile robots.txt kurallarını düzenleme, kaydetme ve publish etme akışı.
description_en: Manage robots.txt directives from the current CMS editor and publish crawler rules with the save and publish workflow.
order: 20
product: cms
section_tr: SEO
section_en: SEO
owner: CMS Operations Team
lastReviewed: 2026-03-30
productVersion: v1
status: live
cardImage: /img/cms/seo/robots/cms-robots-txt-poster.webp
---
# Robots.txt Overview

The **SEO > Robots.txt** section lets you edit crawler access rules directly from the CMS panel.

From the current screen, you can:

- Update robots.txt directives in the editor
- Save changes as a draft
- Publish robots.txt changes to the live site

## Current Interface Overview

The current Robots.txt page contains:

- Text editor for robots directives (`User-agent`, `Allow`, `Disallow`, etc.)
- **Save** button
- **Publish** button

![Current Robots.txt Editor](/img/cms/seo/robots/cms-robots-txt-current.webp)
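A typical set of directives entered in this editor might look like the following. The paths and grouping here are illustrative only and are not taken from the screenshot:

```text
# Applies to all crawlers
User-agent: *
Allow: /

# Tighter rules for a specific bot (hypothetical path)
User-agent: Googlebot-Image
Disallow: /private-images/
```

Each `User-agent` line starts a group, and the `Allow`/`Disallow` lines that follow apply to that group until the next `User-agent` line.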

## Step-by-Step: Update Robots.txt

1. Open **SEO > Robots.txt**.
2. Edit the directives in the text area.
3. Click **Save** to store your changes.
4. Click **Publish** to apply the updated robots.txt file.

The current screenshot shows practical examples such as:

- `User-agent: *`
- `User-agent: Googlebot`
- `User-agent: Googlebot-Image`
- `User-agent: Googlebot-Video`
- `User-agent: Googlebot-Mobile`
- `allow:/`
- `disallow: /adminaaaaa/`

Use these directives carefully, especially `Disallow` rules, so you do not block important pages from crawlers.

## Recommended Safe Workflow

1. Validate syntax before publishing.
2. Keep a backup of previous robots directives.
3. Avoid broad `Disallow: /` unless intentional.
4. Save first, then publish.
5. Re-check the live robots.txt output after publishing.
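Steps 1 and 3 above can be partially automated before you click **Publish**. The helper below is a minimal sketch (not part of the CMS) that flags lines with no `:` separator, unrecognized directives, and a site-wide `Disallow: /`:

```python
# Minimal pre-publish lint for robots.txt text: a sketch, not a full validator.
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def lint_robots(text: str) -> list[str]:
    """Return human-readable warnings for suspicious robots.txt lines."""
    warnings = []
    for number, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank lines only separate groups; nothing to check
        if ":" not in line:
            warnings.append(f"line {number}: missing ':' separator")
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        if field.lower() not in KNOWN_DIRECTIVES:
            warnings.append(f"line {number}: unknown directive '{field}'")
        elif field.lower() == "disallow" and value == "/":
            warnings.append(f"line {number}: 'Disallow: /' blocks the entire site")
    return warnings

print(lint_robots("User-agent: *\nDisallow: /"))
```

An empty result means no obvious problems were found; it does not guarantee the rules are correct for your site, so re-checking the live robots.txt output after publishing is still recommended.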

## Notes

- Incorrect robots rules can impact SEO indexability and traffic.
- Prefer minimal and targeted disallow rules.
