# Managing robots.txt

{! administration/CLI_tasks/general_cli_task_info.include !}

## Generate a new robots.txt file and add it to the static directory

The `robots.txt` that ships by default is permissive. It allows well-behaved search engines to index all of your instance's URIs.
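
For reference, a permissive `robots.txt` generally looks something like the sketch below (the exact file shipped with your instance may differ). An empty `Disallow` directive places no restrictions on crawlers:

```
# Illustrative only; not necessarily the exact default file.
User-Agent: *
Disallow:
```
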
If you want to generate a restrictive `robots.txt`, you can run the following mix task. The generated `robots.txt` will be written to your instance's [static directory](../../../configuration/static_dir/).

=== "OTP"

    ```sh
    ./bin/pleroma_ctl robots_txt disallow_all
    ```

=== "From Source"

    ```sh
    mix pleroma.robots_txt disallow_all
    ```