# Managing robots.txt

{! backend/administration/CLI_tasks/general_cli_task_info.include !}

## Generate a new robots.txt file and add it to the static directory

The robots.txt that ships by default is permissive. It allows well-behaved search engines to index all of your instance's URIs.
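For reference, a permissive robots.txt looks roughly like the sketch below. This is illustrative only and not necessarily the exact file shipped with your instance; an empty `Disallow` directive means nothing is disallowed.

```
# Allow all crawlers to index every path
User-Agent: *
Disallow:
```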

If you want to generate a restrictive robots.txt, you can run the following mix task. The generated robots.txt will be written to your instance's static directory.

=== "OTP"

```sh
./bin/pleroma_ctl robots_txt disallow_all
```

=== "From Source"

```sh
mix pleroma.robots_txt disallow_all
```