Block search engines via robots.txt (#631)

Prevents instances from being rate limited due to being senselessly
crawled by search engines. Since there is no reason to index Nitter
instances, simply block all robots. Notably, this does *not* affect link
previews (e.g. in various chat software).
Author: minus
Date: 2022-06-04 15:48:25 +00:00 (committed via GitHub)
Parent: 778c6c64cb
Commit: c543a1df8c

public/robots.txt (new file, +2)

@@ -0,0 +1,2 @@
User-agent: *
Disallow: /
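The effect of this two-line policy can be checked with Python's standard-library `urllib.robotparser`; this is a minimal sketch, and the path used below is a hypothetical example, not one from the commit:

```python
from urllib.robotparser import RobotFileParser

# The exact two lines added by this commit in public/robots.txt:
# a wildcard user-agent with everything disallowed.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Any well-behaved crawler is denied every path on the instance.
# The user-agent and path here are illustrative assumptions.
blocked = not parser.can_fetch("Googlebot", "/some/status/page")
print(blocked)  # → True
```

Note that robots.txt is purely advisory: compliant search-engine crawlers will stop indexing, but clients that ignore robots.txt (including chat software fetching link previews) are unaffected, which matches the behavior described in the commit message.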