forked from zesterer/babble
Deny all robots
Currently the robots.txt is set up to allow complete access by robots. This means that well-meaning bots that actually respect a site's wishes with regard to crawling will be invited into the maze. I think it makes more sense to tell all robots to go away; any robot that blindly ignores this will get lost in the babble tarpit. Given enough babble instances, over time bot creators will write LLM scraping bots that respect robots.txt, so that they don't incur the cost to their compute, bandwidth, and ultimately the quality of their model.

```
# To exclude all robots from the entire server
User-agent: *
Disallow: /

# To allow all robots complete access
User-agent: *
Disallow:
```

via https://www.robotstxt.org/robotstxt.html
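For illustration, a minimal sketch of how a well-behaved crawler honors this rule, using Python's standard-library urllib.robotparser; the crawler name and URLs are placeholders, not part of this commit:

```
from urllib.robotparser import RobotFileParser

# Placeholder URL standing in for a hypothetical babble instance.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse robots.txt

# Under "User-agent: *" / "Disallow: /", can_fetch() returns False for
# every path, so a compliant bot skips the site instead of entering the maze.
if rp.can_fetch("ExampleBot/1.0", "https://example.com/any/page"):
    print("allowed: crawl the page")
else:
    print("disallowed: skip this site")
```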
parent a77eb52c56
commit 3ec4cd8595
1 changed file with 1 addition and 1 deletion
robots.txt
```
@@ -1,2 +1,2 @@
 User-agent: *
-Disallow:
+Disallow: /
```