robots.txt on Servebolt

A standard robots.txt file is automatically delivered in both production and dev/stage/test environments if no robots.txt file exists in the public folder.

robots.txt in dev/stage/test environments

In testing environments, a robots.txt file is delivered to prevent robots from indexing the testing domain. This file instructs all robots not to index anything on the domain.
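
Such a disallow-all policy is conventionally expressed like this (shown for illustration; the exact file Servebolt serves may differ):

User-agent: *
Disallow: /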

Please note that you cannot edit the robots.txt file for internal URLs. This is a security measure to prevent your unpublished changes from being accidentally indexed by search engines.

robots.txt for production environments

In production environments, a standard robots.txt file is delivered if no robots.txt file exists in the public folder.

This standard robots.txt includes a single policy that asks robots to crawl no faster than one page per second.
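
A crawl-rate policy of this kind is conventionally expressed with the Crawl-delay directive. A minimal sketch of such a file (for illustration; not necessarily the exact file Servebolt serves):

User-agent: *
Crawl-delay: 1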

Dynamic delivery of robots.txt

If our standard robots.txt is in the way for your application, you can work around this by adding an empty robots.txt file to the public directory and a rewrite rule after your standard rewrite to index.php. Because a robots.txt file now exists in the public folder, our standard one is no longer delivered, and the rewrite rule lets your application answer the request instead:

# Hand requests for robots.txt to the application's front controller
RewriteCond %{REQUEST_URI} ^/robots\.txt$
RewriteRule ^ index.php [L]

The two lines above show this in action for Drupal.
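
As a sketch of where the lines sit in Drupal's .htaccess (the stock rules are abridged here and vary between Drupal versions):

# Drupal's standard rewrite to index.php (abridged from the stock .htaccess)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]

# Added afterwards: the empty robots.txt exists as a real file, so the rules
# above skip it; this pair rewrites the request to index.php instead
RewriteCond %{REQUEST_URI} ^/robots\.txt$
RewriteRule ^ index.php [L]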

The same method works for WordPress: add the two rewrite lines for robots.txt after the standard WordPress .htaccess block.
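
As a sketch, the tail of a WordPress .htaccess could then look like this (the # BEGIN/# END block is the stock one that WordPress writes itself):

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

# Deliver robots.txt dynamically: the empty file in the public directory
# bypasses the block above, and this pair sends the request to index.php
RewriteCond %{REQUEST_URI} ^/robots\.txt$
RewriteRule ^ index.php [L]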