Explains how robots.txt files work and how they are delivered on Servebolt

A standard robots.txt file is delivered in both production and dev/stage/test environments if no robots.txt file exists.

Dev/Stage/Test environments

In testing environments, a robots.txt file is delivered to prevent robots from indexing the testing domain.
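A robots.txt file that blocks all indexing typically looks like the following. This is a sketch of the common pattern for disallowing all crawlers, not necessarily the exact file Servebolt serves:

```
User-agent: *
Disallow: /
```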

Production environments

In production environments, a standard robots.txt file is delivered if the file does not exist. 

This standard robots.txt includes a single policy that tells robots to not index faster than one page per second. 
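Expressed in robots.txt syntax, such a policy is usually written with a Crawl-delay directive. The snippet below is a sketch of what a one-page-per-second policy commonly looks like; the exact contents of Servebolt's standard file may differ:

```
User-agent: *
Crawl-delay: 1
```

Note that Crawl-delay is honored by some crawlers (for example Bing and Yandex) but ignored by others, including Googlebot.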

Dynamic delivery of robots.txt

If our standard robots.txt gets in the way of your application, you can work around it by adding the following rule after your standard rewrite to index.php.

RewriteCond %{REQUEST_URI} ^/robots.txt$
RewriteRule ^ index.php [L]

An example of this in action for Drupal:
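Below is a sketch of how this could look in a Drupal .htaccess file. The front-controller rules shown are the standard Drupal ones; only the two robots.txt lines at the end are the addition (useful, for example, when a module generates robots.txt dynamically):

```apache
RewriteEngine on

# Standard Drupal front controller: pass paths that are not
# real files or directories to index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]

# Added: route requests for robots.txt to index.php so the
# application can answer them itself
RewriteCond %{REQUEST_URI} ^/robots.txt$
RewriteRule ^ index.php [L]
```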

The same method works for WordPress: add the two rewrite lines for robots.txt after the standard WordPress .htaccess block.
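For WordPress, the result might look like this. The # BEGIN/END WordPress block is the stock one that WordPress writes to .htaccess; only the section after it is added:

```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

# Added: route requests for robots.txt to index.php so WordPress
# (or a plugin) can generate the file dynamically
<IfModule mod_rewrite.c>
RewriteCond %{REQUEST_URI} ^/robots.txt$
RewriteRule ^ index.php [L]
</IfModule>
```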
