Obsolete:robots.txt
This page contains historical information. It is probably no longer true.
The rewrite rule
RewriteRule ^/robots.txt$ /w/robots.php [L]
sends all requests for robots.txt through robots.php. That script checks whether the page MediaWiki:robots.txt exists on the wiki; if it does, the content of that page is sent first. The static file /apache/common/robots.txt is sent afterwards.
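A minimal, hypothetical sketch of that behaviour (this is not the actual robots.php source, which is linked at the bottom of this page; getWikiPageText() is an invented stand-in for however the real script looks up the wiki page):

<?php
header( 'Content-Type: text/plain' );

// Per-wiki rules maintained on the wiki itself, if the page exists.
$dynamic = getWikiPageText( 'MediaWiki:robots.txt' ); // hypothetical helper
if ( $dynamic !== null && $dynamic !== '' ) {
    echo $dynamic, "\n";
}

// The cluster-wide static rules are always appended.
readfile( '/apache/common/robots.txt' );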
To edit the static file, do the following:
- Edit /home/wikipedia/common/robots.txt
- Run sync-common-file robots.txt
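For example, to exclude a hypothetical crawler from the whole site, you would add something like the following to the static file and then run sync-common-file robots.txt as above:

User-agent: ExampleBot
Disallow: /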
The source of robots.php is available here: <https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=live-1.5/robots.php;hb=HEAD>.