Toolforge tools
Description: Suite of tools to analyze user and page data on WMF wikis
Keywords: xtools, statistics, analytics
Author(s): Matthewrbowker, MusikAnimal, Samwilson (adapted from an older version by X! and Hedonil)
Maintainer(s): Matthewrbowker, MusikAnimal, Samwilson
Source code: GitHub (mirrored on Phabricator as rXTR)
License: GNU General Public License 3.0 or later
Issues: Open tasks · Report a bug
Admin log: Nova Resource:Tools.xtools/SAL · Nova Resource:Tools.xtools-dev/SAL

This page documents the WMF installations of XTools at (production) and (staging). For general installation, configuration, and development of the XTools software, please see mw:XTools.

There are three instances currently configured, one for staging, one for the main production app server, and one for the API (view details in the Openstack browser). The prod instances relate to the Toolforge account xtools, and the staging instance relates to xtools-dev; these are where the matching database users come from, and where we send maintainers' emails.

Note to maintainers: we don't back up any server configuration, so please document everything here (until task T170514 is resolved).


The maintainers can be emailed at (note that this means the maintainer lists of three separate things need to be kept in sync: the VPS account and the two Toolforge accounts).


Production

XTools is hosted on a Wikimedia VPS instance. To log into the server, make sure you've been added as a maintainer to the xtools project. Then set up SSH, connect via ssh xtools-prod06.xtools.eqiad.wmflabs, and go to the /var/www directory. Not quite everything in this directory is in the Git repository.
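If SSH isn't set up yet, a minimal ~/.ssh/config sketch for reaching Cloud VPS instances through the bastion may help. The bastion hostname and ProxyJump approach here are assumptions based on general Cloud VPS access docs, not this page; verify against the current Wikitech help page before using.

```
# Sketch only: hostname and username placeholders are assumptions, not from this page
Host *.wmflabs *.wmcloud.org
    User your-shell-username
    ProxyJump your-shell-username@primary.bastion.wmcloud.org
```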

Logs are written to /var/www/var/logs/prod.log, but only during requests where an error or other high-priority log entry was made; that flush includes the lower levels too, which is why you'll also see debug-level entries in prod.log. You might also need to check /var/log/apache2/error.log for Apache-level errors.
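Since each Symfony log entry carries its channel and level, plain grep is enough for quick triage. A self-contained illustration using fabricated sample lines (the real file is /var/www/var/logs/prod.log; the entries below are invented):

```shell
# Fabricated log lines mimicking Symfony's "channel.LEVEL:" format (not real entries)
cat > /tmp/sample-prod.log <<'EOF'
[2024-01-01 12:00:00] request.CRITICAL: Uncaught PHP Exception RuntimeException
[2024-01-01 12:00:00] doctrine.DEBUG: SELECT 1 (flushed alongside the error)
EOF
# Count only the high-priority entries:
grep -c '\.CRITICAL:' /tmp/sample-prod.log
```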

Web access stats are available at

OAuth consumer: XTools 1.1

Database is s51187__xtools_prod on tools.labsdb; we connect as user s51187 (the same as the old XTools database user).

Web server configuration is all in /etc/apache2/sites-available/xtools.conf.

There's a /var/www/ script that runs every 10 minutes (from www-data's crontab) and, when a new release is available, updates to it. The output of this is mailed to the maintainers.

There is also a dedicated API server, which lives at xtools-prod07.xtools.eqiad.wmflabs. All requests to /api go to this server.

Building a new instance

First create a new instance running Debian Buster. Any production node should be at least an m1.medium with 4 GB of RAM, but for the main app server an m1.large is probably best.

Once the instance has been spawned, SSH in and follow these steps:

  1. Install PHP and Apache, along with some dependencies (using Ondřej Surý's Debian packages).
    sudo apt-get update
    sudo apt-get install -y apache2 php7.4 php7.4-cli php7.4-common php7.4-curl php7.4-json php7.4-mysql php7.4-intl php7.4-xml php7.4-mbstring libapache2-mod-php7.4 zip unzip php7.4-zip php7.4-apcu
    sudo a2dismod mpm_event && sudo a2enmod mpm_prefork && sudo a2enmod php7.4
  2. Install Node and npm.
    sudo apt-get install -y nodejs
    sudo apt-get install -y npm
  3. Install the desired Node version:
    sudo npm install -g n
    sudo n 12.20.1
  4. Install composer by following these instructions, but make sure to install to the /usr/local/bin directory and with the filename composer, e.g.:
    sudo php composer-setup.php --install-dir=/usr/local/bin --filename=composer
  5. Clone the repository, first removing the html directory created by Apache.
    cd /var/www && sudo rm -rf html
    sudo git clone .
  6. Run sudo cp .env .env.local and fill in the necessary details, using mw:XTools/Development/Configuration as a guide. For most options you can use the default. In particular, be sure APP_ENV is set to prod (even for a staging server).
  7. Run sudo composer install --no-dev --optimize-autoloader, entering yes if you get a warning about running as root. Moving forward, we won't ever run composer as root, but rather as the Apache server user, www-data (see steps #9 and #14 below).
  8. Create the deploy script at /var/www/ with the following:
    #!/bin/bash
    cd /var/www
    git fetch --quiet origin 2>&1
    ## Find the highest and current tags
    HIGHEST_TAG=$(git tag | sort --version-sort | tail --lines 1)
    CURRENT_TAG=$(git describe --tags)
    ## Exit and say nothing if we're already at the highest tag.
    if [[ $CURRENT_TAG == $HIGHEST_TAG ]]; then
        # The following line can be temporarily uncommented as-needed
        # to force www-data to clear the production cache:
        # ./bin/console cache:clear --env prod
        exit 0
    fi
    ## If there's an update, pull and install it.
    git checkout $HIGHEST_TAG
    /usr/local/bin/composer install --no-dev --optimize-autoloader
    ./bin/console cache:clear --env prod
    ./bin/console doctrine:migrations:migrate --env prod --no-interaction
  9. Make sure the scripts are executable, and that all the files in the repo are owned by www-data.
    sudo chmod 744
    sudo chown -R www-data:www-data .
  10. Create the web server configuration file at /etc/apache2/sites-available/xtools.conf with the following:
    <VirtualHost *:80>
            DocumentRoot /var/www/public
            RewriteEngine On
            # These requests aren't logged by Apache.
            SetEnvIf Request_URI "(^/robots\.txt$|^\/api\/|^\/images\/|^\/assets\/|^\/i18n\/)" dontlog=yes
            # Requests with these user agents are denied.
            SetEnvIfNoCase User-Agent "(uCrawler|Baiduspider|CCBot|scrapy\.org|kinshoobot|YisouSpider|Sogou web spider|yandex\.com\/bots|twitterbot|TweetmemeBot|SeznamBot|datasift\.com\/bot|Googlebot|Yahoo! Slurp|Python-urllib|BehloolBot|MJ13bot|SemrushBot|facebookexternalhit|rcdtokyo\.com|Pcore-HTTP|yacybot|ltx71|RyteBot|bingbot|python-requests|Cloudflare-AMP|Mr\.4x3|MSIE 7\.0; AOL 9\.5|Acoo Browser|AcooBrowser|MSIE 6\.0; Windows NT 5\.1; SV1; QQDownload|\.NET CLR 2\.0\.50727|MSIE 7\.0; Windows NT 5\.1; Trident\/4\.0; SV1; QQDownload|Frontera|tigerbot|Slackbot|Discordbot|LinkedInBot|BLEXBot|filterdb\.iss\.net|SemanticScholarBot|FemtosearchBot|BrandVerity|Zuuk crawler|archive\.org_bot|mediawords bot|Qwantify\/Bleriot|Pinterestbot|EarwigBot|Citoid \(Wikimedia|GuzzleHttp|PageFreezer|Java\/|SiteCheckerBot|Re\-re Studio|^R \(|GoogleDocs|WinHTTP|cis455crawler|WhatsApp|Archive\-It|lua\-resty\-http|crawler4j|libcurl|dygg\-robot|GarlikCrawler|Gluten Free Crawler|WordPress|Paracrawl|7Siters|Microsoft Office Excel|msnbot|AhrefsBot|MauiBot|Linespider|Symfony BrowserKit|AppleNewsBot|Go-http-client|CoolToolName|UsedBaseLibrary|Archive Team|WoTBoT|Rustbot|ApeSearch|^curl\/|aiohttp)" bad_bot=yes
            CustomLog ${APACHE_LOG_DIR}/access.log xtools expr=!(reqenv('bad_bot')=='yes'||reqenv('dontlog')=='yes')
            CustomLog ${APACHE_LOG_DIR}/denied.log xtools expr=(reqenv('bad_bot')=='yes')
            CustomLog ${APACHE_LOG_DIR}/attacks.log xtools expr=(reqenv('attacker')=='yes')
            ErrorLog ${APACHE_LOG_DIR}/error.log
            AllowEncodedSlashes On
            <Directory /var/www/public/>
                 Options Indexes FollowSymLinks
                 AllowOverride All
                 Require all granted
            </Directory>
            <Directory /var/www/>
                    Options Indexes FollowSymLinks
                    AllowOverride None
                    Require all granted
                    Deny from env=bad_bot
            </Directory>
            Alias /awstatsclasses "/usr/share/awstats/lib/"
            Alias /awstats-icon/ "/usr/share/awstats/icon/"
            Alias /awstatscss "/usr/share/doc/awstats/examples/css"
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            ScriptAlias /awstats/ /usr/lib/cgi-bin/
            <Directory /usr/lib/cgi-bin/>
                    Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                    Require all granted
            </Directory>
            ErrorDocument 403 "Your access to XTools has been blocked due to apparent abuse or disruptive automation. If you are a bot, please use our public APIs instead, which are optimized for this purpose: <>. For inquiries, please contact"
            RewriteCond "%{HTTP_REFERER}" "^http://127\.0\.0\.1:(5500|8002)/index\.html" [NC]
            RewriteRule .* - [R=403,L]
            RewriteCond "%{HTTP_USER_AGENT}" "^[Ww]get"
            RewriteRule .* - [R=403,L]
            RewriteCond %{HTTP:X-Forwarded-Proto} !https
            RewriteRule ^/?(.*) https://%{SERVER_NAME}/$1 [R=301,L]
    </VirtualHost>
  11. Set up awstats (this step is optional, but it provides useful statistics on which endpoints are hit the most, which browsers are the most popular, etc.):
    sudo apt-get install awstats
    cd /etc/awstats
    sudo cp awstats.conf
    sudo a2enmod cgi
  12. Enable the mod-rewrite Apache module, and enable the web server configuration.
    sudo a2enmod rewrite
    sudo a2ensite xtools
    sudo service apache2 reload
  13. Add log rotation for Symfony's logs by creating the file /etc/logrotate.d/symfony with:
    /var/www/var/logs/*.log {
            su www-data www-data
            rotate 14
            create 640 root adm
            sharedscripts
            postrotate
                    if /etc/init.d/apache2 status > /dev/null ; then \
                        /etc/init.d/apache2 reload > /dev/null; \
                    fi;
            endscript
            prerotate
                    if [ -d /etc/logrotate.d/httpd-prerotate ]; then \
                            run-parts /etc/logrotate.d/httpd-prerotate; \
                    fi; \
            endscript
    }
  14. Set up the crontab to run the deploy script every 10 minutes and the awstats update every 30 minutes. We'll do this under the www-data user:
    sudo crontab -e -u www-data
    Then add this to the bottom of the crontab:
    */10 * * * * /var/www/
    0,30 * * * * sudo /usr/lib/cgi-bin/ -update > /dev/null
  15. Wait for the email indicating that composer ran successfully. If all goes well, you need only gracefully reload Apache:
    sudo apache2ctl graceful
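The deploy script above relies on sort --version-sort to pick the newest tag; a plain lexical sort would rank 3.1.9 above 3.1.10 once a component reaches double digits. A self-contained demonstration in a throwaway repository (assumes git is installed; the tag names are made up):

```shell
# Build a throwaway repo with two tags whose ordering breaks under plain sort
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.email=x@example.org -c user.name=x commit -q --allow-empty -m one
git tag 3.1.9
git -c user.email=x@example.org -c user.name=x commit -q --allow-empty -m two
git tag 3.1.10
# Same selection logic as the deploy script:
HIGHEST_TAG=$(git tag | sort --version-sort | tail --lines 1)
echo "$HIGHEST_TAG"   # 3.1.10, whereas a plain `sort | tail -1` would pick 3.1.9
```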

Setting up an API server

The API server itself can be built the same way as the app server, with some additional proxy settings on the main app server so that all requests to /api go to the API server. You can do this by following these steps:

  1. Install libxml2-dev
    sudo apt-get install libxml2-dev
  2. Enable the necessary modules (if some are already enabled it will simply make sure they are active):
    sudo a2enmod proxy proxy_http proxy_ajp rewrite deflate headers proxy_balancer proxy_connect proxy_html xml2enc
  3. And in /etc/apache2/sites-available/xtools.conf, within the <VirtualHost> block, add this to the bottom:
    ProxyPreserveHost On
    ProxyPass /api http://X.X.X.X:80/api nocanon
    ProxyPassReverse /api http://X.X.X.X:80/api
    ...replacing X.X.X.X with the IP of the API server.
  4. Finally, gracefully reload Apache with sudo apache2ctl graceful

Note that the API server is not accessible at its own domain name.


Troubleshooting

Sometimes weird things happen. Here are some common problems and quick solutions:

  • Errors about missing cache files – It's a mystery why this happens, but the quick fix is to delete the prod cache directory; it will rebuild on its own.
    cd /var/www && sudo rm -rf var/cache/prod/ && sudo chown -R www-data:www-data . (the chown is for good measure)
  • ServiceUnavailableHttpException – Something is hammering XTools and eating into our database quota. This usually lasts only a minute or two at a time (check the timestamps of the first/last email). If it persists, action may be needed:
    1. Check the email for the common user agent.
    2. Grep the Apache logs to make sure there aren't a lot of innocent users with the same UA (sudo grep "Foo" /var/log/apache2/access.log)
    3. cd /var/www/ then update config/request_blacklist.yml, optionally using a URI pattern to help avoid affecting innocent users:
                  user_agent: "Foo"
                  # Target only frwiki, and when a non-French language is requested (which is common for scraping bots)
                  uri_pattern: "fr\\.wikipedia.*?\\?uselang=(?!fr)"
    4. Clear the cache with sudo ./bin/console cache:clear --env=prod
    5. For good measure, ensure everything is still owned by www-data (the web server user): sudo chown -R www-data:www-data .
    6. You can monitor /var/www/var/logs/blacklist.log to see when the request blacklist is hit.
For more extreme cases, add to the user-agent blacklist in the apache config (/etc/apache2/sites-available/xtools.conf), then reload the config with sudo service apache2 reload.
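To find the common user agent in the first place, a quick tally over the combined-format access log helps. A self-contained illustration using fabricated sample lines (on the server you would point this at /var/log/apache2/access.log; the request paths and agents below are invented):

```shell
# Fabricated access-log lines in Apache combined format (real log: /var/log/apache2/access.log)
cat > /tmp/sample-access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /ec/en.wikipedia.org/Foo HTTP/1.1" 200 123 "-" "Foo/1.0"
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /ec/fr.wikipedia.org/Bar HTTP/1.1" 200 123 "-" "Foo/1.0"
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0"
EOF
# Splitting on double quotes, field 6 is the user agent; tally and rank:
awk -F'"' '{print $6}' /tmp/sample-access.log | sort | uniq -c | sort -rn
```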


Staging

SSH to xtools-dev05.xtools.eqiad.wmflabs (see the notes above in #Production about getting access).

Database is s53003__xtools_dev on tools.labsdb; we connect as user s53003.

OAuth consumer: xtools-dev 1.1

The code on the staging server is kept in sync with the main branch by the following /var/www/ script (run every 10 minutes from www-data's crontab):


#!/bin/bash
cd /var/www

## See if there's any update.
GITFETCH=$(git fetch && git diff origin/main 2>&1)
if [ -z "$GITFETCH" ]; then
  exit 0
fi

## If there's an update, pull and install it.
git checkout main
git pull origin main
/usr/local/bin/composer install --no-dev --optimize-autoloader
./bin/console cache:clear --env prod
./bin/console doctrine:migrations:migrate --env prod --no-interaction
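The emptiness check in the staging script works because git diff origin/main prints nothing when the working tree matches the remote branch. A self-contained sketch of that behavior using throwaway repositories (assumes git ≥ 2.28 for init -b; paths and identities here are placeholders):

```shell
set -e
# Create an "upstream" repo with one commit on main
upstream=$(mktemp -d)
cd "$upstream"
git init -q -b main .
git -c user.email=x@example.org -c user.name=x commit -q --allow-empty -m init
# Clone it, as the staging checkout would be
git clone -q "$upstream" "$upstream-clone"
cd "$upstream-clone"
# Same check as the staging script: empty output means nothing to deploy
GITFETCH=$(git fetch --quiet origin && git diff origin/main 2>&1)
if [ -z "$GITFETCH" ]; then
  echo "up to date"
fi
```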

Setting up the staging server is the same as production, except you would use a smaller box (m1.small) and use the above deploy script instead of the tag-based one. You also need to update the ServerName in the Apache configuration accordingly.

See also