Dumps/Adds-changes dumps

Adds/changes dumps overview

We have an experimental service available which produces dumps of added/changed content on a daily basis for all projects that have not been closed and are not private.

The code for this service is available in our git repository (master branch). It relies on the python modules used by the regular dumps, which live in the regular dumps repo.

The job runs out of cron on one of the snapshot hosts (see hiera/hosts for which one), as the datasets user. Everything except initial script deployment is puppetized. Scripts are deployed via scap3 as part of the general Dumps deployment.
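As an illustration only (the actual crontab entry is managed by puppet and the schedule shown here is hypothetical), the daily cron job amounts to something like:

  # hypothetical crontab entry for the datasets user; the real schedule and wrapper come from puppet
  0 3 * * * python /srv/deployment/dumps/dumps/xmldumps-backup/generateincrementals.py --configfile /etc/dumps/confs/addschanges.conf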

Directory structure:

Everything for a given run is stored in dumproot/projectname/yyyymmdd/, much as for the regular dumps.
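For example, a single run might leave behind something like the following (the dump file names here are illustrative; maxrevid.txt, md5sums.txt and status.txt are described below):

  dumproot/enwiki/20111123/
      maxrevid.txt
      enwiki-20111123-stubs-meta-hist-incr.xml.gz
      enwiki-20111123-pages-meta-hist-incr.xml.bz2
      md5sums.txt
      status.txt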

How it works

We record, in the file maxrevid.txt, the largest revision id for the given project that is older than a configurable cutoff (currently at least 12 hours). All revisions between this and the revision recorded for the previous day will be dumped. The delay gives editors on the specific wiki some time to weed out vandalism, advertising spam and so on.
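A minimal sketch of that cutoff calculation, assuming a DB-API cursor on the wiki's database and the standard MediaWiki revision table (this is not the code the service actually runs):

  from datetime import datetime, timedelta, timezone

  def get_max_revid_before_cutoff(cursor, cutoff_hours=12):
      """Return the largest rev_id at least cutoff_hours old, or None."""
      # MediaWiki stores revision timestamps as yyyymmddhhmmss strings (UTC)
      cutoff = (datetime.now(timezone.utc) - timedelta(hours=cutoff_hours)).strftime("%Y%m%d%H%M%S")
      cursor.execute("SELECT MAX(rev_id) FROM revision WHERE rev_timestamp < %s", (cutoff,))
      row = cursor.fetchone()
      return row[0] if row else None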

We generate a stubs file containing metadata in XML format for each revision added since the previous day, consulting the previous day's maxrevid.txt file to get the start of the range. We then generate a meta-history XML file which contains the text of these revisions, grouped together and sorted by page id. MD5 sums of these files are available in an md5sums.txt file. A status.txt file indicates whether the run was successful ("done") or not.
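A minimal sketch of the checksum and status bookkeeping, assuming the dump files for a run are already in place (function names and helpers here are illustrative, not the service's actual code):

  import hashlib
  import os

  def write_md5sums(rundir, filenames):
      """Write an md5sums.txt entry for each dump file in the run directory."""
      with open(os.path.join(rundir, "md5sums.txt"), "w") as out:
          for name in filenames:
              md5 = hashlib.md5()
              with open(os.path.join(rundir, name), "rb") as infile:
                  for chunk in iter(lambda: infile.read(1 << 20), b""):
                      md5.update(chunk)
              out.write("%s  %s\n" % (md5.hexdigest(), name))

  def mark_done(rundir):
      """Record a successful run; status.txt contains "done"."""
      with open(os.path.join(rundir, "status.txt"), "w") as out:
          out.write("done\n")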

After all wikis have run, we check the directories for successful runs and write a main index.html file with links, for each project, to the stub and content files from the latest successful run.
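A minimal sketch of that index step, assuming the directory layout described above (the HTML and link format are illustrative, not the production template):

  import os

  def write_index(dumproot):
      """Write dumproot/index.html linking each project's latest successful run."""
      lines = ["<html><body><ul>"]
      for project in sorted(os.listdir(dumproot)):
          projdir = os.path.join(dumproot, project)
          if not os.path.isdir(projdir):
              continue
          # run directories are named yyyymmdd; newest first
          for date in sorted(os.listdir(projdir), reverse=True):
              statusfile = os.path.join(projdir, date, "status.txt")
              if os.path.isfile(statusfile) and open(statusfile).read().strip() == "done":
                  lines.append('<li>%s %s: <a href="%s/%s/">stubs and content files</a></li>'
                               % (project, date, project, date))
                  break
      lines.append("</ul></body></html>")
      with open(os.path.join(dumproot, "index.html"), "w") as out:
          out.write("\n".join(lines) + "\n")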

When stuff breaks

You can rerun various jobs by hand for specified dates. Be on the snapshot host responsible (check hiera/hosts for the one that runs misc cron jobs). In a screen session, do:

  1. sudo -s datasets
  2. python /srv/deployment/dumps/dumps/xmldumps-backup/generateincrementals.py --configfile /etc/dumps/confs/addschanges.conf --date YYYYMMDD

If you want more information, you can run the above script with --help for a usage message.

Some numbers

Here are a few numbers from the November 23, 2011 run. Writing the stubs file for 167,985 revisions on en wikipedia took 2 minutes, and writing the revisions text file took 24 minutes. For de wikipedia, writing the stubs file for 36,272 revisions took less than a minute, and writing the revisions text file took 5 minutes. For commons, writing the stubs file for 43,133 revisions took 1 minute, and writing the revisions text file took 2 minutes.