WikimediaDebug is a set of tools for debugging and profiling MediaWiki web requests in a production environment.
You can use WikimediaDebug through the accompanying browser extension, or from the command-line by setting the X-Wikimedia-Debug HTTP header (as in the curl examples below).
As long as WikimediaDebug is on, all your requests are handled as Varnish cache misses.
To make a web request to one of the debug servers, click the icon, select the right server (if you’re not deploying a change yourself, the deployer should have told you which server to select), then toggle the on/off switch. When the extension is enabled, the button should show an “On” badge.
Force Varnish to skip the cache and pass the request to mwdebug1001.eqiad.wmnet:
$ curl -H 'X-Wikimedia-Debug: backend=mwdebug1001.eqiad.wmnet' https://meta.wikimedia.org/wiki/Main_Page
Same as above, but profile a request using Tideways and publish the profile to XHGui:
$ curl -H 'X-Wikimedia-Debug: backend=mwdebug1001.eqiad.wmnet; profile' https://meta.wikimedia.org/wiki/Main_Page
Request a page with MediaWiki configured for read-only mode:
$ curl -H 'X-Wikimedia-Debug: backend=mwdebug1001.eqiad.wmnet; readonly' https://meta.wikimedia.org/wiki/Main_Page
The following attributes can influence the backend request:
backend=…: Choose the mwdebug server that Varnish will route your request to. It can be set to 1 in Beta Cluster, where the only backend acts as both appserver and mwdebug server.
forceprofile (Inline profile): Capture a trace profile and append it to the web response.
log (Verbose logs): Enable all MediaWiki debug log groups and submit the messages to Logstash for querying.
profile (XHGui): Record a trace for performance analysis and publish the result to XHGui.
readonly: Request the page with MediaWiki read-only mode enabled, to simulate how your code behaves when the database is read-only due to replica lag or scheduled maintenance.
shorttimeout: Simulate how your code handles execution timeouts (e.g. from mediawiki/libs/RequestTimeout, T293568). When enabled, $wgRequestTimeLimit is set to 2s. This option is available via command-line use only.
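For command-line use, the header value is simply the backend followed by any number of the attributes above, separated by semicolons. The helper below is a hypothetical convenience sketch (not part of WikimediaDebug) that composes such a header value:

```shell
#!/bin/sh
# Hypothetical helper: compose an X-Wikimedia-Debug header value from a
# backend hostname and any number of the attributes listed above.
xwd_header() {
  backend="$1"; shift
  value="backend=${backend}"
  for attr in "$@"; do
    value="${value}; ${attr}"
  done
  printf '%s\n' "$value"
}

# Example: read-only mode with verbose logging
xwd_header mwdebug1001.eqiad.wmnet readonly log
# prints: backend=mwdebug1001.eqiad.wmnet; readonly; log
```

The result can then be passed to curl, e.g. `curl -H "X-Wikimedia-Debug: $(xwd_header mwdebug1001.eqiad.wmnet readonly)" …`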
You can simulate how your code behaves in a secondary data center (Multi-DC) by selecting a host in a DC other than the current primary. You can use either datacenter from this list at any time, even if it is currently inactive or depooled. If you pick a backend in a datacenter that is not currently the primary DC, some actions may be read-only, disabled, or slower.
It's now also possible to reach an experimental MediaWiki installation running on our Kubernetes cluster. Beware that some of the features described on this wiki page may not be available yet via this backend. Please choose a different backend unless you specifically need to verify something on Kubernetes.
The following application servers are dedicated to WikimediaDebug use:
Enable the "XHGui" option (profile attribute) to capture a trace profile with Tideways-XHProf and publish the result to our XHGui instance. If you are using the browser extension, there will be a link in the footer with the label "Find in XHGui". Clicking this takes you directly to a list of profiles matching your Request ID. Then click the sharable permalink (e.g. the timestamp link, or the "GET" method name) to show the recorded profile.
- Overview dashboard (e.g. listing functions that used the most time and/or most memory).
- Complete call tree with timing, counts, sub/parent for each function.
- Callgraph visualisation.
To explore these and other XHGui features, open this example profile.
Plaintext request profile
WikimediaDebug can deliver a plaintext-format profile directly from the web server. This is an alternative to the XHGui option above.
Enable the "Inline profile" option (forceprofile attribute) on any MediaWiki web request (including index.php, api.php, and load.php), and MediaWiki will append a plain-text profile to the web response.
To see this in action, open this example URL, enable WikimediaDebug with the "Inline profile" option, and reload the browser tab. Toward the end of the response should be a call-graph summary that looks roughly like this:
/*
100.00% 1437.125      1 - main()
 87.21% 1253.268      1 - ResourceLoader::respond
 79.31% 1139.756   1509 - ResourceLoaderModule::getVersionHash
 77.88% 1119.292      3 - ResourceLoader::getCombinedVersion
..
Example using cURL:
# Production
$ curl -H 'X-Wikimedia-Debug: 1; forceprofile' 'https://test.wikipedia.org/w/load.php?modules=startup&only=scripts&raw=1'

# Beta Cluster
$ curl -H 'X-Wikimedia-Debug: 1; forceprofile' 'https://en.wikipedia.beta.wmflabs.org/w/load.php?modules=startup&only=scripts&raw=1'
For maintenance scripts, this can be triggered using the --profiler=text option. For both web-request profiling and CLI profiling, the debug tools are only installed on mwdebug servers, so be sure to run mwscript from an mwdebug server (in Beta Cluster, any MW server will do).
mwdebug:~$ mwscript showJobs.php --wiki testwiki --profiler text
0
<!--
100.00% 282.967      1 - main()
 89.56% 253.419      1 - section.Setup.php
 57.12% 161.627    139 - AutoLoader::autoload
 49.59% 140.324      1 - ExtensionRegistry::loadFromQueue
..
Enabling the log attribute in the X-Wikimedia-Debug header (the "Verbose log" checkbox in the extension) will cause MediaWiki to be maximally verbose, recording all log messages on all channels (regardless of whether they are otherwise enabled in wmf-config).
This feature requires a Wikimedia developer account with WMF or volunteer-NDA access.
These messages will end up in Logstash at https://logstash.wikimedia.org/app/dashboards#/view/mwdebug and in
/srv/mw-log/XWikimediaDebug.log on mwlog1002 (mwlog2002 during data center switch). See Logs#mw-log for more information.
To view the logs of a specific web request only, the browser extension also adds a "Find in Logstash" link in the footer on any wiki, which goes to Logstash with a filter for the request ID of the current page view. You can also construct this URL manually by navigating to https://logstash.wikimedia.org/app/dashboards#/view/x-debug and entering a search query for the request ID.
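From the command line, the request ID can be read from the response headers. A sketch, assuming the response carries an x-request-id header (the header name is an assumption based on typical Wikimedia production responses):

```shell
#!/bin/sh
# Sketch: extract the request ID from saved response headers.
# Save the headers first, e.g.:
#   curl -sI -H 'X-Wikimedia-Debug: backend=mwdebug1001.eqiad.wmnet; log' \
#     https://meta.wikimedia.org/wiki/Main_Page > headers.txt
# Assumes an 'x-request-id' response header.
reqid() {
  grep -i '^x-request-id:' "$1" | awk '{print $2}' | tr -d '\r'
}
```

The extracted ID can then be pasted into the Logstash search box.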
This section will walk you through a common WikimediaDebug debugging workflow.
It worked on your machine. It worked on the Beta Cluster. And the unit tests pass. But will it work in production? You can reduce the risk of unexpected breakage by deploying your change to a debugging server in production and using WikimediaDebug to verify its behavior.
Follow the instructions on How to deploy code, but stop after you have completed step 2 ("Get the code on the deployment host"). Now sync your change to one of the debug backends in a new tab by SSHing into it and running:
you@laptop:~$ ssh mwdebug1001.eqiad.wmnet
you@mwdebug1001:~$ scap pull
Now, enable the WikimediaDebug extension in your browser, and select the same backend you just pulled your code to. Your browser requests will be routed to this backend, allowing you to verify that your changes are working correctly prior to deployment.
X-Wikimedia-Debug does not currently work on the Wikitech wiki, which is served by servers separated from the rest of the Wikimedia Foundation production cluster. Try it on Wikipedia instead!
If it appears that your requests are not producing any results, check whether XHProf/Tideways is properly installed and invoked:
- verify that php-tideways-xhprof is installed on the target host
- ensure that the extension is loaded (e.g. it appears in the output of php -m)
- check if auto_prepend_file is defined in the PHP configuration
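The extension-loaded check above can be scripted. A hypothetical sketch (the function and its messages are not part of WikimediaDebug) that inspects a loaded-module list, which on the host you would obtain from `php -m`:

```shell
#!/bin/sh
# Hypothetical check: report whether the Tideways extension appears in a
# list of loaded PHP modules (as produced by `php -m`).
check_tideways() {
  modules="$1"
  if printf '%s\n' "$modules" | grep -qi 'tideways'; then
    echo "tideways extension loaded"
  else
    echo "tideways extension MISSING"
  fi
}

# On the mwdebug host itself:
#   check_tideways "$(php -m)"
```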
- Maintained by Performance Team
- Live chat (IRC): #wikimedia-perf
- Issue tracker: Phabricator (Report an issue)
Documented in README.md.
- WikimediaDebug v2 blogpost
- Debugging in production: Info on how to push code to a debug backend and for testing non-HTTP code (e.g. maintenance scripts)
- Performance/Runbook/WikimediaDebug: Internal guide for publishing releases.