Web Perf Hero award

From Wikitech

The Web Perf Hero award is given to individuals who have gone above and beyond to improve the web performance of Wikimedia projects.

It is awarded once a quarter (or less often), and takes the form of a Phabricator badge. The initiative was launched by Wikimedia Performance in 2020.

Below are past recipients, along with what each award is in recognition of. Beyond the specific project that led to the award, these individuals have demonstrated repeated care, focus, and discipline around performance.

2023 (Q4)

Máté Szabó

Máté (@TK-999) worked on improving MediaWiki's autoloading mechanism in change 853972 (task: T274041), specifically for namespaced classes, by including them in the classmap. This means the autoloader does not spend significant amounts of time searching for files in directories, but instead does a direct lookup. This approach is optimised for speed, and as a result the share of Wikipedia backend responses that complete within 50ms grew by 20%.
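The difference between the two lookup strategies can be illustrated with a minimal sketch. This is a hypothetical Python illustration, not MediaWiki's actual autoloader; the class names, directories, and function names are all made up:

```python
import os

# Hypothetical illustration of classmap vs. directory-scan autoloading.
# A precomputed classmap turns class resolution into a single dict lookup,
# while the fallback path must probe candidate directories on disk.

CLASSMAP = {
    # class name -> file path, generated ahead of time
    "MediaWiki\\Page\\PageStore": "includes/page/PageStore.php",
}

SEARCH_DIRS = ["includes", "includes/page", "includes/libs"]

def resolve_via_classmap(class_name):
    """O(1) dict lookup; no filesystem access at all."""
    return CLASSMAP.get(class_name)

def resolve_via_scan(class_name, exists=os.path.exists):
    """Fallback: try each directory in turn until a matching file is found."""
    short = class_name.rsplit("\\", 1)[-1] + ".php"
    for d in SEARCH_DIRS:
        candidate = os.path.join(d, short)
        if exists(candidate):
            return candidate
    return None

path = resolve_via_classmap("MediaWiki\\Page\\PageStore")  # direct lookup, no disk I/O
```

The scan variant pays one or more filesystem checks per class on every cold lookup, which is exactly the cost the classmap approach avoids.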

Read more at https://techblog.wikimedia.org/2024/01/16/web-perf-hero-mate-szabo/


Valentín Gutierrez

Valentín (@Vgutierrez) improved ATS backend p75 latency by 25%, and reduced ATS p999 disk read latency by up to 1000x.

Read more at https://techblog.wikimedia.org/2022/11/21/web-perf-hero-valentin-gutierrez/

Amir Sarabadani

Over the past six months, Amir (@Ladsgroup) significantly reduced the processing time and cost for saving edits in MediaWiki. Not just once, but several times! We measure this processing time through Backend Save Timing (docs). This metric encompasses time spent on the web server, from process start, until the response is complete and flushed to the client.

Amir expanded MediaWiki's ContentHandler component with an ability for content models to opt out of eagerly generating HTML (T285987). On Wikipedia we generate HTML while saving an edit. This is necessary because HTML is central to how wikitext is parsed, and generating HTML ahead of time speeds up pageviews. On Wikidata, this is not the case. Wikidata entities (example) can be validated and stored without rendering an HTML page. Wikidata is also characterised by having a majority of edits come from bots, and the site receives far fewer pageviews proportional to its edits (where Wikipedia has ~1000 pageviews per edit,[1] Wikidata has ~10[2]). This does not even account for Wikidata edits generally being made in sessions of several micro-edits.

Amir adopted this new opt-out in the Wikibase extension, which powers Wikidata. This lets Wikidata skip the HTML generation step whenever possible. He also identified and fixed an issue with the SpamBlacklist extension (T288639), that prevented the Wikidata optimisation from working. The spam filter acts on links in the content via Parser metadata, but it requested a full ParserOutput object with HTML, rather than metadata.
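The opt-out pattern described above can be sketched in a few lines. This is a hypothetical Python illustration, not the real ContentHandler or Wikibase API; the class and attribute names are invented for the example:

```python
# Hypothetical sketch of the eager-HTML opt-out: a content model declares
# whether saving an edit requires rendering HTML, and the save path only
# renders when the model asks for it.

class ContentHandler:
    # Default: render HTML eagerly on save (the Wikipedia/wikitext case).
    generate_html_on_save = True

class WikitextHandler(ContentHandler):
    pass

class EntityHandler(ContentHandler):
    # Wikidata-style content: validated and stored without a rendered page.
    generate_html_on_save = False

def save_edit(handler, render, store):
    """Store the edit; render HTML only if the content model wants it."""
    store()
    if handler.generate_html_on_save:
        render()
    return handler.generate_html_on_save
```

A consumer that only needs link metadata (as with the SpamBlacklist fix) would likewise request metadata rather than a full render, so the opted-out path stays render-free.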

Amir's work cut latencies by half. The wbeditentity API went from upwards of 1.5s at the 95th percentile to under 0.7s, and the 75th percentile from 0.6-1.0s down to 0.4-0.5s (Grafana). Internal metrics show where this difference originates. The EditEntity.attemptSave.p95 metric went from 0.5-0.7s down to 0.2-0.3s, and EditEntity.EditFilterHookRunner.avg from 0.2-0.3s to consistently under 0.1s (Grafana).



SD0001

@SD0001 implemented Package files for Gadgets (T198758). This enables gadget maintainers to bundle JSON files, unpacked via require(). It improves performance by avoiding delays from extra web requests, and improves security by allowing safe contributions to JSON pages, which are pure data with syntax validated on edit. Previously, admins on Wikimedia wikis would need script-editing access for this, and rely on copy-paste instructions from another person via the talk page.
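The bundling idea can be sketched as follows. This is a hypothetical Python illustration of the concept, not the ResourceLoader or Gadgets API; the function names and file names are made up:

```python
import json

# Hypothetical sketch of package files: a gadget's script and its JSON data
# pages ship in one payload, and a require()-style lookup resolves data from
# the bundle locally instead of issuing another web request.

def build_bundle(files):
    """Pack script and data pages into a single response payload."""
    return dict(files)

def require(bundle, name):
    """Resolve a packaged file; JSON pages are parsed on access."""
    content = bundle[name]
    if name.endswith(".json"):
        return json.loads(content)
    return content

bundle = build_bundle({
    "gadget.js": "/* main script */",
    "config.json": '{"limit": 25}',
})
config = require(bundle, "config.json")  # resolved locally, no extra request
```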

SD0001 also introduced Module::getSkins in ResourceLoader, and used it in the startup module to optimise away unneeded module registrations. We just shipped the first adoption of this for Gadgets (T236603). In the future, we'll use this to optimise MediaWiki's own skin modules as well.


Umherirrender

@Umherirrender has initiated and carried out significant improvements to the performance of MediaWiki user preferences (T278650, T58633, and T291748). The impact is felt widely throughout Wikimedia sites: for example, when switching languages via the ULS selector, exploring Beta Features and Gadgets, or switching skins. These are all powered by the MediaWiki "Preferences" component.

The work included implementing support for deferred message parsing in more HTMLForm classes, and applying this to the Echo and Gadgets extensions. This cut API latency by over 50%, from 0.7s to 0.3s at the median, and 1.2s to 0.5s at p95. (See graphs at T278650#7130951).
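Deferred message parsing boils down to lazy evaluation. The following is a minimal hypothetical Python sketch of that idea, not the actual HTMLForm API; the class and function names are invented:

```python
# Hypothetical sketch of deferred message parsing: instead of parsing every
# field's help message while building a form, each message holds the raw
# wikitext plus a parse function, and parsing happens only on first access.

class DeferredMessage:
    def __init__(self, wikitext, parse):
        self._wikitext = wikitext
        self._parse = parse
        self._html = None  # cache: parse at most once

    @property
    def html(self):
        if self._html is None:
            self._html = self._parse(self._wikitext)
        return self._html

parse_calls = []
def fake_parse(text):
    parse_calls.append(text)
    return "<p>" + text + "</p>"

# Building a form with many messages triggers no parsing at all;
# only messages actually read get parsed.
messages = [DeferredMessage(f"help-{i}", fake_parse) for i in range(3)]
```

Messages that are never rendered cost nothing, which is where the latency savings in a preferences API request come from.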

Kunal Mehta

@Kunal's work investigating and fixing performance differences during the Debian Buster upgrade was critical in understanding and mitigating the performance impact of that migration. If it weren't for his initiative, the issue might have gone unnoticed or underestimated for some time, and been much harder to understand and deal with.

Giuseppe Lavagetto

@Giuseppe's in-depth blog post about Envoy and PHP, and all the underlying work he did, show that he's willing to go the extra mile to improve the performance of our systems.


Nick Ray

Nick's in-depth analysis of the impact of DOM order on performance was excellent, and shows how much work he does to ensure that he's building performant features.

Jon Robson

We hereby recognise the excellence of Jon's work converting image lazy loading to use IntersectionObserver, one of many projects he initiated to improve the performance of our sites.


  1. "9 billion pageviews, 5 million edits" – Wikimedia Stats: en.wikipedia.org, April 2022.
  2. "500 million pageviews, 20 million edits" – Wikimedia Stats: wikidata.org, April 2022.