Performance/User Journey

User journeys are performance measurements that span multiple interactions and/or browser navigations.

Historically, the Performance Team has primarily measured the performance of single page views (e.g. cold cache first views, or repeat views) where the device visits a particular Wikipedia page. This has worked well and results in very stable metrics. Testing multiple page loads together results in less stable metrics, but it can still be valuable, as it gives us a more complete picture.

We measure user journeys using Browsertime/sitespeed.io scripting. That gives us a controlled way of creating scripts and of collecting the data in the same way as we do for the tests we run with WebPageReplay and WebPageTest.

You can find our user journeys in the drill-down page in Grafana; make sure that the Test type starts with userJourney.

User Journeys

These are the user journeys we measure today.

Desktop

Go to all test scripts.

Measure pages as a logged in user

We have one user journey where we log in a user, measure the login step, and then access three pages (as a logged in user). The journey looks like this: https://en.wikipedia.org/w/index.php?title=Special:UserLogin&returnto=Main+Page -> login the user -> https://en.wikipedia.org/wiki/Barack_Obama -> https://en.wikipedia.org/wiki/Facebook -> https://en.wikipedia.org/wiki/Sweden.

The metrics are collected under the scenario loginDesktop.
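
For illustration, a hedged sketch of what such a login journey could look like with sitespeed.io scripting is shown below. The form field IDs (wpName1, wpPassword1, wpLoginAttempt) are the standard MediaWiki login form IDs and the credentials are placeholders; the actual loginDesktop.js in our repository may handle these steps differently.

module.exports = async function ( context, commands ) {
	commands.meta.setTitle( 'Login journey (sketch)' );
	// Measure the login page itself
	await commands.measure.start( 'https://en.wikipedia.org/w/index.php?title=Special:UserLogin&returnto=Main+Page' );
	// Fill in the login form. Placeholder credentials for illustration only;
	// a real script would get these from the test agent configuration.
	await commands.addText.byId( 'testUser', 'wpName1' );
	await commands.addText.byId( 'testPassword', 'wpPassword1' );
	// Measure the login step: from clicking the login button until the page is loaded
	await commands.measure.start( 'login' );
	await commands.click.byIdAndWait( 'wpLoginAttempt' );
	await commands.measure.stop();
	// Access three pages as a logged in user
	await commands.measure.start( 'https://en.wikipedia.org/wiki/Barack_Obama' );
	await commands.measure.start( 'https://en.wikipedia.org/wiki/Facebook' );
	return commands.measure.start( 'https://en.wikipedia.org/wiki/Sweden' );
};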

Search

We measure three different kinds of searches on desktop:

  • Search for "Obama Wikipedia" on Google, click the search result on the Google page, and measure loading the Obama page.
  • Go to the main page, search for Barack Obama in the header, and measure the time from pressing the search button until the result page is shown.
  • Go to the Special:Search page, search for Barack Obama, and measure the time from pressing the search button until the result page is shown.

The metrics are collected under the scenario search with the page names googleObama, headerSearchObama and searchPageObama.
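
To make the header search journey concrete, a hedged sketch is shown below. The element IDs (searchInput, searchButton) are the legacy MediaWiki/Vector search form IDs and are assumptions here; check the actual search.headerObama.js script for the exact selectors.

module.exports = async function ( context, commands ) {
	commands.meta.setTitle( 'Header search (sketch)' );
	// Start on the main page without measuring it
	await commands.navigate( 'https://en.wikipedia.org/wiki/Main_Page' );
	// Type the query into the header search box (element IDs are assumptions)
	await commands.addText.byId( 'Barack Obama', 'searchInput' );
	// Measure from pressing the search button until the result page is shown
	await commands.measure.start( 'headerSearchObama' );
	await commands.click.byIdAndWait( 'searchButton' );
	return commands.measure.stop();
};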

Second view page views

Most of our testing focuses on accessing Wikipedia with an empty browser cache. In the second view scenario we measure the next page view, when the browser cache has already been filled with some assets.

We have three tests: secondViewDesktop.elizabeth.js, secondViewDesktop.facebook.js and secondViewDesktop.mainpage.js (see the file structure further down this page).

The metrics are collected under the scenario secondViewDesktop.
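
The pattern for a second view script can be sketched as follows, assuming the journey first loads one page to fill the browser cache and then measures the next page. The pages used here are examples picked from the script file names listed further down; the real scripts may use different combinations.

module.exports = async function ( context, commands ) {
	commands.meta.setTitle( 'Second view (sketch)' );
	// First page view fills the browser cache with shared assets (not measured)
	await commands.navigate( 'https://en.wikipedia.org/wiki/Main_Page' );
	// Second page view is measured with a partly warm cache
	return commands.measure.start( 'https://en.wikipedia.org/wiki/Facebook', 'facebook' );
};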

Emulated mobile

Go to all test scripts.

Measure pages as a logged in user

We have one user journey where we log in a user, measure the login step, and then access three pages (as a logged in user). The journey looks like this: https://en.m.wikipedia.org/w/index.php?title=Special:UserLogin&returnto=Main+Page -> login the user -> https://en.m.wikipedia.org/wiki/Barack_Obama -> https://en.m.wikipedia.org/wiki/Facebook -> https://en.m.wikipedia.org/wiki/Sweden.

The metrics are collected under the scenario loginEmulatedMobile.

Search

We measure two different kinds of searches on mobile:

  • Go to Google, search for "Obama Wikipedia", click the search result for the Obama page, and measure that page load.
  • Go to the main page, search for Barack Obama, and measure the time to load the Barack Obama page.

The metrics are collected under the scenario searchEmulatedMobile with the page names googleObama and searchPageObama.

Second view page views

In the second view scenario we measure the next page view, when the browser cache has already been filled with some assets.

We have two tests: secondViewEmulatedMobile.elizabeth.js and secondViewEmulatedMobile.facebook.js (see the file structure further down this page).

The metrics are collected under the scenario secondViewEmulatedMobile.

Add your own user journey

All user journeys use sitespeed.io scripting.

Background

All configuration files exist in our synthetic monitoring tests repo. Clone the repo and go into the tests folder:

git clone ssh://USERNAME@gerrit.wikimedia.org:29418/performance/synthetic-monitoring-tests.git
cd synthetic-monitoring-tests/tests

All test files live in that directory. User journey tests exist in the sitespeedio directories named scripts (they consist of script files instead of just plain text files with URLs):

  • desktop/scripts - the scripts that run with desktop settings, testing the desktop version of Wikipedia.
  • emulatedMobile/scripts - the scripts that run with the emulated mobile settings, testing the mobile version of Wikipedia.

The file structure looks something like this:

.
├── sitespeedio
│   ├── desktop
│   │   ├── scripts
│   │   │   ├── loginDesktop.js
│   │   │   ├── saveTiming.enwiki.js
│   │   │   ├── search.googleObama.js
│   │   │   ├── search.headerObama.js
│   │   │   ├── search.pageObama.js
│   │   │   ├── secondViewDesktop.elizabeth.js
│   │   │   ├── secondViewDesktop.facebook.js
│   │   │   └── secondViewDesktop.mainpage.js
│   │   └── urls
│   │       ├── alexaTop10.txt
│   │       └── desktop.txt
│   └── emulatedMobile
│       ├── scripts
│       │   ├── loginEmulatedMobile.js
│       │   ├── searchEmulatedMobile.googleObama.js
│       │   ├── searchEmulatedMobile.pageObama.js
│       │   ├── secondViewEmulatedMobile.elizabeth.js
│       │   └── secondViewEmulatedMobile.facebook.js
│       └── urls
│           ├── alexaMobileTop10.txt
│           └── emulatedMobile.txt

The first part of the filename (before the first dot) is used as the key of the namespace in Graphite. Use it to group related tests under the same group. For example, on mobile we do two different search tests: one searches on Google and then measures the Wikipedia load time, and the other searches with our internal search. Those files are named:

  • searchEmulatedMobile.googleObama.js
  • searchEmulatedMobile.pageObama.js

searchEmulatedMobile is then the namespace in Graphite. That namespace key is used as the scenario name. Check out the User Journey dashboard and change the scenario dropdown.

[Screenshot: the scenario dropdown on the User Journey dashboard]

The user journeys use sitespeed.io scripting to navigate and test the journey. The search test on mobile looks something like this:

module.exports = async function ( context, commands ) {
	commands.meta.setTitle( 'Test coming from Google search' );
	commands.meta.setDescription( 'Search for Obama Wikipedia and click on the search result for the Obama page and measure that page' );
	await commands.navigate( 'https://www.google.com' );
	await commands.addText.byName( 'Obama Wikipedia', 'q' );
	await commands.wait.byTime( 2000 );
	await commands.click.byClassNameAndWait( 'Tg7LZd' );
	// Hide the content to avoid picking up the click as first visual change
	await commands.js.run( 'for (let node of document.body.childNodes) { if (node.style) node.style.display = "none";}' );
	await commands.measure.start( 'googleObama' );
	await commands.click.byXpathAndWait( "//a[@href='https://en.m.wikipedia.org/wiki/Barack_Obama']" );
	return commands.measure.stop();
};

Add

  • Create your new file in the correct directory (desktop or emulated mobile).
  • Add the scripting to your file.
  • Test the script locally (see the example command after this list). If you have a desktop test, verify that it works for both Firefox and Chrome.
  • Commit the new script and get someone to review your change.
  • When approved, the script automatically runs in the next iteration on the test agent and sends the metrics to Graphite.
  • If you add a new kind of user journey, document it on this page.
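
To try a script locally before committing, you can run it directly with sitespeed.io. A minimal example, assuming sitespeed.io is installed globally through npm, that your sitespeed.io version uses the --multi flag for script files, and using myJourney.js as a placeholder file name (run from the tests directory):

npm install -g sitespeed.io
sitespeed.io --multi --browser firefox -n 1 sitespeedio/desktop/scripts/myJourney.js

Run it again with --browser chrome to cover both browsers for desktop tests.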