
User:BKing (WMF)/Notes/Benchmarking Criteria

From Wikitech

Brendan Gregg's recommendations for benchmarking as a checklist (BGRfBaaC)

The following is taken from chapter 12 of "Systems Performance: Enterprise and the Cloud" Second Edition by Brendan Gregg.

12.1.1 Why am I Benchmarking? (select one or more)

  • System design
  • Proofs of concept
  • Tuning
  • Development
  • Capacity planning
  • Troubleshooting

12.1.2 How can I ensure my benchmarks have the following qualities?

  • Repeatable
  • Observable
  • Portable
  • Coherent/Cogent
  • Realistic
  • Runnable

12.1.2A Benchmark Analysis

  • What is being tested?
  • What is the limiting factor (or factors)?
  • What transient issues might affect the results (Puppet run, noisy neighbor, etc.)?
  • What conclusions can we draw from the results?

12.2 What benchmark types will I use?

12.2.1 Micro-Benchmarking
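A micro-benchmark isolates one operation and times it in a tight loop. As a minimal sketch (not from the book; the 4 KiB write+fsync target, file path, and repetition counts are illustrative choices), the standard-library `timeit` module can drive such a loop and report per-operation cost across several runs, which also helps check repeatability:

```python
import os
import timeit

# Hypothetical micro-benchmark target: one 4 KiB write followed by
# fsync, isolated from any larger workload.
PATH = "/tmp/microbench.dat"

def write_4k(data=b"x" * 4096):
    with open(PATH, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())

# Repeat the whole loop several times so run-to-run variance is visible.
runs, per_run = 5, 100
times = timeit.repeat(write_4k, number=per_run, repeat=runs)
for i, t in enumerate(times, 1):
    print(f"run {i}: {t / per_run * 1e6:.1f} us per write+fsync")
os.remove(PATH)
```

Comparing the per-run figures is a quick repeatability check: large spread between runs suggests a transient factor is interfering.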

12.2.2 Simulation

12.2.3 Replay

12.3 What Methodologies will I use?

12.3.2 Active benchmarking

12.3.4 USE method
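The USE method checks each resource for Utilization, Saturation, and Errors while the benchmark runs. A rough sketch for the CPU resource, assuming a Unix host (the proxies below are simplifications of my own, not Gregg's tooling; real error counts would come from logs or kernel counters and are stubbed here):

```python
import os

# USE snapshot for the CPU resource, using load average as a proxy.
ncpu = os.cpu_count() or 1
load1, load5, load15 = os.getloadavg()

utilization = min(load1 / ncpu, 1.0)  # rough proxy: runnable tasks / CPUs
saturation = max(load1 - ncpu, 0.0)   # tasks queued beyond CPU capacity
errors = None                         # stub: no error counter sampled here

print(f"CPU utilization proxy: {utilization:.0%}")
print(f"CPU saturation proxy:  {saturation:.2f} runnable tasks over capacity")
print(f"CPU errors:            {errors!r} (not collected in this sketch)")
```

Sampling a snapshot like this before, during, and after the benchmark shows which resource hit its limit first.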

12.3.5 Workload Characterization

12.3.7 Ramping load
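Ramping load means stepping up the offered load and recording throughput at each level until it plateaus or degrades. A minimal sketch, where `target()` is a hypothetical 1 ms stand-in for the system under test and the concurrency levels are arbitrary:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def target():
    time.sleep(0.001)  # stand-in for a ~1 ms request to the system under test

def measure(concurrency, requests=200):
    # Issue a fixed number of requests at the given concurrency and
    # return observed throughput in requests per second.
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for f in [pool.submit(target) for _ in range(requests)]:
            f.result()
    return requests / (time.monotonic() - start)

# Ramp: double concurrency each step and watch where throughput flattens.
results = {c: measure(c) for c in (1, 2, 4, 8)}
for c, rps in results.items():
    print(f"concurrency {c}: {rps:.0f} req/s")
```

The level at which throughput stops scaling points at the limiting factor, which feeds directly into the "why weren't the results twice as good" question below.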

12.3.9 Statistical analysis
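Statistical analysis means working from the raw samples rather than a single summary number. As a sketch with made-up latency samples (the values are illustrative only; one deliberate outlier shows why the mean alone misleads), the standard-library `statistics` module covers the basics:

```python
import statistics

# Hypothetical latencies (ms) from repeated benchmark runs; in practice
# these come from the tool's raw output, not its summary line.
samples = [12.1, 11.8, 12.4, 13.0, 11.9, 12.2, 25.7, 12.0, 12.3, 12.1]

mean = statistics.mean(samples)
median = statistics.median(samples)
stdev = statistics.stdev(samples)
p95 = statistics.quantiles(samples, n=20)[18]  # 95th percentile cut point
cov = stdev / mean                             # coefficient of variation

print(f"mean   {mean:.2f} ms")
print(f"median {median:.2f} ms")
print(f"p95    {p95:.2f} ms")
print(f"CoV    {cov:.2f}")
```

Here the single outlier drags the mean well above the median; reporting median, p95, and CoV alongside the mean makes that visible instead of hiding it.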

12.3.10 Benchmarking Checklist

  • Why weren't the results twice as good (in other words, what was the limiting factor)?
  • Did it break limits?
  • Did it err?
  • Is it reproducible?
  • Does it matter?
  • Did anything actually happen (for example, the benchmark finished quickly because it couldn't actually log in due to firewall rules)?
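The last item can be made mechanical: verify the benchmark left behind the side effects it claims. A minimal sketch, assuming a hypothetical write benchmark whose expected output size is known in advance:

```python
import os

# Sanity check for "did anything actually happen?": after the benchmark,
# confirm the bytes on disk match what the tool claimed to write.
expected_bytes = 4096 * 100
path = "/tmp/bench.out"

with open(path, "wb") as f:
    f.write(b"x" * expected_bytes)  # stand-in for the benchmark itself

actual = os.path.getsize(path)
assert actual == expected_bytes, f"wrote {actual} bytes, expected {expected_bytes}"
print("sanity check passed: benchmark produced the expected output")
os.remove(path)
```

A benchmark that "finishes" in near-zero time with no artifacts is the classic sign that it never did the work at all.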