gecko-dev/docs/performance/perfstats.md


# PerfStats

PerfStats is a framework for the low-overhead selective collection of internal performance metrics. The results are accessible through ChromeUtils, Browsertime output, and in select performance tests.

## Adding a new PerfStat

Define the new PerfStat by adding it to the list of metrics in PerfStats.h. Then, in C++ code, wrap the execution you want to measure in an RAII object, e.g.

```cpp
PerfStats::AutoMetricRecording<PerfStats::Metric::MyMetric> autoRecording;
```

or call the following function manually:

```cpp
PerfStats::RecordMeasurement(PerfStats::Metric::MyMetric, Start, End);
```

For incrementing counters, use the following:

```cpp
PerfStats::RecordMeasurementCount(PerfStats::Metric::MyMetric, incrementCount);
```

Here's an example of a patch where a new PerfStat was added and used.
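As a standalone illustration of the RAII idiom behind `AutoMetricRecording`, here is a minimal sketch. All names (`AutoRecordingSketch`, `sRecordedNs`, `MeasuredWork`) are hypothetical and not part of the real Gecko API; the real class records the elapsed interval into PerfStats' internal tables for its `Metric` template parameter.

```cpp
#include <chrono>
#include <cstdint>

// Hypothetical stand-in for PerfStats' internal per-metric storage.
inline uint64_t sRecordedNs = 0;

// Sketch of the RAII pattern: the constructor notes the start time and the
// destructor records the elapsed interval when the object leaves scope.
class AutoRecordingSketch {
 public:
  AutoRecordingSketch() : mStart(std::chrono::steady_clock::now()) {}
  ~AutoRecordingSketch() {
    sRecordedNs += std::chrono::duration_cast<std::chrono::nanoseconds>(
                       std::chrono::steady_clock::now() - mStart)
                       .count();
  }

 private:
  std::chrono::steady_clock::time_point mStart;
};

// Everything inside this function's scope is attributed to the metric,
// because the recording object is destroyed on function exit.
uint64_t MeasuredWork() {
  AutoRecordingSketch recording;  // measurement ends at scope exit
  uint64_t sum = 0;
  for (uint64_t i = 0; i < 1000; ++i) {
    sum += i;
  }
  return sum;
}
```

Declaring a named object (rather than a nameless temporary) matters: a temporary is destroyed at the end of the full expression, so it would record an interval of roughly zero.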

## Enabling collection

To enable collection, call `ChromeUtils.SetPerfStatsCollectionMask(MetricMask mask)`, where `mask = 0` disables all metrics and `mask = 0xFFFFFFFF` enables all of them. `MetricMask` is a bitmask derived from `Metric`: for example, `Metric::LayerBuilding` (value 2) corresponds to the bit `1 << 2` in `MetricMask`.
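The mask computation can be sketched with a small helper. The `MaskFor` name is hypothetical and not part of the PerfStats API; only the `Metric::LayerBuilding == 2` index comes from the text above.

```cpp
#include <cstdint>

// Hypothetical helper: build a MetricMask bit from a metric's integer value.
constexpr uint64_t MaskFor(unsigned aMetricIndex) {
  return 1ull << aMetricIndex;
}

// Enabling Metric::LayerBuilding (value 2) sets bit 1 << 2, i.e. 0x4.
// Masks for several metrics are combined with bitwise OR.
constexpr uint64_t kLayerBuildingMask = MaskFor(2);
```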

## Accessing results

Results can be accessed with `ChromeUtils.CollectPerfStats()`. The Browsertime test framework will sum results across processes and report them in its output. The raptor-browsertime Windows essential pageload tests also collect all PerfStats.
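The cross-process summing step can be sketched as follows. The data shape here (one metric-name to total map per process) is an assumption for illustration only, not Browsertime's actual result format.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch: sum each metric's totals across per-process result
// maps, as a harness like Browsertime does before reporting one number per
// metric in its output.
std::map<std::string, double> SumAcrossProcesses(
    const std::vector<std::map<std::string, double>>& aPerProcess) {
  std::map<std::string, double> total;
  for (const auto& process : aPerProcess) {
    for (const auto& [metric, value] : process) {
      total[metric] += value;
    }
  }
  return total;
}
```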