<!doctype html public "-//w3c//dtd html 4.0 transitional//en">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta name="Author" content="Marc Attinasi">
<meta name="GENERATOR" content="Mozilla/4.7 [en] (WinNT; U) [Netscape]">
<title>Performance Tools for Gecko</title>
<style>
BODY { margin: 1em 2em 1em 2em; background-color: bisque }
H1, H2, H3 { background-color: black; color: bisque; }
TABLE.boxed { border-width: 1px; border-style: dotted; }
</style>
</head>
<body>
<dl>&nbsp;
<table WIDTH="100%" >
<tr>
<td>
<center><img SRC="mozilla-banner.gif" height=58 width=600></center>
</td>
</tr>
</table>
<center><table COLS=1 WIDTH="80%" class="boxed" >
<tr>
<td>
<center>
<h2>
Performance Monitoring for Gecko</h2></center>
<center>
<dd>
<i>maintainer:&nbsp; marc attinasi&nbsp;</i></dd></center>
<center>
<dd>
<i><a href="mailto:attinasi@netscape.com">attinasi@netscape.com</a></i></dd></center>
</td>
</tr>
</table></center>
<center>
<dd>
</dd></center>
</dl>
<h3>
Brief Overview</h3>
Gecko should be <i>fast</i>. To help us make sure that it is, we monitor
the performance of the system, specifically in terms of Parsing, Content
Creation, Frame Creation and Style Resolution - the core aspects of layout.
A small set of tools facilitates the monitoring of performance across build
cycles: they work in conjunction with program output from the Mozilla or
Viewer applications to produce tables of performance values and historical
comparisons with builds analysed in the past. The tools, their
dependencies and their general care and feeding are the topics of this
document.
<h4>
Usage: A five-step plan to enlightenment</h4>
<ul>
<li>
First, the tools are all designed to run only on Windows. That is really
a bummer, but since most of what we are measuring is cross-platform (XP)
code it should not really matter. Get a Windows NT machine if you want to
run the tools.</li>
<li>
Next, you need a build that was created with performance monitoring enabled.
To create such a build you must compile the Mozilla source with a special
environment variable set. This environment variable turns on code that
accumulates and dumps performance metrics data. The environment variable
is: <b>MOZ_PERF=1</b>. Set this environment variable and then build all
of Mozilla. If you can obtain a build that was built with MOZ_PERF=1 set,
then you can just use that build.</li>
<li>
Third, run the script <b>perf.pl</b> to execute Viewer and run through
the test sites gathering performance data.</li>
<li>
Fourth, make sure the script completed and then open the resultant HTML
file, which is dropped in the Tables subdirectory.</li>
<li>
Lastly, stare at the table and the values in it and decide if performance
is getting better, worse, or staying the same (an example command session
follows this list).</li>
</ul>
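<p>For example, a complete run from a Windows NT command prompt might look
like the sketch below. The build command, drive letters and build name are
only placeholders - substitute your own; the <tt>perf.pl</tt> arguments are
explained in the next section.</p>
<pre>
rem -- build with the performance instrumentation enabled --
set MOZ_PERF=1
rem (build the tree as usual, e.g. nmake -f client.mak build_all)

rem -- run the performance script against that build --
cd \mozilla\tools\performance\layout
perl perf.pl Daily-0215 s:\mozilla\0215 cpu

rem -- then open the resulting HTML table from the Tables subdirectory --
</pre>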
<h3>
The PerfTools</h3>
<blink>IMPORTANT: </blink>The tools created for monitoring performance
are very tightly coupled to output from the layout engine. As Viewer (or
Mozilla) is run, it spits out various timing values to the console. These
values are captured to files, parsed, and assembled into HTML tables showing
the amount of CPU time dedicated to parsing the document, creating the
content model, building the frame model, and resolving style during the
building of the frame model. All of the scripts that make up the perftool
are located in the directory <tt>\mozilla\tools\performance\layout.</tt>
Running them from another location <i>may</i> work, but it is best to run
from there.
<p>The perl script, <tt>perf.pl</tt>, is used to invoke Viewer and direct
it to load various URLs. The URLs to load are contained in a text file,
one per line. The file <tt>40-URL.txt</tt> is the baseline file and contains
a listing of file-URLs that are static, meaning they never change, because
they are snapshots of popular sites. As the script executes it does two
things:
<ol>
<li>
Invoke Viewer and feed it the URL-file, capturing the output to another
file</li>
<li>
Invoke other perl scripts to process the Viewer output into HTML tables</li>
</ol>
A set of perl scripts is used to parse the output of the Viewer application.
These scripts expect the format of the performance data to remain stable;
in other words, if the format changes, the scripts must be updated to match.
Here are the files involved in parsing the data and generating the HTML
table:
<ul>
<li>
<tt><b>perf.pl</b> : </tt>The main script that orchestrates the running
of Viewer and the invocation of the other scripts, and finally copies files
to their correct final locations. An example of an invocation of the perf.pl
script is: '<b><tt><font color="#000000">perl perf.pl</font><font color="#000099">
Daily-0215 s:\mozilla\0215 cpu</font><font color="#000000">'</font></tt></b></li>
<ul>
<li>
<tt><b><font color="#000099">Daily-0215 </font></b><font color="#000000">is
the name of the build and can be anything you like.</font></tt></li>
<li>
<tt><b><font color="#000099">s:\mozilla\0215 </font></b><font color="#000000">is
the location of the build. There must be a bin directory under the directory
you specified, and it must contain the MOZ_PERF enabled build.</font></tt></li>
<li>
<tt><b><font color="#000099">cpu </font></b><font color="#000000">indicates
that we are timing CPU time. The other option is clock, but that is not
currently functional because the available clock resolution is insufficient.</font></tt></li>
</ul>
<li>
<b><tt>Header.pl</tt></b> : a simple script that generates the initial
portion of the HTML file that will show the performance data for the current
build.</li>
<li>
<tt><b>AverageTable2.pl</b> </tt>: a slightly more complicated script that
parses the output from Viewer, accumulates data for averaging, and generates
a row in the HTML table initialized by Header.pl. This file <b>must</b>
be modified if the performance data output format changes (a sketch of the
accumulate-and-average pattern follows this list).</li>
<li>
<tt><b>Footer.pl</b> </tt>: a simple script that inserts the last row in
the HTML table, the averages row. It also terminates the table and closes
the HTML document.</li>
<li>
<tt><b>GenFromLogs.pl</b> </tt>: a script that generates the HTML table
from already existing logs. This is used to regenerate a table after a
run, in case the table file is lost or otherwise needs to be recreated.
Also, if old logs are kept, they can be used to regenerate their corresponding
tables.</li>
<li>
<b><tt>Uncombine.pl</tt></b> : a script that breaks up a single text file
containing all of the timing data for all of the sites into a separate
file for each individual site.</li>
<li>
<b><tt>History.pl</tt></b> : a script that generates an HTML file showing
a historical comparison of average performance values for the current and
previous builds.</li>
</ul>
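<p>To make the data flow concrete, here is a minimal sketch of the
accumulate-and-average pattern that <tt>AverageTable2.pl</tt> follows. The
line format matched below is an assumption made only for illustration - the
real format is whatever the MOZ_PERF instrumentation currently emits - so
treat this as the shape of the solution, not the actual script.</p>
<pre>
#!/usr/bin/perl
# Illustrative sketch only: accumulate timing values from a captured
# Viewer log and emit one HTML table row of averages.
use strict;

my @metrics = ('Parsing', 'Content Creation', 'Frame Creation',
               'Style Resolution', 'Reflow');
my (%total, %count);

my $logfile = shift or die "usage: perl sketch.pl viewer-output.txt\n";
open(LOG, $logfile) or die "cannot open $logfile: $!\n";
while (&lt;LOG&gt;) {
    # Assumed format: one "Metric: seconds" line per metric per page load.
    if (/^(Parsing|Content Creation|Frame Creation|Style Resolution|Reflow):\s*([\d.]+)/) {
        $total{$1} += $2;
        $count{$1}++;
    }
}
close(LOG);

# One table row: the per-metric average over all page loads.
print "&lt;tr&gt;\n";
foreach my $metric (@metrics) {
    my $avg = $count{$metric} ? $total{$metric} / $count{$metric} : 0;
    printf "  &lt;td&gt;%.3f&lt;/td&gt;\n", $avg;
}
print "&lt;/tr&gt;\n";
</pre>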
<h3>
The URLs</h3>
It is critical that the URLs that we load while measuring performance do
not change. This is because we want to compare performance characteristics
across builds, and if the URLs changed we could not really make valid comparisons.
Also, as URLs change, they exercise different parts of the application,
so we really want a consistent set of pages to measure performance against.
The builds change, the pages do not.
<p>On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake.
These sites now reside in disk-files and are loaded from those files during
the load test. The file <tt>40-URL.txt</tt> contains a listing of the file-URLs
created from the web sites. The original web sites should be obvious from
the file-URLs.
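<p>For reference, <tt>40-URL.txt</tt> is just a plain text file with one URL
per line. The entries below are hypothetical placeholders that show the
shape of the file-URLs, not the actual paths checked into the tree:</p>
<pre>
file:///s|/mozilla/tools/performance/layout/websites/site01/index.html
file:///s|/mozilla/tools/performance/layout/websites/site02/index.html
...
</pre>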
<br>&nbsp;
<blockquote><i><b>NOTE</b>: There are some links to external images in
the local websites. These should have been resolved by WebSnake but were
not for some reason. These should be made local at some point so we can
run without a connection to the internet.</i></blockquote>
<h3>
Historical Data and Trending</h3>
Historical data will be gathered and presented to make it easy for those
concerned to see how the relative performance of various parts of the product
changes over time. This historical data is kept in a flat file of comma-delimited
values where each record is indexed by the pull-date/milestone and buildID
(note that the buildID is not always reliable; however, the pull-date/milestone
is provided by the user when the performance package is run, so it can
be made to be unique). The Historical Data and Trending table will show
the averages for Parsing, Content Creation, Frame Creation, Style Resolution,
Reflow, Total Layout and Total Page Load time for each build, along with
a simple bar graph representation of each record's weight relative to the
other records in the table. At a later date this can be extended to trend
individual sites; however, for most purposes the roll-up of overall averages
is sufficient to track the performance trends of the engine.
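<p>Each record in that flat file is a single comma-delimited line. The field
order shown below (pull-date/milestone, buildID, then the seven averages) is
only an assumed illustration with made-up numbers, not necessarily the exact
layout of the real <tt>history.txt</tt>:</p>
<pre>
Daily-0215,2000021508,0.512,0.733,1.204,0.651,0.898,3.486,5.112
</pre>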
<h3>
The Execution Plan</h3>
Performance monitoring will be run on a weekly basis, and against all Milestone
builds. The results of the runs will be published for all interested parties
to see. Interested and/or responsible individuals will review the performance
data to raise or lower developer awareness of performance problems and
issues as they arise.
<p>Currently, the results are published weekly at <a href="http://techno/users/attinasi/publish">http://techno/users/attinasi/publish</a>
<h3>
Revision Control and Archiving</h3>
The scripts are checked into cvs in the directory \mozilla\tools\performance\layout.
The history.txt file is also checked in to cvs after every run, as are
the tables produced by the run. Committing the files to cvs is a manual
operation and should be completed only when the data has been analysed
and appears valid. An example command sequence follows the list below. Be sure to:
<ol>
<li>
Commit history.txt after each successful run.</li>
<li>
Add / commit the new table and new trend-table after each successful run
(in the Tables subdirectory).</li>
<li>
Commit any changes to the scripts or this document.</li>
</ol>
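<p>A typical post-run checkin might look like the following; the table file
names under Tables are placeholders for whatever the run actually produced:</p>
<pre>
cvs commit -m "performance results for Daily-0215" history.txt
cd Tables
cvs add Daily-0215-table.html Daily-0215-trend.html
cvs commit -m "performance tables for Daily-0215" Daily-0215-table.html Daily-0215-trend.html
</pre>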
<hr WIDTH="100%">
<h3>
History:</h3>
<table BORDER WIDTH="50%" >
<tr>
<td WIDTH="25%">02/04/2000</td>
<td>Created - attinasi</td>
</tr>
<tr>
<td>03/17/2000</td>
<td>Removed QA Partner stuff - no longer used</td>
</tr>
<tr>
<td></td>
<td></td>
</tr>
<tr>
<td></td>
<td></td>
</tr>
<tr>
<td></td>
<td></td>
</tr>
</table>
</body>
</html>