mirror of
https://github.com/mozilla/gecko-dev.git
synced 2024-10-22 09:45:41 +00:00
# wrench

`wrench` is a tool for debugging webrender outside of a browser engine.
## headless

`wrench` has an optional headless mode for use in continuous integration. To run in headless mode, use `./headless.py args` instead of `cargo run -- args`.
## `replay` and `show`

Binary recordings can be generated by webrender and replayed with `wrench replay`. Enable binary recording in `RendererOptions`:
```rust
RendererOptions {
    ...
    recorder: Some(Box::new(BinaryRecorder::new("wr-frame.bin"))),
    ...
}
```
If you are working on gecko integration, you can enable recording in `webrender_bindings/src/bindings.rs` by setting:

```rust
static ENABLE_RECORDING: bool = true;
```
`wrench replay --save yaml` will convert the recording into frames described in YAML. Frames can then be replayed with `wrench show`.
## reftest

Wrench also has a reftest system for catching regressions.
- To run all reftests, run `script/headless.py reftest`.
- To run specific reftests, run `script/headless.py reftest path/to/test/or/dir`.
- To examine test failures, use the reftest analyzer.
- To add a new reftest, create an example frame and a reference frame in `reftests/` and then add an entry to `reftests/reftest.list`.