GTest (googletest) is Google's framework for writing C++ tests on a variety of platforms (Linux, Mac OS X, Windows, Cygwin, Windows CE, and Symbian). Based on the xUnit architecture, it supports automatic test discovery, a rich set of assertions, user-defined assertions, death tests, fatal and non-fatal failures, value- and type-parameterized tests, various options for running the tests, and XML test report generation.
Integration
GTest is run as part of the build and test jobs on Windows, Mac, and Linux. It is not yet supported on mobile (although it has been run there manually before). The tests currently execute from the test package and report under the GTest symbol on Treeherder (formerly under the B(uild) job as part of the 'make check' phase). Search the logs for 'gtest' to find your results.
Running tests
The Mozilla build process will build GTest on supported platforms as long as you don't disable tests in your mozconfig. However, to save an expensive second linking step, xul-gtest will only be built when the tests are actually required.
To run the unit tests, invoke 'mach gtest' in your Gecko checkout.
Running selected tests
Tests can be selected using mach. You can also use the environment variables supported by GTest. See Running Test Programs: Running a Subset of the Tests in the GTest documentation for more details.
mach gtest Moz2D.*
Configuring GTest
GTest can also be controlled with other environment variables. See Running Test Programs: Advanced Options in the GTest documentation for more details.
Debugging a GTest Unit Test
To debug a gtest, pass --debug to the normal command.
./mach gtest --debug [ Prefix.Test ]
If that doesn't work, you can try running the firefox binary under the debugger with the MOZ_RUN_GTEST environment variable set to 1.
MOZ_RUN_GTEST=1 ./mach run --debug [--debugger gdb]
Don't forget to build and run 'mach gtest' first to relink when using MOZ_RUN_GTEST, since the gtest libxul is not produced by a top-level build.
Note that this will load an alternate libxul - the one with the test code built in, which resides in the gtest/ subdirectory of your objdir. This gtest-enabled libxul is not built as part of the regular build, so you must ensure that it is built before running the above command. A simple way to do this is to run "mach gtest", which will rebuild libxul and run the tests. You can also extract the commands needed to rebuild just that libxul from mach and run them directly. Finally, note that you may have to run through the tests once for gdb to load all the relevant libraries and for breakpoint symbols to resolve properly.
Note that you can debug a subset of the tests (including a single test) by using the GTEST_FILTER environment variable:
GTEST_FILTER='AsyncPanZoom*' MOZ_RUN_GTEST=1 ./mach run --debug [--debugger gdb]
Writing a GTest Unit Test
Most of the GTest documentation will apply here. The GTest primer is a recommended read. For quick-start documentation, see the GTest ForDummies page.
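As a quick illustration (not taken from the tree; the suite and test names below are hypothetical), a standalone GTest looks like this:

#include "gtest/gtest.h"

// Hypothetical example: 'Moz2D' is the test suite name, 'BasicMath' the test name.
TEST(Moz2D, BasicMath) {
  int sum = 1 + 2;
  EXPECT_EQ(3, sum);                                // non-fatal assertion; the test continues on failure
  ASSERT_TRUE(sum > 0) << "sum must be positive";   // fatal assertion; aborts this test on failure
}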
GTest will run tests in parallel. Don't add unit tests that are not thread-safe, such as tests that require focus or use specific sockets.
GTest will run without initializing Mozilla services. Initialize and tear down any dependencies you have in your test fixtures. Avoid writing integration tests and focus on testing individual units.
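Since GTest does not spin up Mozilla services for you, a test fixture is the usual place for that setup and teardown. A minimal sketch, assuming a hypothetical fixture (the names below are illustrative only):

#include "gtest/gtest.h"

// Hypothetical fixture: SetUp() and TearDown() run before and after each TEST_F,
// so every test starts from a known state and cleans up its own dependencies.
class MyModuleTest : public ::testing::Test {
protected:
  void SetUp() override {
    mBuffer = new int[16];   // initialize whatever the tests depend on
  }
  void TearDown() override {
    delete[] mBuffer;        // release anything acquired in SetUp
  }
  int* mBuffer = nullptr;
};

TEST_F(MyModuleTest, BufferIsAllocated) {
  ASSERT_TRUE(mBuffer != nullptr);
}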
See https://hg.mozilla.org/mozilla-central/rev/351462147f91 for an example of how to add a simple test.
If you're converting an existing C++ unit test to a GTest, this commit may serve as a useful reference.
Adding a test to the build system
Find a gtest directory appropriate for the module. If none exists, create one using the following convention: '<submodule>/tests/gtest'. Create a moz.build file in the newly created directory with a Library declaration, replacing gfxtest with a unique name, and set UNIFIED_SOURCES to contain all of the test file names.
What we're doing here is creating a list of source files that will be compiled and linked only against the gtest version of libxul. This will let these source files call internal xul symbols without making them part of the binary we ship to users.
# -*- Mode: python; c-basic-offset: 4; indent-tabs-mode: nil; tab-width: 40 -*-
# vim: set filetype=python:
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.

Library('gfxtest')

UNIFIED_SOURCES = [
    <ListTestFiles>,
]

FINAL_LIBRARY = 'xul-gtest'
Update '<submodule>/moz.build' in the parent directory to build your new subdirectory:
if CONFIG['ENABLE_TESTS']:
    DIRS += ['tests/gtest']
When adding tests to an existing moz.build file (one that already has FINAL_LIBRARY = 'xul-gtest'), just add the following. That's it--no test manifest is required. Your tests will be registered automatically using a static constructor.
UNIFIED_SOURCES = [
    'TestFoo.cpp',
]
Notes
The include file for the class you are testing may not need to be globally exported, but it does need to be made available to the unit test you are writing. In that case, add something like this to the moz.build file in the testing directory.
LOCAL_INCLUDES += [
    '/gfx/2d',
    '/gfx/2d/unittest',
    '/gfx/layers',
]
GTests currently run from the test package under the GTest symbol on Treeherder, which is where to look if you want to verify that your test is working. Formerly they were run under the B symbol, during `make check`.
MozGTestBench
A Mozilla GTest microbench is just a GTest that reports the test duration to Perfherder. It's an easy way to add a low-level performance test. Keep in mind that there is a non-zero cost to monitoring performance tests, so use them sparingly. You can still perform test assertions.
Writing a Microbench GTest
Use 'MOZ_GTEST_BENCH' instead of 'TEST' to time the execution of your test. Example:
#include "gtest/MozGTestBench.h" // For MOZ_GTEST_BENCH

...

MOZ_GTEST_BENCH(GfxBench, TEST_NAME, []{
  // Test to time the execution
});
Make sure this file is registered with the build system using the instructions above. If everything worked correctly, you should see this in the GTest log for your corresponding test:
PERFHERDER_DATA: {"framework": {"name": "platform_microbench"}, "suites": [{"name": "GfxBench", "subtests": [{"name": "CompositorSimpleTree", "value": 252674, "lowerIsBetter": true}]}]}
Sheriffing policy
Microbench tests measure the speed of a very specific operation. A regression in a micro-benchmark may not lead to a user-visible regression and should not be treated as strictly as a Talos regression. Large changes in microbench scores are also expected when the relevant code is directly modified, and should be accepted if the developer intended to change that code. Micro-benchmarks, however, provide a framework for adding performance tests for platform code and regression tests for performance fixes. They will catch unintended regressions in code and, when correlated with a Talos regression, might indicate the source of the regression.