A file generation target usually corresponds to a set of templates from which files are generated. A target may also provide a convenience API to use in the tests and, if necessary, auxiliary files: for example, scripts that automate the generation of test source files and requirement catalogues in a target-specific format, simplify the analysis of test results, and so on.
Custom file generation targets can be provided by the users of T2C. If a set of template groups is prepared in the appropriate format and directory structure (see Section 5, “Templates, Template Groups and File Generation with T2C”), it can be used by the file generator in exactly the same way as the c_* targets provided with it (c_standalone, c_minimal, etc.). The -d (--templates) option of t2c can be used to specify the main template directory. The -t (--target) option or the TARGET parameter in the configuration file for the suite allows you to choose the file generation target of interest (see also Section 3, “T2C Command Line Tool”).
When developing a new T2C target, it can be helpful to look at the templates, API and auxiliary files of the existing targets such as c_standalone, c_gtest_nested, etc. The way these targets are organized may not be the most effective one, but it works and can probably be at least partially applied to new targets as well.
The name of a target may indicate which programming language, testing framework, etc., it is aimed at. For example, the “c_tet” target is designed for the C programming language and the TETWare Lite test harness.
The name of a target may also indicate the conditions in which it can be used. The “c_gtest_nested” target, for example, is designed for developing tests in the C programming language with the GLib Testing Framework. The resulting tests can be used as a separate Autotools-style package nested in the package under test (see this tutorial for detailed explanations and examples), hence “nested” in the name of the target.
In general, there are no particular requirements on the names of file generation targets. It is only recommended that the names be descriptive, to give the users a hint about what the target is intended for.
Note that the c_tet, c_standalone, c_gtest_nested and c_minimal targets provided with T2C can be used for testing C++ libraries and applications as well. The generated tests can be compiled with a C++ compiler (at least g++ 3.4.2 or newer). All that is needed in this case is to override the default template of the per-test makefile, replacing $(CC) with $(CXX) and removing unnecessary compiler options (like -std=c99). In addition, you can specify the desired extension of the resulting source files in the TEST_SOURCE_EXT parameter in the configuration file of the test suite.
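For example, a fragment like the following could be added to the configuration file of the suite (a sketch; the extension value is only an illustration):
# Generate C++ source files instead of C ones
TEST_SOURCE_EXT = cpp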
The tests generated using the c_tet, c_standalone, c_gtest_nested and c_minimal targets are intended to be used on Linux operating systems. However, the file generation itself can be performed on any platform where the T2C file generator and the corresponding templates are available.
A test developer is not required to use the special API provided by T2C file generation targets in the tests. Sometimes the API provided by the chosen testing framework is enough. Nevertheless, the T2C API may prove handy, especially the requirement-related facilities and the API for message output.
Described here is the C API provided by the following T2C file generation targets:
c_standalone
c_tet
c_gtest_nested
c_minimal (most of the API consists of stubs, to simplify debugging of the tests)
For the first three of the targets listed, the API is declared in t2c_api.h, which is automatically #include'd in the resulting source file (the appropriate #include directive comes from the main template of the test_group group). For the c_minimal target, the API is declared and defined in the source file of the tests itself.
Other T2C targets are not required to provide the same API or even any API at all.
It is not recommended to use functions, constants and macros declared in t2c_api.h in .t2c files, except for the public API described below. Some functions from t2c_api.h are used in the templates and hence appear in the generated source files. Still, if such a function, macro or constant is not part of the public API listed here, do not use it directly in .t2c files. Doing so can lead to trouble when you try to generate another test suite from the same .t2c files using a different T2C target. And it will almost certainly create problems if you then try to use the c_minimal target to create the “minimal” sources of the tests for debugging: they may fail to compile.
Unless specifically stated otherwise, the API macros and functions described here are intended to be used in the parameterized test scenarios themselves, that is, in the <CODE> and <FINALLY> subsections of <BLOCK> sections in a .t2c file.
If the TRUE and FALSE constants are not provided by the chosen testing framework or the system under test, they are provided by the T2C API. FALSE is defined as 0, TRUE as !FALSE.
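A minimal sketch of what these definitions amount to (the actual definitions in t2c_api.h may be written or guarded differently):
#ifndef FALSE
#define FALSE 0
#endif

#ifndef TRUE
#define TRUE (!FALSE)
#endif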
Synopsis:
INIT_FAILED(<message>);
INIT_FAILED() should be used in the <STARTUP> section to indicate that the initialization has failed for some reason. <message> should describe what happened (it can be a constant string); it will be output to the logging facility used by the tests (stderr, a log file, etc.).
After INIT_FAILED() is called, the instructions following it in the <STARTUP> section will not be executed. Execution of the tests from this .t2c file will be cancelled. The contents of the <CLEANUP> section (if it is not empty) will be executed anyway.
Example:
mod = (FT_Module)FT_Get_Module(library, "otvalid");
if (mod == NULL) {
    INIT_FAILED("OpenType validation module is not available.");
}
You should use ABORT_TEST() rather than INIT_FAILED() in <CODE> sections to indicate that the initialization instructions performed there have failed. INIT_FAILED() can be used in the <STARTUP> section only.
Synopsis:
TRACE(<format>, ...);
TRACE0(<message>);
TRACE() outputs the formatted message to the logging facility used by the tests. This is similar to what printf() does for stdout, except that TRACE() should be passed at least one argument besides the format string <format>.
<format> is a format string of the same kind that printf() uses.
Unlike printf(), TRACE() may end the message with a newline character (if applicable for the logging facility used).
TRACE0() is like TRACE() except that it takes only one argument, the message to be output.
Examples:
TRACE("The element with index %d has value %d.", i, val[i]); TRACE0("The operation completed successfully.");
Synopsis:
REQ("<ID>
[; ...]",<message>
,<expression>
); REQva("<ID>
[; ...]",<expression>
,<format>
, ...);
REQ() and REQva() are used in the test scenarios (the <CODE> section) to check requirements (assertions) and report failures. These macros differ only in the way they prepare the message to be output in case of a failure.
Both macros receive the list of identifiers of requirements checked there and an expression to check.
If <expression> evaluates to nonzero, the check passes, that is, the requirements listed in the first argument of the macro seem to hold. In this case, REQ() and REQva() may report that the requirements were checked, and then the execution of the test goes on.
If a REQ() or REQva() passes, it does not mean, of course, that the requirements listed there are now completely tested and need not be checked further. It just means that no violation of the requirements was detected in this particular situation.
If <expression> evaluates to zero, it indicates a failure, namely a violation of at least one requirement listed in the first argument of the macro. In this case, REQ() and REQva() do the following.
REQ() outputs the message specified in its second argument verbatim to the logging facility used. The message should describe what happened rather than just repeat the text of the corresponding requirements. The message can be left blank if necessary.
REQva() creates and outputs a formatted message using the specified format string and the arguments that follow it. The effect is as if <format> and all the arguments following it were passed to TRACE().
Then both macros report that the specified requirement is violated (or at least one of the requirements, if several of them are listed). In addition, the text of the requirement(s) is loaded from the catalogue of requirements and also output to the logging facility (except in the c_minimal target). After that, the rest of the code from the <CODE> section is not executed, but the code from the <FINALLY> section is executed. This is similar to the “fatal assertions” used in some testing frameworks. The test verdict is set to “FAIL” in this case.
Examples:
REQ("g_array_append_vals.02", "g_array_append_vals returned NULL", new_ga != NULL); REQva("g_array_append_vals.08", ga->len == len, "The length of the array is %d while it should be %d.", (int)ga->len, len);
It is possible to provide more than one REQ()/REQva() for the same requirement (possibly with different expressions to check) in a test scenario.
Ideally, a single requirement should be checked in each REQ()/REQva() call. Sometimes, however, some of the requirements are tightly coupled and it can be difficult to tell exactly which one of them is violated if the checked expression evaluates to 0. This may happen, for example, in the tests for getter/setter functions and in more complex cases.
To handle this, the identifiers of the requirements involved should be listed in REQ()/REQva() separated by semicolons (spaces are allowed before and after a semicolon), for example:
## {ext.g_array_new.06}: "The size of the array is 0 upon creation."
## {g_array_new.08}: "'len' is the length of the array."
REQ("ext.g_array_new.06; g_array_new.08",
    "'len' field is not 0 upon creation of the array", ga->len == 0);
The requirements for the applications rather than for the system under test (i.e. those with app-prefixed IDs, “app-requirements”) can also be checked using REQ()/REQva(). Such REQ()/REQva() calls are usually placed somewhere at the beginning of the test scenario. Unlike the requirements for the system under test, a violation of an app-requirement indicates an error in the test itself.
Using REQ() for app-requirements should not be confused with calling ABORT_TEST(). Both are often used before the requirements for the system under test are checked, but they serve different purposes. A failure in REQ()/REQva() for an app-requirement indicates an error in the test itself, while ABORT_TEST() means that something went wrong in the system under test that did not allow the test situation to be prepared properly.
REQ()/REQva() usually handle app-requirements differently from ordinary requirements. They often set different test verdicts in these cases, for example, UNRESOLVED if an app-requirement is violated and FAIL if a requirement for the system under test is violated. The messages output by these macros can also be different.
App-requirements and requirements for the system under test should not be combined in the same list of IDs in a REQ()/REQva() call. It usually makes no sense anyway.
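As an illustration (a sketch only; the requirement ID and the variable name are hypothetical), a check of an app-requirement at the beginning of a test scenario could look like this:
## {app.glib_arrays.01}: "The test is able to allocate the buffer for the test data."
REQ("app.glib_arrays.01", "Failed to allocate the buffer for the test data", buf != NULL);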
Requirements for the system under test with ext-prefixed IDs (“ext-requirements”) are not checked by default. To put it more exactly, REQ()/REQva() calls for such requirements are considered passed by default. If several requirement IDs are listed in a REQ()/REQva() and at least one of them is an ext-requirement, this REQ()/REQva() is considered passed.
To turn on checking of ext-requirements, CHECK_EXT_REQS must be defined when compiling the tests. To do this, before generating the test sources from .t2c files, add the -DCHECK_EXT_REQS compiler option via a parameter in the configuration file for the suite or by adjusting the template of the per-test makefile.
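For the c_tet and c_standalone targets, for instance, this could be done via the COMPILER_FLAGS parameter (for the c_gtest_nested target, COMMON_COMPILER_FLAGS serves the same purpose). A sketch:
# Enable checking of ext-requirements
COMPILER_FLAGS = -DCHECK_EXT_REQS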
A special expression, TODO_REQ(), can be used in REQ() (and REQva()) in place of the <expression> argument to mark this REQ() as not yet implemented. Such REQ()/REQva() calls are ignored (no-ops) when the tests are executed.
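For instance (the requirement ID here is hypothetical):
// The check for this requirement has not been implemented yet.
REQ("g_array_new.05", "Not implemented yet", TODO_REQ());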
Synopsis:
RETURN;
RETURN ends the test without changing its verdict (result code). The verdict is PASS by default and can be changed, for example, by REQ(), ABORT_TEST() and other macros.
If RETURN, TEST_FAILED() or one of the ABORT_*() macros is used in the <CODE> section, the contents of the corresponding <FINALLY> section will be executed anyway. These macros can be used in the <FINALLY> section too; in this case, the code in this section is not executed again, and the execution of the test stops as it should.
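For example (a sketch; the condition is hypothetical):
if (nothing_more_to_check) {
    TRACE0("Nothing more to check in this situation.");
    RETURN;
}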
Synopsis:
ABORT_TEST(<message>);
This macro should be used in a test to abort its execution if something goes wrong: a failure to prepare the test situation or the means to analyse the resulting state of the system under test, etc. <message> should describe the situation (it can be a constant string).
Example:
ga = g_array_new(FALSE, TRUE, sizeof(TYPE));
if (ga == NULL) {
    ABORT_TEST("g_array_new() returned NULL.");
}
This macro may set the verdict of the test to UNRESOLVED (at least for some T2C targets) unless it has already been set to FAIL.
For compatibility with earlier versions of the API, the ABORT_TEST_PURPOSE() macro is available, with the same meaning and syntax as ABORT_TEST().
Synopsis:
ABORT_UNSUPPORTED(<message>);
This macro should be used in a test to abort its execution if an optional feature to be checked is not supported by the system under test. <message> should describe the situation (it can be a constant string).
Example:
if (!is_mode_supported(2048, 1536, 64)) {
    ABORT_UNSUPPORTED("Optional video mode 2048x1536 (64bpp) is not supported");
}
This macro may set the verdict of the test to UNSUPPORTED (at least for some T2C targets) unless it has already been set to FAIL, UNRESOLVED, UNTESTED or FIP.
In practice, a test that sets UNSUPPORTED verdict is considered passed in some test suites.
Synopsis:
ABORT_UNTESTED(<message>);
This macro should be used in a test to abort its execution if a mandatory feature to be checked is not supported by the system under test. <message> should describe the situation (it can be a constant string).
Example:
if (check_dll_support() == DLL_NOT_SUPPORTED) {
    ABORT_UNTESTED("Dynamically loaded libraries are not supported. "
        "Aborting the test.");
}
This macro may set the verdict of the test to UNTESTED (at least for some T2C targets) unless it has already been set to FAIL, UNRESOLVED, UNSUPPORTED or FIP.
Synopsis:
ABORT_FIP(<message>);
This macro should be used in a test to abort its execution and indicate that the test has FIP (“Further Information Provided”) status. It means that more information about the system under test is necessary to decide whether the test passed or not. <message> should describe the situation and, preferably, specify which additional data is needed (<message> can be a constant string).
Example:
if (!have_a_clue) {
    ABORT_FIP("To determine if the test passed, the following "
        "additional information is needed: ...");
}
This macro may set the verdict of the test to FIP (at least for some T2C targets) unless it has already been set to FAIL or UNRESOLVED.
Synopsis:
TEST_FAILED(<message>);
This macro aborts test execution and sets the result of the test to FAIL. Most of the time, you do not need this macro because test failures are detected and handled using the REQ and REQva macros. TEST_FAILED() is provided for those rare cases in which using REQ/REQva is inconvenient or even inappropriate (a failure that is not tied to a particular requirement or group of requirements).
Example:
if (system_behaves_wrong) {
    TEST_FAILED("The following parameters of the system "
        "have wrong values: ...");
}
Synopsis:
T2C_GET_DATA_PATH(<relative_path>);
This macro constructs the path to the specified test data (file or directory) and returns the resulting string (see also Section 6.5, “Access to test data directories from the code”). NULL is returned if there is not enough memory to perform the operation.
The test data directory depends on the T2C file generation target used. It is <$SUITE_ROOT_DIR$>/testdata/<$T2C_FILE_NAME$> (see Section 5.7.2, “Special parameters defined by the file generator”) for the c_tet, c_standalone and c_gtest_nested targets. For the c_minimal target, it is the current working directory.
<relative_path> is the path to the data relative to the test data directory.
The macro does not check if the resulting path or any of its parts exist.
The resulting path is absolute for c_tet, c_standalone and c_gtest_nested targets and relative for c_minimal target.
The returned pointer should be freed when no longer needed.
Example:
char* path = T2C_GET_DATA_PATH("gttmodule.so");
if (path == NULL) {
    ABORT_TEST("Out of memory.");
}

// Now the path can be used in the test
...

// Free memory occupied by the path
// when it is no longer needed.
free(path);
T2C_GET_DATA_PATH() can be used in any section of a .t2c file that contains source code in the target programming language (C or C++ in the case of c_* targets), not only in test scenarios.
Synopsis:
T2C_GET_TEST_PATH();
T2C_GET_TEST_PATH() returns the path to the executable file containing the current test.
The resulting path is absolute for c_tet, c_standalone and c_gtest_nested targets and relative for c_minimal target.
The returned pointer should be freed when no longer needed.
Example:
char* path = T2C_GET_TEST_PATH();
if (path == NULL) {
    ABORT_TEST("Out of memory.");
}

// Now the path can be used in the test
TRACE("Path to this test: %s", path);
...

// Free memory occupied by the path
// when it is no longer needed.
free(path);
T2C_GET_TEST_PATH() can be used in any section of a .t2c file that contains source code in the target programming language (C or C++ in the case of c_* targets), not only in test scenarios.
This section contains various notes on building and installing the tests generated using particular T2C targets.
The process of building and installing the test suites created using the c_tet and c_standalone targets is quite similar. They use different “backends” to execute the tests and collect their results (TETWare Lite and the simple testing framework provided by T2C), and the generated sources of the tests reflect that. From the user's point of view, however, the process of building and installing the test suite is nearly the same. Unless specifically stated otherwise, everything in this section refers to both targets.
The goal of these targets is to provide a package that may contain several test suites at once and can be built and installed using the usual configure - make - make install process. The makefiles in the test package are intended to be used with GNU make.
The typical structure of a single test suite is shown here.
The following sample top-level files of the test package are distributed with T2C (see the other/ subdirectory of the T2C source package); these files are not installed along with T2C, just distributed. The files should be placed in the root directory of the test package.
common/gen_files.sh
When called with the target name as an argument (for example, gen_files.sh c_tet), the script traverses all subdirectories with names ending in “-t2c” in the directory where it is located. Each such directory is considered a test suite. The script invokes the T2C file generator to create the appropriate files for the suite (both common and per-test ones) for the specified target. It also creates the directory t2c_api/<target>/ and copies there the sources of the appropriate API library from the T2C data directory.
gen_files.sh clean removes the files generated or copied to the test suite by gen_files.sh <target>.
<target>/configure.ac
A sample configure.ac file needed to use Autotools for the test package (see the tutorial for details). The name, version and bug report address of your test package should be specified there instead of the sample values. Additional checks for prerequisites and system parameters could also be added to this file for a real test suite.
<target>/Makefile.am
A sample top-level Makefile.am file, also needed to use Autotools for the test package (see the tutorial for details). Autotools will create a Makefile.in file from it, and configure will use the latter to produce the top-level Makefile for the suite. Specific instructions needed to build, install and uninstall the tests should be provided in Makefile.am.
<target>/LICENSE
Just a stub of a file that should contain the license of this test package.
<target>/post_install.sh
A stub of a script to be executed automatically after installation of the tests (the Makefile generated from Makefile.am will be responsible for this). The script can be used to perform various post-installation tasks without listing them directly in Makefile.am.
<target>/pre_uninstall.sh
A stub of a script to be executed automatically before uninstallation of the tests (the Makefile generated from Makefile.am will be responsible for this). The script can be used to perform various pre-uninstallation tasks without listing them directly in Makefile.am.
<target>/run_tests.sh
A sample script to execute the tests using the corresponding testing framework.
c_tet/tet_code
(c_tet target only) This file defines additional test verdicts (result codes) that TETWare tools should recognize. Normally, you will not need to change this file. Otherwise, please consult TETWare documentation on how to do this properly.
c_tet/tetexec.cfg
(c_tet target only) Additional parameters for the TETWare test execution tools are specified in this file. Normally, you will not need to change this file. Otherwise, please consult the TETWare documentation on how to do this properly.
After the source files, Makefiles and other files have been generated (using the gen_files.sh script or the file generator directly), the test package should be processed by autoconf and automake to produce the remaining files necessary to build and install the tests. One way to do this is to execute the following command from the root directory of the test package (the directory where configure.ac is located):
autoreconf --install
Among other things, the configure script and the top-level Makefile will be created. Now the source package with the tests can be prepared:
./configure && make distcheck
If these commands succeed, a .tar.bz2 archive (the test source package) will be created, ready for distribution.
To build and install the tests from such a package, say, sample_tests-4.1.0.tar.bz2, you can do the following:
Unpack the archive:
tar xfj sample_tests-4.1.0.tar.bz2
Change the current directory to sample_tests-4.1.0 and execute
./configure --prefix <path_to_install>
to install the tests to the specified location (<path_to_install>).
configure can also be called when some other directory is current; VPATH builds are supported by default.
You can specify which C or C++ compiler to use for the tests and the T2C API by setting the CC and CXX variables, for example:
./configure --prefix /opt/tests CC=/opt/lsb/bin/lsbcc CXX=/opt/lsb/bin/lsbc++
The tests are intended to be built with GCC 3.4.2 or newer or with a compiler compatible with GCC.
Compiler and linker options used for building the tests are specified in the configuration file of the test suite and in the template of the per-test makefile. That is, in the c_tet and c_standalone targets, they cannot be changed after the tests have been generated. Neither can they be changed by specifying the CFLAGS, CPPFLAGS, CXXFLAGS, etc. variables when calling configure.
Build the tests:
make
As usual, make clean can be used for cleanup.
If the package contains several test suites, for example, in subdirectories glib-t2c, gobject-t2c, gtk-t2c, and so on, it is also possible to build just a particular suite:
make build_suite_gobject-t2c
This will build only the “gobject-t2c” test suite. That is, simply prepend “build_suite_” to the name of the suite and use the result as an argument to make. To clean up a suite, the process is similar, but with “clean_suite_” prepended to the name of the suite:
make clean_suite_gobject-t2c
To build the tests prepared for the c_tet target, the header files and libraries provided by TETWare should be available in $TET_ROOT/inc/tet3/ and $TET_ROOT/lib/tet3/, respectively. The TET_ROOT environment variable should be defined accordingly.
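For example, if TETWare Lite is installed under /opt/tetware (an illustrative path):
export TET_ROOT=/opt/tetware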
To install the tests, execute
make install
The tests should now be installed and ready to be executed.
When building binary packages (rpm, deb, etc.), it can be helpful to install the tests to a special location instead of the one specified when configure was called. As usual with Autotools, the DESTDIR trick is supported for this purpose. If the DESTDIR variable is defined when calling make install, the tests will be installed to ${DESTDIR}${prefix} rather than to the ${prefix} directory. Example:
make DESTDIR=/tmp/pkgbuild install
To uninstall the tests, execute
make uninstall
DESTDIR can be used when uninstalling as well.
If DESTDIR is defined, the post-install and pre-uninstall scripts will not be executed automatically when installing or uninstalling the tests. The scripts can be executed manually in these cases if necessary. This is because these scripts are usually intended to run on the target system (where the tests should finally be installed) rather than on the intermediate system used to build the binary packages with the tests.
If DESTDIR is not used but it is desirable to suppress execution of the post-install and pre-uninstall scripts for some reason, the T2C_NO_SCRIPTS variable can be set to any non-empty value:
make T2C_NO_SCRIPTS="yes" install
The c_gtest_nested target allows you to create an Autotools-compatible test package that can be nested in the package under test or used separately.
The only top-level file of the test package distributed with T2C for this target is the gen_files.sh script (see the c_gtest_nested/other/ subdirectory of the T2C source package). It is not installed along with T2C, just distributed. The script should be placed in the root directory of the test package.
When called with the name of the target as an argument (gen_files.sh c_gtest_nested), this script invokes the T2C file generator to create the appropriate files for the suite (both common and per-test ones). It also creates the directory t2c_api/c_gtest_nested/ and copies the sources of the appropriate T2C API library there.
gen_files.sh clean deletes all files generated by t2c as well as the t2c_api directory with all its contents.
Note that for this target, configure.ac and the top-level Makefile.am are also generated by t2c using the appropriate templates. This allows the use of template-based file generation features such as parameter substitution, generation of list-like structures based on the list of test groups (each prepared from a .t2c file), etc.
This target also requires that the m4 and testdata subdirectories exist in the root directory of the test suite.
The m4 subdirectory can be left empty; Autotools will place the appropriate files there when autoreconf is called (see below).
The testdata subdirectory can be used to store the data needed by the tests (see Section 6.5, “Access to test data directories from the code” for details). The subdirectory should at least contain a Makefile.am file. This file may contain instructions to prepare the data before the tests are executed via make check or make test. If this is not necessary, the file can simply be left blank.
After the source files, Makefiles and other files have been generated (using the gen_files.sh script or the file generator directly), the test package should be processed by autoconf and automake to produce the remaining files necessary to build and install the tests. If the test package is nested in the package under test, this will be done by calling these tools for the package under test. This will happen if the nested package is properly specified in AC_CONFIG_SUBDIRS in configure.ac and in SUBDIRS in the Makefile.am of the package under test - see the tutorial for details and examples.
If you would like to use the test package separately, that is, without nesting it in the package under test, the following command, called from the root directory of the test package, will execute the appropriate Autotools:
autoreconf --install
Among other things, the configure script and the Makefiles for the suite and for each group of tests will be created.
If the test package is nested in the package under test, the former will be configured and built when these operations are performed for the latter. They will usually reside in a single archive.
If the test package is to be used separately, the following commands can be used to prepare the package:
./configure && make distcheck
If these commands succeed, a .tar.bz2 archive (the test source package) will be created, ready for distribution.
To build the tests from such package, the usual “configure - make” process can be used. VPATH builds are also supported.
As usual with Autotools-compatible packages, custom compiler and linker options can be specified in the CFLAGS, CXXFLAGS, LDFLAGS and other variables when configuring the package.
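For example (the option values here are purely illustrative):
./configure CFLAGS="-g -O0" LDFLAGS="-L/opt/mylib/lib"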
The tests prepared using c_gtest_nested target are not intended to be installed. When built, they are ready to be executed via make check.
There are no separate files with T2C API implementation for this target (everything is in the source file of the tests). Neither are there any auxiliary files.
The goal is to provide just a C/C++ source file plus a makefile for it. That is, there is no complex test harness, only a plain C or C++ program containing little besides the contents of the corresponding .t2c file. This can be used to debug the tests, to analyse the behaviour of the system under test thoroughly in particular cases, to provide example programs for bug reports, etc.
Therefore, many features are not supported in the c_minimal target, including requirement catalogues, execution in separate processes, time limits, etc.
To build the group of tests generated from a .t2c file using the c_minimal target, change the current directory to where the source file of this group of tests is located (usually the tests/<t2c_file_name>/ subdirectory of the test suite) and execute make.
make clean can be used for cleaning up, make debug for building the tests with compiler options better suited for debugging.
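For example, for a group of tests named glib_arrays:
cd tests/glib_arrays
make          # build the tests
make debug    # or: build them with debugging-friendly compiler options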
This section contains various notes on execution of the tests generated using particular T2C targets.
The tests are executed using the TETWare Lite tools in this case. These tools require the TET_ROOT environment variable to be defined. This variable should contain the path to the directory where TETWare Lite is installed. For example, the test case controller (tcc) should be located in the ${TET_ROOT}/bin/ directory.
To run the tests from all suites in the test package, execute the run_tests.sh script. This sample script is provided with T2C in the same package and should be copied to the test package before use (see the description of the auxiliary files in Section 4.2.1, “c_tet and c_standalone”). The script will call the TETWare tools in the appropriate environment and with the appropriate parameters.
To run the tests from a particular suite contained in the package, specify the name of the suite (the name of the corresponding subdirectory) as an argument to run_tests.sh. For example, assume the test package contains three test suites: glib-t2c, gobject-t2c and pango-t2c. The following command will execute only the tests from gobject-t2c:
./run_tests.sh gobject-t2c
Actually, a part of the name is also acceptable. For example, the following command will execute the tests from both gobject-t2c and pango-t2c:
./run_tests.sh go
The results of the tests are stored in a log file in a special format, a so-called TET journal. It is located in the results/0001e/ subdirectory of the main directory of the test package. Consult the TETWare manuals on how to read and analyse such journals.
Similar to the c_tet target, either all the tests or the tests from a particular suite can be executed using the run_tests.sh script.
The results of the tests (including all the messages they output via TRACE or the like) are collected in the results/test.journal file. A fragment of such a file is presented below.
Executing the tests from the following group:
/home/spectre/tests/temp/desktop-t2c/glib-t2c/tests/glib_arrays/glib_arrays
---------------------------------------------------------------------------
[Startup begins]
[Startup ends]
---------------------------------------------------------------------------
<...skipped...>
---------------------------------------------------------------------------
[Test #30 begins]
Test target(s): g_array_set_size g_array_new g_array_sized_new
--------
Checked requirement: {g_array_set_size.02}
Checked requirement: {g_array_set_size.02}
Checked requirement: {g_array_set_size.01}
Checked requirement: {g_array_set_size.08}
Checked requirement: {g_array_set_size.03}
Checked requirement: {g_array_new.03}
[Test #30 ends]
[Result] PASSED
---------------------------------------------------------------------------
<...skipped...>
---------------------------------------------------------------------------
[Cleanup begins]
[Cleanup ends]
---------------------------------------------------------------------------
Currently, no special means are provided to calculate how many tests passed or failed, etc. To find this out, the following commands can be used:
grep "^\[Result\] PASSED" test.journal | wc -l
The command above calculates the number of the individual tests that passed. Replace “PASSED” with “FAILED”, “UNRESOLVED” or any other supported test verdict to find out the number of tests that ended with this verdict.
The total number of tests executed can be calculated in a similar way:
grep "^\[Result\]" test.journal | wc -l
A group of tests generated from the same .t2c file can be executed separately if necessary. The application containing the group of tests is located in the tests/<group_name>/ subdirectory of the test suite and has the same name as the group. Example: <…>/glib-t2c/tests/glib_arrays/glib_arrays.
This application can be executed in an ordinary way from the directory where it resides:
./glib_arrays
This will run all the tests from “glib_arrays” group. The results (in the same format as in the example above) will be stored in a .log file in the same directory.
It is possible to run only selected tests from the group. To do this, specify the indexes of the tests to execute as the arguments. Example:
./glib_arrays 10 15 1 7
Here, tests #10, #15, #1 and #7 will be executed, in this order. The code from the <STARTUP> section of the corresponding .t2c file will be executed before the first of these tests (#10), and the code from the <CLEANUP> section after the last one (#7).
The index corresponds to the abs_test_index parameter used by the test_case and test_group template groups (see Section 2.7.1.2, “Test numbering”).
The first test in the group has index 1. The index of each test can be found in the source file of the group of tests, in the comments before the test. Look for the “This is the test #…” sentence there. Example:
///////////////////////////////////////////////////////////////////
// This is the test #30
//
// Target(s):
//      g_module_supported
//
// Parameters:
//      none
///////////////////////////////////////////////////////////////////
A group of tests prepared using the c_minimal target can be executed in a similar way to one prepared using c_standalone (see above). That is, if you run the executable built for a group of tests, all tests from this group will be executed. If the indexes of tests are listed as arguments to the test executable, only these tests will run, in the order the indexes are specified.
No log file is used in the c_minimal target. The messages from the tests are simply output to stdout and stderr.
If the test package is nested in the package under test, the tests will be run when make check is executed for the latter, which is often enough.
In addition, make check or make test (these commands are equivalent) can be executed from the root directory of the test package to run all the tests. make will report failure if at least one individual test fails.
A log file for each group of tests will be stored in an XML-based format in the same directory as the test executable. The log file has the “xml” extension and the same name as that directory (in fact, it is the same name as that of the .t2c file the tests were generated from).
To generate a complete report on the test run, execute make test-report from the root directory of the test package. This way, test-report.xml will be assembled from the logs for the groups of tests, and test-report.html, a report in HTML format that can be viewed in a web browser, will be created.
make test and make test-report can be called for a single group of tests as well, to run the tests and prepare the reports in XML and HTML formats for this group only. To do this, execute these commands from the directory where the executable for this group resides (the tests/<group_name>/ subdirectory of the test package).
make test will run the tests from this test group and create the log file <group_name>.xml in the same directory.
make test-report will run the same instructions as make test first (“test” is a prerequisite for “test-report” - see the Makefile.am) and then prepare the report in HTML format (<group_name>.html) from the log file.
The reports in HTML format are prepared from the XML files by the gtester-report tool. As of the end of 2009, the tool has a problem: it ignores some of the messages output by the tests. A detailed description of the problem and the patches that fix it are available at Gnome Bugzilla, ticket #592419.
Presented in this section are various notes concerning the usage of particular file generation targets.
If a group of tests is mentioned below, it refers to the tests whose source is generated from the same .t2c file. The name of the group is the name of that file.
Copyright information and license notes can be specified using two parameters of the test suite, COPYRIGHT_HOLDER and LICENSE. The templates for the c_tet, c_standalone and c_gtest_nested targets refer to these parameters, so the corresponding information will appear in the generated source files.
These two parameters are not handled in any special way. That is, the file generator treats them in the same way as, for example, the parameters described in Section 4.4.2, “Configuration parameters of the tests”, or any user-defined parameters.
Like most parameters, COPYRIGHT_HOLDER and LICENSE are usually specified for the whole test suite in the .cfg file of the suite. They can also be overridden, if necessary, for a particular group of tests (in the .cfg file for that group of tests).
Suppose you set the COPYRIGHT_HOLDER and LICENSE parameters the following way in the configuration file of the suite:
COPYRIGHT_HOLDER = MyGreatCompany, Inc

LICENSE =>>
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
<<
Note the use of “=>>” and “<<” in the definition of the LICENSE parameter. The lines between these markers form the value of the parameter.
As a result, each test source file created from a .t2c file for this test suite will have a notice in the comments at its beginning similar to the following one:
/**************************************************************************
    Copyright (C) 2010 MyGreatCompany, Inc.
    All rights reserved.

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
**************************************************************************/
Target-specific parameters of the tests are described here. These parameters can be specified in the configuration file of a test suite and, if applicable, in the configuration files of the groups of tests. If a parameter is not specified, it is considered to have an empty value.
The users can also specify other parameters (including multiline ones) in the configuration files. These parameters will be handled in the same way as the ones described here: substituted for the appropriate placeholders in the templates when the files are being generated. This can be especially helpful when some templates have been overridden by user-defined ones (see Section 5, “Templates, Template Groups and File Generation with T2C” for more on template overriding and inheritance). The user-defined templates may refer to these newly added parameters, while the default ones will simply ignore them.
COMPILER_FLAGS
Additional options to pass to the compiler when building the tests. This parameter is used in per-test Makefiles.
LINKER_FLAGS
Additional options to pass to the linker when building the tests. This parameter is used in per-test Makefiles.
TEST_SOURCE_EXT
The file name extension that should be used for the generated source files. These files usually have some predefined name or the same name as the corresponding .t2c file. If TEST_SOURCE_EXT is not defined or is empty, the “c” extension will be used.
# The generated source files will have "cxx" extension.
TEST_SOURCE_EXT = cxx
WAIT_TIME
If each test from a group runs in a separate process, this parameter specifies the maximum amount of time (in seconds) an individual test is allowed to execute. If the test has not finished when this time expires, it is aborted automatically. The test result (verdict) will indicate a failure in this case; the exact verdict depends on the file generation target used.
If WAIT_TIME is 0 or negative, there is no time limit on test execution.
Default value: 30.
This parameter is ignored in the c_minimal target; there is no time limit on test execution there. WAIT_TIME is also ignored when SINGLE_PROCESS is set to a non-zero value (see below).
SINGLE_PROCESS
If 0, each test from a group of tests is executed in a separate process (default for c_tet and c_standalone targets). Otherwise all the tests are executed in the same process.
Default value: 0.
This parameter is ignored in c_minimal target where all tests from a group are executed in the same process.
An example of a configuration file for the test suite is shown below:
# Custom compiler options
COMPILER_FLAGS=`pkg-config --cflags glib-2.0`

# Custom linker options
LINKER_FLAGS=`pkg-config --libs glib-2.0`

# Maximum time in seconds each test is allowed to run
WAIT_TIME=60

# Copyright holder
COPYRIGHT_HOLDER = Some Company

# A boilerplate notice to appear at the top of each test source file
# generated by T2C for this test suite.
LICENSE =>>
Licensed under some license.
<<
TEST_SUITE_NAME
Name of the test suite to be used in configure.ac.
TEST_SUITE_VERSION
Version string of the test suite to be used in configure.ac.
TEST_SUITE_BUGREPORT
Bug report address of the test suite to be used in configure.ac (typically, an email or web address).
PACKAGE_UNDER_TEST
Name of the package under test. It will appear in the comments in the generated source files, in the description of their contents.
COMMON_COMPILER_FLAGS, COMMON_LINKER_FLAGS, COMMON_LIBS
Extra options for the compiler and linker common to all the tests in the suite. They will be evaluated by configure, and the results will end up in the appropriate *_CFLAGS, *_LDFLAGS and *_LDADD variables, respectively, in the Makefiles for the tests.
Note that overriding these parameters in a .cfg file for an individual group of tests will probably have no effect: normally they are only used in configure in conjunction with AC_SUBST.
To provide additional test-specific options, specify them in the following parameters in the .cfg file for that group of tests: ADD_COMPILER_FLAGS, ADD_LINKER_FLAGS, ADD_LIBS (see below).
ADD_COMPILER_FLAGS, ADD_LINKER_FLAGS, ADD_LIBS
Additional test-specific options for the compiler and linker, and additional libraries and object files to link with. These options will be included in the values of the appropriate *_CFLAGS, *_LDFLAGS and *_LDADD variables, respectively, in the Makefiles for the tests.
These parameters are usually set in the configuration files for the groups of tests rather than for the test suite.
BASE_PACKAGE_SRC_DIR and BASE_PACKAGE_INC_DIR
Source and include directories of the package under test (they can be the same). BASE_PACKAGE_INC_DIR will be automatically added to the list of include directories supplied to the compiler. These parameters can be helpful if the test package is to be used nested in the package under test.
$(top_srcdir) can be used in these parameters to refer to the top source directory of the test package rather than that of the package under test.
ADD_SOURCES and ADD_HEADERS
Space-separated lists of additional source and header files for a particular group of tests. If specified, these will be added to the *_SOURCES variable in the Makefile for this group of tests. This can be helpful if some functionality needed by the tests is implemented not in the corresponding .t2c file but in separate source and header files.
These parameters are usually set in the configuration files for the groups of tests rather than for the test suite, as in the sketch below.
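A sketch of such a .cfg file for a group of tests (all names and option values here are illustrative only):
# mygroup.cfg - parameters specific to this group of tests (a sketch)

# Extra compiler options and libraries for this group only
ADD_COMPILER_FLAGS = -DUSE_EXTENDED_CHECKS
ADD_LIBS = -lm

# Helper code shared by the tests of this group
ADD_SOURCES = helpers.c
ADD_HEADERS = helpers.h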
TEST_SOURCE_EXT
The file name extension that should be used for the generated source files. These files usually have some predefined name or the same name as the corresponding .t2c file. If TEST_SOURCE_EXT is not defined or is empty, the “c” extension will be used.
# The generated source files will have "cxx" extension.
TEST_SOURCE_EXT = cxx
WAIT_TIME
If each test from a group runs in a separate process, this parameter specifies the maximum amount of time (in seconds) an individual test is allowed to execute. If the test has not finished when this time expires, it is aborted automatically. The test result (verdict) will indicate a failure in this case; the exact verdict depends on the file generation target used.
If WAIT_TIME is 0 or negative, there is no time limit on test execution.
Default value: 30.
WAIT_TIME is ignored when SINGLE_PROCESS is set to a non-zero value (see below).
SINGLE_PROCESS
If 0, each test from a group of tests is executed in a separate process (default for c_tet and c_standalone targets). Otherwise all the tests are executed in the same process.
Default value: 0.
An example of a configuration file for the test suite is shown below:
############################################################################
# mysuite.cfg - common configuration parameters for the test suite
# for "MySuperLibrary" package
############################################################################

# Name of the test suite
TEST_SUITE_NAME = mysuite-t2c

# Version of the test suite
TEST_SUITE_VERSION = 0.1.0_alpha2

# Bug report address of the test suite (typically, email or web address)
TEST_SUITE_BUGREPORT = http://bugtracker.somesite.org/sample-tests/

# Copyright holder (will appear in the generated sources, etc.)
COPYRIGHT_HOLDER = Some Company Ltd

# Name of the package under test (it will appear in the comments only)
PACKAGE_UNDER_TEST = My Super Library 3.1.x

# Extra options for the compiler and linker common for all the tests in the
# suite.
COMMON_COMPILER_FLAGS = -DCHECK_EXT_REQS \
    `pkg-config --cflags MySuperLibrary-3.1 gtk+-2.0`
COMMON_LINKER_FLAGS =
COMMON_LIBS = `pkg-config --libs MySuperLibrary-3.1 gtk+-2.0`

# To provide additional test-specific options, specify them in the following
# parameters in the .cfg file for the respective group of tests:
# ADD_COMPILER_FLAGS, ADD_LINKER_FLAGS, ADD_LIBS

# Source and include directories of the package under test (can be the same)
# [NB] $(top_srcdir) is used here to refer to the top source directory of
# the test package rather than the package under test.
BASE_PACKAGE_SRC_DIR = $(top_srcdir)/../../msl/src
BASE_PACKAGE_INC_DIR = $(top_srcdir)/../../msl/include

# These parameters are usually provided in the .cfg file of the group of tests.
#ADD_SOURCES =
#ADD_HEADERS =

# A boilerplate notice to appear at the top of each test source file
# generated by T2C for this test suite.
LICENSE =>>
Licensed under some license.
<<
If you would like to develop a new T2C target (“profile”), following the procedure described below can be helpful.
First of all, you should decide which conditions the target and the test suites created using it are intended for:
Determine the platform(s) (hardware architecture, operating system, etc.) the test suites created using this T2C target should build and run on.
Choose the programming language(s) the source code of the tests will be written in.
Decide whether the tests should support a particular testing framework (like xUnit systems, TETWare, etc.).
At this point, it is recommended to choose the name for the T2C target. It is desirable (but not mandatory) that the name somehow reflects the conditions for which the target is being developed. For example, a name like “cpp_cppunit_win” could indicate that the tests prepared using this target are written in C++ language, use CppUnit testing framework and build and run on Microsoft Windows systems. “sh_standalone” could imply that the tests are actually shell scripts and require no external testing framework and so on.
Design the structure of the test suites that will be created using this T2C target. The decisions made at the previous step should be taken into account here. At this step, you should determine which files (if any) should be generated by t2c in common mode (the files specific to the whole test suite, like the top-level Makefile or configure.ac) and in per-test mode (the files specific to the group of tests defined in a .t2c file: the sources of the tests, the Makefiles for the groups of tests, etc.).
See Section 3, “T2C Command Line Tool” for more information on how t2c operates in common and per-test modes.
Decide whether to provide a target-specific API to use in the tests or whether the tests should rely only on the API provided by the testing framework in use (if there is such an API). Usually it is necessary to have an API for at least the following tasks: checking what needs to be checked (functional requirements, performance, etc.), reporting success or failure, and outputting diagnostic messages. If the API provided by the chosen testing framework suits your needs, you may develop the T2C target without providing any additional API.
Sometimes it can be beneficial to implement target-specific API to further automate common operations in the tests. If the tests are written in C or C++ language, it can be a complete or partial implementation of T2C API described in Section 4.1, “C API provided by T2C targets”, or it can be something different.
Prepare the implementation of the target-specific API (if you have chosen to provide it) and the templates for the files to be generated. These tasks are usually closely connected: the API can be partially or completely implemented in the templates (it can be implemented in separate files as well). In addition, the code you write in the templates may require some convenience API that should be provided by the T2C target.
The templates should be organized using the directory structure described in Section 5.6, “Directory structure of the templates for a T2C target”.
Decide whether the test suites need auxiliary files that are not to be generated by t2c, for example, convenience scripts to build and execute the tests, to analyze the results, etc. Such files should also be distributed with the target.
After the above steps are completed, the target is ready to be used with t2c (see the description of relevant options in Section 3, “T2C Command Line Tool”).