FTorch test suite

  • Joe Wallwork
  • Last Updated: January 2026

Testing

FTorch's test suite includes unit tests of its components and integration tests based on a subset of the worked examples.

Building

To enable FTorch's test suite, set CMAKE_BUILD_TESTS=TRUE during the build. To run the unit tests, you will need to install pFUnit and provide its install location either via CMAKE_PREFIX_PATH or as the environment variable PFUNIT_DIR:

# Update `CMAKE_PREFIX_PATH` explicitly with pFUnit install directory
cmake -DCMAKE_PREFIX_PATH="</path/to/pFUnit/build/installed/PFUNIT-VERSION>" \
    -DCMAKE_BUILD_TESTS=TRUE -S </path/to/FTorch> -B </path/to/FTorch/build>

or

# Using an environment variable with pFUnit install directory
export PFUNIT_DIR=</path/to/pFUnit/build/installed/PFUNIT-VERSION>
cmake -DCMAKE_BUILD_TESTS=TRUE -S </path/to/FTorch> -B </path/to/FTorch/build>

Note that pFUnit includes the version number in the install directory name, so for version 4.12 that path will need to be specified as /path/to/pFUnit/build/installed/PFUNIT-4.12, for example.

Note

If a GPU_DEVICE is specified but only one GPU is available, set MULTI_GPU=OFF to skip the 'multiple GPU devices' integration test.
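
For example, a configure line for a single-GPU machine might look like the following (the GPU_DEVICE value shown is illustrative; use whichever device type applies to your system):

# Hypothetical configure line for a machine with a single GPU
cmake -DCMAKE_BUILD_TESTS=TRUE -DGPU_DEVICE=CUDA -DMULTI_GPU=OFF \
    -S </path/to/FTorch> -B </path/to/FTorch/build>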

Building with tests enabled will automatically install any Python dependencies for the examples, so the build should be executed from within a virtual environment [1]. If a virtual environment is not active, the build will fail with appropriate warnings.

Note that, whilst example 5_Looping is built when CMAKE_BUILD_TESTS=TRUE is specified, it is not run as part of the integration test suite because it demonstrates 'good' versus 'bad' practice rather than functionality.

Running

Ensure that the Python virtual environment used when building is active, then run ctest from the build directory to execute all tests. Use CTest arguments for greater control over the testing configuration.
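
For example (the --output-on-failure flag is optional but prints the output of any failing test):

# Run the full test suite from the build directory
cd </path/to/FTorch/build>
ctest --output-on-failure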

This will produce a report on which of the requested tests passed, and which, if any, failed for your build.

Unit tests

Unit tests may be executed in the following ways:

  1. To run all unit tests, use ctest -R unittest.
(ctest -R accepts a regular expression to match against test names; this works because all unit test names start with unittest.)
  2. Use a different regular expression to select a subset of unit tests, e.g., ctest -R unittest_tensor to run all unit tests related to torch_tensors.
  3. To run a specific unit test just use its full name, e.g. ctest -R unittest_tensor_constructors_destructors.
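
To list the available unit tests without running them, CTest's dry-run flag can be combined with the same filter:

# List matching unit tests without executing them
ctest -N -R unittest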

For a summary of the current unit tests, see the README.

Integration tests

Integration tests may be executed in the following ways:

  1. To run just the integration tests, use ctest -R example.
(ctest -R accepts a regular expression to match against test names; this works because all integration test names start with example.)
  2. To run a specific integration test use ctest -R example2, for example.
    Alternatively navigate to the corresponding example in ${BUILD_DIR}/examples and call ctest.
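
For instance (the example directory name below is a placeholder for the corresponding example):

# Run the tests for a single example from its build subdirectory
cd ${BUILD_DIR}/examples/<corresponding_example>
ctest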

Contributing tests

Contributing unit tests

New components should come with unit tests written using the pFUnit framework.

  • New unit tests should be added to the test/unit/ directory, have names that start with unittest_ and use the .pf extension.
  • Unit test files should include a module with the same name as the file (minus the extension).
  • If MPI parallelism is required for the unit test, use the pfunit module that comes with pFUnit; otherwise its funit module should be sufficient.
  • Tests contained within the module should be written as subroutines with the @test decorator and names starting with test_.
  • pFUnit's @assertTrue, @assertFalse, and @assertEqual assertions should be used to check expected outcomes (a minimal sketch follows this list).
  • In some cases, it can be useful to parameterise tests. This can be achieved by defining a derived type extending AbstractTestParameter and giving it the @testParameter decorator, together with a derived type extending ParameterizedTestCase given the @testCase decorator. The test case acts as a constructor that determines how the parameters are passed to the tests. Where parameterisation is used, parameter sets should be defined as functions that create instances of the derived test parameter type, and the parameters should be passed through the @test decorator, e.g. @test(testparameters={get_parameters()}), where get_parameters is such a function. See the existing unit tests for examples of this.
  • New unit tests should be included in test/unit/CMakeLists.txt in order to be built as part of the test suite.
  • Don't forget to add a brief summary of the new unit test in the unit test README.
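
For illustration, a minimal sketch of a hypothetical test file test/unit/unittest_demo.pf following the points above might look like this (the module, subroutine, and variable names are placeholders, not existing tests):

! Hypothetical unit test file: test/unit/unittest_demo.pf
! The module name matches the file name (minus the extension).
module unittest_demo
   use funit   ! use the pfunit module instead if MPI parallelism is required
   implicit none

contains

   ! Each test is a subroutine tagged with the @test decorator
   @test
   subroutine test_addition()
      integer :: total

      total = 1 + 1

      ! Assertions check the expected outcomes
      @assertEqual(2, total)
      @assertTrue(total > 0)
   end subroutine test_addition

end module unittest_demo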

Contributing integration tests

New functionality should come with integration tests in the form of worked examples.

  • These should take the form of a new example in the examples/ directory.
  • Create a subdirectory named with the next sequential number and a descriptive name, e.g. 9_NewFeature.
  • In addition to a CMakeLists.txt that builds the example code, there should be a section at the end that registers the example to be run with CTest (see the sketch after this list).
    • Integration test names should start with example followed by the example number, e.g. example9.
  • New examples will also need to be added to examples/CMakeLists.txt.
  • Ensure the documentation on worked examples is updated accordingly.
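
As a sketch, the CTest section at the end of the example's CMakeLists.txt might look like the following (the example number, test name, and executable target are illustrative):

# Hypothetical CTest registration for a new example
add_test(NAME example9_new_feature
         COMMAND new_feature_exe
         WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR})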

  [1] If you built FTorch against LibTorch (rather than creating a virtual environment), you will need to create a virtual environment for the purposes of integration testing, as the dependency installation will install packages into your Python environment and will check that a virtual environment is in use.