
Testsuites

LightningModuleTests #

Bases: Generic[LightningModuleType], ABC

Suite of generic tests for a LightningModule.

Simply inherit from this class and decorate the subclass with the appropriate markers to get a decent set of unit tests that should apply to almost any LightningModule.

See the project.algorithms.image_classifier_test module for an example.
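A minimal sketch of what such a subclass could look like. The import paths, MyAlgorithm, and the marker are illustrative assumptions, not the documented API:

# Hypothetical sketch: the import paths, MyAlgorithm and the marker are assumptions.
import pytest

from project.algorithms.testsuites.lightning_module_tests import LightningModuleTests
from my_project.algorithms.my_algorithm import MyAlgorithm  # hypothetical module


@pytest.mark.slow  # decorate with whatever markers apply to your algorithm
class TestMyAlgorithm(LightningModuleTests[MyAlgorithm]):
    """Inherits the generic initialization / forward / backward / update tests."""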

Other ideas:

- pytest-benchmark for regression tests on the speed of the forward / backward pass / training step
- pytest-profiling for profiling the training step? (PyTorch variant?)
- Dataset splits: check some basic stats about the train/val/test inputs; are they somewhat similar?
- Define the input as a space, check that the dataset samples are in that space and that not too many samples are statistically OOD?
- Test to monitor distributed traffic out of this process?
- Dummy two-process tests (on CPU) to check before scaling up experiments?

config #

config(dict_config: DictConfig) -> Config

The experiment configuration, with all interpolations resolved.
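Resolving interpolations in an omegaconf DictConfig typically looks like the sketch below; the import path for Config and the function body are assumptions about how this fixture might work, not its actual implementation:

# Hypothetical sketch of resolving a DictConfig into a structured Config object.
from omegaconf import DictConfig, OmegaConf

from project.configs import Config  # assumed import path


def resolved_config(dict_config: DictConfig) -> Config:
    OmegaConf.resolve(dict_config)  # resolve ${...} interpolations in place
    config = OmegaConf.to_object(dict_config)  # instantiate the structured Config
    assert isinstance(config, Config)
    return config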

algorithm #

algorithm(
    config: Config,
    datamodule: LightningDataModule | None,
    trainer: Trainer,
    device: device,
)

Fixture that creates the "algorithm" (usually a LightningModule).
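When the algorithm is described by the Hydra config, creating it often boils down to a call to hydra.utils.instantiate; the config.algorithm field and the keyword arguments below are assumptions for illustration, not this fixture's actual body:

# Hypothetical sketch: instantiate the algorithm from its config entry.
import hydra
from lightning.pytorch import LightningDataModule, Trainer


def create_algorithm(config, datamodule: LightningDataModule | None, trainer: Trainer, device):
    # `config.algorithm` and the `datamodule` kwarg are assumptions about the config schema.
    algorithm = hydra.utils.instantiate(config.algorithm, datamodule=datamodule)
    return algorithm.to(device)  # move the module to the test device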

make_torch_deterministic #

make_torch_deterministic()

Set torch to deterministic mode for unit tests that use the tensor_regression fixture.
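Deterministic mode in PyTorch is usually enabled with the flags below; the exact settings used by this fixture are an assumption:

# Sketch of enabling deterministic PyTorch behaviour (exact flags are an assumption).
import torch


def enable_determinism() -> None:
    torch.use_deterministic_algorithms(True)  # raise an error on nondeterministic ops
    torch.backends.cudnn.benchmark = False  # disable cuDNN autotuning
    torch.backends.cudnn.deterministic = True  # force deterministic cuDNN kernels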

seed #

seed(request: FixtureRequest)

Fixture that seeds everything for reproducibility and yields the random seed used.
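A seeding fixture of this kind is commonly built around Lightning's seed_everything; the default seed of 42 and the optional indirect parametrization are assumptions:

# Hypothetical sketch of the seeding fixture; the default seed and the indirect
# parametrization are assumptions.
import pytest
from lightning.pytorch import seed_everything


@pytest.fixture
def seed(request: pytest.FixtureRequest):
    random_seed = getattr(request, "param", 42)  # allow indirect parametrization
    seed_everything(random_seed, workers=True)  # seed python, numpy and torch
    yield random_seed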

training_step_content #

training_step_content(
    datamodule: LightningDataModule | None,
    algorithm: LightningModuleType,
    seed: int,
    accelerator: str,
    devices: int | list[int],
    tmp_path_factory: TempPathFactory,
)

Fixture that runs a training step and makes various things available for tests.
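One common way to capture the contents of a training step is with a Lightning Callback that records the first batch and the training_step outputs; the callback below is an illustrative sketch, not the implementation behind StuffFromFirstTrainingStep:

# Illustrative sketch: record the first training batch and its outputs.
from typing import Any

from lightning.pytorch import Callback


class RecordFirstTrainingStep(Callback):
    def __init__(self) -> None:
        self.batch: Any = None
        self.outputs: Any = None

    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx: int) -> None:
        if batch_idx == 0:  # keep only what happened during the first step
            self.batch = batch
            self.outputs = outputs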

test_initialization_is_reproducible #

test_initialization_is_reproducible(
    training_step_content: StuffFromFirstTrainingStep,
    tensor_regression: TensorRegressionFixture,
    accelerator: str,
)

Check that the network initialization is reproducible given the same random seed.
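The core of such a check is comparing the freshly initialized parameters against stored reference tensors. The sketch below assumes the tensor_regression fixture exposes a check(...) method in the style of pytest-regressions; the actual test presumably pulls the initial weights from training_step_content:

# Illustrative sketch of the regression check on a module's initial parameters.
# Assumes `tensor_regression.check(...)` exists, in the style of pytest-regressions.
def check_initial_weights(algorithm, tensor_regression) -> None:
    tensor_regression.check(
        {name: param.detach().cpu() for name, param in algorithm.named_parameters()}
    )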

test_forward_pass_is_reproducible #

test_forward_pass_is_reproducible(
    algorithm: LightningModuleType,
    training_step_content: StuffFromFirstTrainingStep,
    tensor_regression: TensorRegressionFixture,
)

Check that the forward pass is reproducible given the same input and random seed.

Note: There could be more than one call to forward inside a training step. Here we only check the args/kwargs/outputs of the first forward call for now.

test_backward_pass_is_reproducible #

test_backward_pass_is_reproducible(
    training_step_content: StuffFromFirstTrainingStep,
    tensor_regression: TensorRegressionFixture,
    accelerator: str,
)

Check that the backward pass is reproducible given the same weights, inputs and random seed.

test_update_is_reproducible #

test_update_is_reproducible(
    algorithm: LightningModuleType,
    training_step_content: StuffFromFirstTrainingStep,
    tensor_regression: TensorRegressionFixture,
    accelerator: str,
)

Check that the weights after one step of training are the same given the same seed.

do_one_step_of_training #

do_one_step_of_training(
    algorithm: LightningModuleType,
    datamodule: LightningDataModule | None,
    accelerator: str,
    devices: int | list[int] | Literal["auto"],
    callbacks: list[Callback],
    tmp_path: Path,
)

Performs one step of training.

Override this if you train your algorithm differently.
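In practice, overriding this hook usually means building a Trainer with your own settings and running a single training batch. The sketch below (written as a standalone function; in your subclass it would be a method) uses Lightning's fast_dev_run and is an assumption about how the default behaves:

# Hypothetical sketch of an override, using fast_dev_run to run a single batch.
from pathlib import Path

from lightning.pytorch import Callback, LightningDataModule, LightningModule, Trainer


def do_one_step_of_training(
    algorithm: LightningModule,
    datamodule: LightningDataModule | None,
    accelerator: str,
    devices,
    callbacks: list[Callback],
    tmp_path: Path,
):
    trainer = Trainer(
        accelerator=accelerator,
        devices=devices,
        callbacks=callbacks,
        default_root_dir=tmp_path,
        fast_dev_run=True,  # run a single training (and validation) batch
        logger=False,
    )
    trainer.fit(algorithm, datamodule=datamodule)
    # The real hook may also return whatever the callbacks recorded during the step.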