
Algorithms

ExampleAlgorithm

Bases: LightningModule

Example learning algorithm for image classification.

__init__

__init__(
    datamodule: ImageClassificationDataModule,
    network: _Config[Module],
    optimizer: _PartialConfig[Optimizer] = AdamConfig(
        lr=0.0003
    ),
    init_seed: int = 42,
)

Create a new instance of the algorithm.

Parameters:

- datamodule (ImageClassificationDataModule, required): Object used to load train/val/test data. See the Lightning docs for LightningDataModule for more info.
- network (_Config[Module], required): The config of the network to instantiate and train.
- optimizer (_PartialConfig[Optimizer], default: AdamConfig(lr=0.0003)): The config for the Optimizer. Instantiating this will return a function (a functools.partial) that will create the Optimizer given the hyper-parameters.
- init_seed (int, default: 42): The seed to use when initializing the weights of the network.
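To illustrate the partial-optimizer pattern described above, here is a minimal dependency-free sketch. The `AdamConfig` class and its `instantiate()` method are stand-ins (names and behavior are assumptions, not the template's real API); the point is only that instantiating the config yields a `functools.partial` that builds the optimizer once the model parameters are known.

```python
import functools
from dataclasses import dataclass

# Hypothetical stand-in for the template's AdamConfig/_PartialConfig types;
# the .instantiate() method and dict-building "optimizer" are assumptions
# made to keep the sketch dependency-free.
@dataclass
class AdamConfig:
    lr: float = 3e-4

    def instantiate(self):
        # The real template would return something like
        # functools.partial(torch.optim.Adam, lr=self.lr).
        def make_optimizer(params, lr):
            return {"params": list(params), "lr": lr}

        return functools.partial(make_optimizer, lr=self.lr)

optimizer_fn = AdamConfig(lr=0.0003).instantiate()  # a functools.partial
optimizer = optimizer_fn(params=[0.1, 0.2])         # parameters supplied later
```

Deferring the parameters this way lets the config be built (e.g. from Hydra/CLI overrides) before the network, and hence its parameters, exists.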

HFExample

Bases: LightningModule

Example of a lightning module used to train a HuggingFace model.

configure_optimizers

configure_optimizers()

Prepare the optimizer and learning-rate schedule (linear warmup and decay).
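The schedule named above has a standard shape (the same one produced by `transformers.get_linear_schedule_with_warmup`): the learning-rate multiplier rises linearly from 0 to 1 over the warmup steps, then decays linearly back to 0 by the final step. A minimal sketch of that factor, with hypothetical step counts:

```python
def linear_warmup_decay(step: int, warmup_steps: int, total_steps: int) -> float:
    """Multiplicative LR factor: linear warmup to 1.0, then linear decay to 0.0."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Factor sampled at a few steps of a hypothetical 100-step run with 10 warmup steps.
factors = [linear_warmup_decay(s, warmup_steps=10, total_steps=100)
           for s in (0, 5, 10, 55, 100)]
```

In practice this factor would be handed to something like `torch.optim.lr_scheduler.LambdaLR`; the pure function above just shows the shape of the schedule.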

JaxExample

Bases: LightningModule

Example of a learning algorithm (LightningModule) that uses Jax.

In this case, the network is a flax.linen.Module whose forward and backward passes are written in Jax, while the loss function is computed in PyTorch.

HParams dataclass

Hyper-parameters of the algo.
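The hyper-parameter dataclass pattern can be sketched with the standard library; the fields below are hypothetical placeholders, not the real fields of `JaxExample.HParams`.

```python
from dataclasses import dataclass

# Hypothetical fields; the real HParams are defined in the template.
@dataclass(frozen=True)
class HParams:
    lr: float = 1e-3
    batch_size: int = 128
    seed: int = 42

hp = HParams(lr=3e-4)  # override one field, keep the other defaults
```

Using a frozen dataclass keeps the hyper-parameters immutable and hashable, which is convenient for logging and config comparison.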

NoOp

Bases: LightningModule

No-op algorithm that performs no learning; used to benchmark the dataloading speed.
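The idea behind this benchmark can be shown without Lightning: iterate over batches while doing no forward or backward pass, so the elapsed time reflects dataloading alone. The function name and the `range(...)` stand-in for a dataloader are illustrative assumptions.

```python
import time

# Dependency-free sketch: consume batches without computing any loss, so the
# measured time is dominated by the (data)loader itself.
def benchmark_dataloading(loader, num_batches: int = 100) -> float:
    start = time.perf_counter()
    for i, _batch in enumerate(loader):
        if i + 1 >= num_batches:
            break
    return time.perf_counter() - start

elapsed = benchmark_dataloading(range(1_000), num_batches=10)
```

In the actual `NoOp` module the same effect is achieved inside the Lightning training loop, so the benchmark also includes Lightning's per-batch overhead.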