classy.optim.factories
Classes
AdafactorWithWarmupFactory
Factory for the Adafactor optimizer with a warmup learning rate scheduler. Reference paper for Adafactor: https://arxiv.org/abs/1804.04235
__init__
AdagradWithWarmupFactory
Factory for the Adagrad optimizer with a warmup learning rate scheduler. Reference paper for Adagrad: https://jmlr.org/papers/v12/duchi11a.html
__init__
AdamWWithWarmupFactory
Factory for the AdamW optimizer with a warmup learning rate scheduler. Reference paper for AdamW: https://arxiv.org/abs/1711.05101
__init__
AdamWithWarmupFactory
Factory for the Adam optimizer with a warmup learning rate scheduler. Reference paper for Adam: https://arxiv.org/abs/1412.6980
__init__
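The four factories above pair their optimizer with a warmup learning rate scheduler. As an illustrative sketch only (the function name and parameters here are assumptions, not classy's actual implementation), a linear warmup can be expressed with PyTorch's `LambdaLR`: the learning rate ramps from 0 to its configured value over the first steps and stays constant afterwards.

```python
import torch
from torch.optim.lr_scheduler import LambdaLR

def linear_warmup(optimizer: torch.optim.Optimizer, num_warmup_steps: int) -> LambdaLR:
    """Scale the learning rate from 0 to 1x over num_warmup_steps steps."""
    def lr_lambda(step: int) -> float:
        if step < num_warmup_steps:
            return step / max(1, num_warmup_steps)  # linear ramp 0 -> 1
        return 1.0  # full learning rate after warmup
    return LambdaLR(optimizer, lr_lambda)

model = torch.nn.Linear(4, 2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sched = linear_warmup(opt, num_warmup_steps=10)
```

After 10 calls to `sched.step()` the optimizer runs at its full configured learning rate (1e-3 here).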
Factory
Factory interface that allows for simple instantiation of optimizers and schedulers for PyTorch Lightning. This class is essentially a work-around for lazy instantiation:
* all params except the module to be optimized are received in __init__
* the actual instantiation of optimizers and schedulers takes place in the __call__ method, where the module to be optimized is provided

__call__ will be invoked in the configure_optimizers hook of LightningModule-s and its return object directly returned. As such, the return type of __call__ can be any of those allowed by configure_optimizers, namely:
* Single optimizer
* List or Tuple - a list of optimizers
* Two lists - the first list has multiple optimizers, the second a list of LR schedulers (or lr_dict)
* Dictionary with an ‘optimizer’ key, and (optionally) a ‘lr_scheduler’ key whose value is a single LR scheduler or lr_dict
* Tuple of dictionaries as described above, with an optional ‘frequency’ key
* None - fit will run without any optimizer
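The lazy-instantiation pattern described above can be sketched as follows. This is a minimal illustration, not classy's actual code: the class name and constructor parameters are assumptions; only the split between __init__ (hyper-parameters) and __call__ (module) mirrors the interface.

```python
import torch

class AdamFactorySketch:
    """Illustrative Factory: stores hyper-parameters now, builds the optimizer later."""

    def __init__(self, lr: float, weight_decay: float = 0.0):
        # All params except the module to be optimized are received here.
        self.lr = lr
        self.weight_decay = weight_decay

    def __call__(self, module: torch.nn.Module) -> torch.optim.Optimizer:
        # Invoked inside LightningModule.configure_optimizers, where the
        # module finally becomes available. Any return shape accepted by
        # configure_optimizers is allowed; here, a single optimizer.
        return torch.optim.Adam(
            module.parameters(), lr=self.lr, weight_decay=self.weight_decay
        )

factory = AdamFactorySketch(lr=1e-3)
optimizer = factory(torch.nn.Linear(4, 2))
```

Deferring the module argument is what lets the factory be built from configuration long before the model exists.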
Subclasses (2)
RAdamFactory
Factory for the RAdam optimizer. Reference paper for RAdam: https://arxiv.org/abs/1908.03265
__init__
TorchFactory
Simple factory wrapping standard PyTorch optimizers and schedulers.
WeightDecayOptimizer
Factory interface that allows for simple instantiation of optimizers and schedulers for PyTorch Lightning. This class is essentially a work-around for lazy instantiation:
* all params except the module to be optimized are received in __init__
* the actual instantiation of optimizers and schedulers takes place in the __call__ method, where the module to be optimized is provided

__call__ will be invoked in the configure_optimizers hook of LightningModule-s and its return object directly returned. As such, the return type of __call__ can be any of those allowed by configure_optimizers, namely:
* Single optimizer
* List or Tuple - a list of optimizers
* Two lists - the first list has multiple optimizers, the second a list of LR schedulers (or lr_dict)
* Dictionary with an ‘optimizer’ key, and (optionally) a ‘lr_scheduler’ key whose value is a single LR scheduler or lr_dict
* Tuple of dictionaries as described above, with an optional ‘frequency’ key
* None - fit will run without any optimizer