openfold.utils.lr_schedulers
Classes

| Class | Description |
| --- | --- |
| AlphaFoldLRScheduler | Implements the learning rate schedule defined in the AlphaFold 2 supplement. |
- class AlphaFoldLRScheduler(optimizer, last_epoch=-1, verbose=False, base_lr=0.0, max_lr=0.001, warmup_no_steps=1000, start_decay_after_n_steps=50000, decay_every_n_steps=50000, decay_factor=0.95)
Bases: _LRScheduler

Implements the learning rate schedule defined in the AlphaFold 2 supplement. A linear warmup is followed by a plateau at the maximum learning rate and then exponential decay.
Note that the initial learning rate of the optimizer in question is ignored; use this class’ base_lr parameter to specify the starting point of the warmup.
- Parameters:
  - optimizer – The torch.optim optimizer whose learning rate is scheduled.
  - last_epoch (int, default -1) – The index of the last step; -1 starts the schedule from scratch.
  - verbose (bool, default False) – If True, prints a message on each learning rate update.
  - base_lr (float, default 0.0) – The learning rate at the start of the warmup.
  - max_lr (float, default 0.001) – The learning rate reached at the end of the warmup and held during the plateau.
  - warmup_no_steps (int, default 1000) – The number of linear warmup steps.
  - start_decay_after_n_steps (int, default 50000) – The step after which exponential decay begins.
  - decay_every_n_steps (int, default 50000) – The number of steps over which the learning rate is multiplied once by decay_factor.
  - decay_factor (float, default 0.95) – The multiplicative decay factor.
- get_lr()

  Compute the learning rate for the current step according to the warmup/plateau/decay schedule.

- load_state_dict(state_dict)

  Load the scheduler's state from a previously saved state_dict.

- state_dict()

  Return the state of the scheduler as a dict, suitable for checkpointing.
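For orientation, the following is a minimal sketch of the schedule implied by the description and parameter names above. It is not the class's actual get_lr() implementation; the handling of the transition boundaries, and whether the decay exponent is continuous or stepped, are assumptions.

```python
def sketch_lr(
    step,
    base_lr=0.0,
    max_lr=0.001,
    warmup_no_steps=1000,
    start_decay_after_n_steps=50000,
    decay_every_n_steps=50000,
    decay_factor=0.95,
):
    # Linear warmup from base_lr up to max_lr.
    if step <= warmup_no_steps:
        return base_lr + (max_lr - base_lr) * step / warmup_no_steps
    # Plateau at the maximum learning rate.
    if step <= start_decay_after_n_steps:
        return max_lr
    # Exponential decay: the rate is multiplied by decay_factor once per
    # decay_every_n_steps. (A stepped exponent, using //, is an equally
    # plausible reading of the parameter names.)
    exponent = (step - start_decay_after_n_steps) / decay_every_n_steps
    return max_lr * decay_factor ** exponent
```

And a minimal usage sketch, assuming a standard PyTorch training loop. The model, optimizer, and loop body are placeholders, and calling scheduler.step() once per optimizer step is an assumption based on the step-denominated parameter names.

```python
import torch

from openfold.utils.lr_schedulers import AlphaFoldLRScheduler

# Placeholder model; any torch.nn.Module works here.
model = torch.nn.Linear(8, 8)

# The optimizer's own lr is ignored; base_lr below sets the warmup start.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

scheduler = AlphaFoldLRScheduler(
    optimizer,
    base_lr=0.0,
    max_lr=1e-3,
    warmup_no_steps=1000,
    start_decay_after_n_steps=50000,
    decay_every_n_steps=50000,
    decay_factor=0.95,
)

for step in range(100):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 8)).sum()  # placeholder loss
    loss.backward()
    optimizer.step()
    scheduler.step()  # assumed: advance the schedule once per optimizer step
```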