controllers module
Full documentation for the hippynn.experiment.controllers module.
Managing optimizer, scheduler, and end of training
- class Controller(optimizer, scheduler, batch_size, max_epochs, stopping_key, eval_batch_size=None, fraction_train_eval=0.1, quiet=False)[source]
Bases: object
Class for controlling the training dynamics.
- Parameters:
optimizer – pytorch optimizer (or compatible)
scheduler – pytorch scheduler (or compatible). Pass None to use no scheduler, or pass a list of schedulers to apply them all in sequence.
stopping_key – name (str) of the metric used to select the best model.
batch_size – batch size for training
eval_batch_size – batch size during evaluation
fraction_train_eval – the random fraction of the training set to use during evaluation.
quiet – If true, suppress printing.
max_epochs – maximum number of epochs currently allowed, based on training so far.
Note
If a scheduler defines a set_controller method, it will be run on the Controller instance. This allows a scheduler to modify aspects of training besides the conventional ones found in the optimizer. See RaiseBatchSizeOnPlateau in this module.
- property max_epochs
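A minimal construction sketch (illustrative only; the stand-in model, the metric name "Loss", and the hyperparameter values below are placeholders, not part of the documented API):

    import torch
    from hippynn.experiment.controllers import Controller

    model = torch.nn.Linear(10, 1)  # stand-in for an actual hippynn network
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=0.5, patience=5)

    controller = Controller(
        optimizer=optimizer,
        scheduler=scheduler,      # None or a list of schedulers is also accepted
        batch_size=64,
        max_epochs=500,
        stopping_key="Loss",      # metric name used to track the best model
        eval_batch_size=128,
        fraction_train_eval=0.1,
    )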
- class PatienceController(*args, termination_patience, **kwargs)[source]
Bases: Controller
A subclass of Controller that terminates if training has not improved for a given number of epochs.
- Variables:
patience – How many epochs are allowed without improvement before termination.
last_best – The epoch number of the last best epoch encountered.
- property max_epochs
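As a sketch, PatienceController takes the same arguments as Controller plus the keyword-only termination_patience (the model and values below are placeholders):

    import torch
    from hippynn.experiment.controllers import PatienceController

    model = torch.nn.Linear(10, 1)  # stand-in for an actual hippynn network
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    controller = PatienceController(
        optimizer=optimizer,
        scheduler=None,
        batch_size=64,
        max_epochs=500,
        stopping_key="Loss",
        termination_patience=20,  # end training after 20 epochs without improvement
    )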
- class RaiseBatchSizeOnPlateau(optimizer, max_batch_size, factor=0.5, patience=10, threshold=0.0001, threshold_mode='rel', verbose=None, controller=None)[source]
Bases: ReduceLROnPlateau
Scheduler that raises the batch size on plateau, compatible with pytorch learning rate schedulers.
Note: The verbose parameter has been deprecated and no longer does anything.
This roughly implements the scheme outlined in the following paper:
"Don't Decay the Learning Rate, Increase the Batch Size" Samuel L. Smith et al., 2018. arXiv:1711.00489 Published as a conference paper at ICLR 2018.
Until max_batch_size has been reached. After that, the learning rate is decayed.
Note
To use this scheduler, build it, then link it to the container which governs the batch size using set_controller. The default base hippynn Controller will do this automatically.
- state_dict()[source]
Return the state of the scheduler as a dict. It contains an entry for every variable in self.__dict__ which is not the optimizer.
- property optimizer
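Putting it together, a hedged sketch of pairing RaiseBatchSizeOnPlateau with the base Controller (the model and numeric values are placeholders; per the note above, the base Controller links the scheduler via set_controller automatically):

    import torch
    from hippynn.experiment.controllers import Controller, RaiseBatchSizeOnPlateau

    model = torch.nn.Linear(10, 1)  # stand-in for an actual hippynn network
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Raise the batch size when the tracked metric plateaus; once max_batch_size
    # is reached, fall back to decaying the learning rate.
    scheduler = RaiseBatchSizeOnPlateau(
        optimizer=optimizer,
        max_batch_size=512,
        factor=0.5,
        patience=10,
    )

    controller = Controller(
        optimizer=optimizer,
        scheduler=scheduler,  # the base Controller calls set_controller on it
        batch_size=32,
        max_epochs=500,
        stopping_key="Loss",
    )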