    replace LambdaLR scheduler wrappers by function · 022525b0
    Rémi Louf authored
    Custom schedulers are currently created by subclassing PyTorch's
    LambdaLR class and passing a method of the subclass to LambdaLR's
    __init__ function. This approach is inappropriate for several
    reasons:
    
    1. there is no need to define a class when all it does is implement
    an __init__() method;
    2. initialising the parent class with a bound method of the child class
    creates a cyclical reference, which leads to memory leaks. See issues #1742 and #1134.
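    The problematic pattern can be sketched as follows (a minimal reconstruction
    for illustration; the class name `WarmupConstantSchedule` and the warmup logic
    are assumptions, not a verbatim copy of the removed code):

    ```python
    from torch.optim.lr_scheduler import LambdaLR

    class WarmupConstantSchedule(LambdaLR):
        # Old pattern: the parent is initialised with a bound method of the
        # instance under construction, so the scheduler holds a reference to
        # itself through the stored lambda, creating a reference cycle.
        def __init__(self, optimizer, warmup_steps, last_epoch=-1):
            self.warmup_steps = warmup_steps
            super().__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)

        def lr_lambda(self, step):
            if step < self.warmup_steps:
                return float(step) / float(max(1.0, self.warmup_steps))
            return 1.0
    ```

    The cycle is visible directly: the bound method stored by LambdaLR points
    back at the scheduler instance itself.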
    
    In this commit we replace the wrapper classes with functions that
    instantiate `LambdaLR` with a custom learning rate function. We use a
    closure to capture the parameters of the latter. We also rename a few
    variables within the functions to make the behaviour explicit, and
    remove docstrings that are no longer necessary.
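    The replacement pattern can be sketched as below (a minimal illustration of
    the closure approach; the function name and warmup logic are assumptions
    standing in for the actual schedules in optimization.py):

    ```python
    from torch.optim.lr_scheduler import LambdaLR

    def get_constant_schedule_with_warmup(optimizer, num_warmup_steps, last_epoch=-1):
        # The closure captures num_warmup_steps; the returned LambdaLR holds
        # a plain function, not a bound method, so no reference cycle is formed.
        def lr_lambda(current_step):
            if current_step < num_warmup_steps:
                return float(current_step) / float(max(1.0, num_warmup_steps))
            return 1.0

        return LambdaLR(optimizer, lr_lambda, last_epoch=last_epoch)
    ```

    The returned object is an ordinary LambdaLR instance, so callers use it
    exactly as before (`scheduler.step()` after each optimizer step).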
optimization.py 7.48 KB