    Fix LARC with mixed precision (#793) · 2ec84ebd
    Vinicius Reis authored
    The LARC optimizer wraps an underlying optimizer and then needs to be passed
    to amp.initialize for mixed precision. Three different crashes occurred in
    this situation; fix all of them and add a unit test.
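    A minimal sketch of the scenario this commit addresses: LARC wrapping a base
    optimizer and then being passed through amp.initialize. The model and
    hyperparameters below are placeholders for illustration, not taken from the
    commit itself.

    ```python
    # Sketch of the LARC + amp.initialize setup, assuming apex is installed
    # and a CUDA device is available.
    import torch
    from apex import amp
    from apex.parallel.LARC import LARC

    model = torch.nn.Linear(10, 10).cuda()
    base_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # LARC wraps the underlying optimizer...
    optimizer = LARC(base_optimizer)

    # ...and the wrapped optimizer is what gets handed to amp.initialize,
    # which is where the crashes described above were triggered.
    model, optimizer = amp.initialize(model, optimizer, opt_level="O1")
    ```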
    
    I don't know whether the 'LARC' in sys.modules check ever worked. In my setup,
    the entry in sys.modules is 'apex.parallel.LARC'. Checking whether the
    variable is defined seems more reliable.
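    A hedged illustration of the point above (not the exact code in
    _initialize.py): imported modules are keyed in sys.modules by their full
    dotted name, so a bare-name lookup can miss them, whereas checking whether
    the name was successfully imported does not depend on how sys.modules is
    keyed. The is_larc helper is hypothetical.

    ```python
    import sys

    # The bare name is usually not a key; the dotted path is.
    print('LARC' in sys.modules)                 # typically False
    print('apex.parallel.LARC' in sys.modules)   # True once the module is imported

    # Checking whether the name itself is defined is independent of
    # how sys.modules happens to be keyed.
    try:
        from apex.parallel.LARC import LARC
    except ImportError:
        LARC = None

    def is_larc(optimizer):
        # Hypothetical helper: detect a LARC-wrapped optimizer only if
        # the LARC class could actually be imported.
        return LARC is not None and isinstance(optimizer, LARC)
    ```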
_initialize.py