Add "float16" and "bfloat16" precision when training with the Lightning task
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/381

Introduce an extra parameter, SOLVER.AMP.PRECISION, which can be used to control mixed-precision training when the Lightning backend is used. The previous value `precision: "mixed"` was wrong and training failed (see screenshot below). {F777576618}

I had to make AMP.PRECISION a string and ensure it works with two values: "float16" and "bfloat16". Before feeding it to the Trainer we convert the "float16" string to the integer value 16. Such a workaround was unavoidable because D2Go's (https://github.com/facebookresearch/d2go/commit/87374efb134e539090e0b5c476809dc35bf6aedb) config value cannot be an int and a str at the same time.

Reviewed By: wat3rBro

Differential Revision: D40035367

fbshipit-source-id: ed4f615ab29a2258164cbe179a9adba11559d804
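A minimal sketch of the string-to-Trainer conversion described above. The helper name and the "bf16" mapping are assumptions for illustration (Lightning versions that support bfloat16 accept `precision="bf16"`); the actual d2go code may differ.

```python
def get_trainer_precision(amp_enabled: bool, precision: str):
    """Map the SOLVER.AMP.PRECISION string to a Lightning Trainer value.

    The config value must be a string because a D2Go config option cannot
    be an int and a str at the same time, so "float16" is translated to
    the integer 16 that the Trainer expects.

    Hypothetical helper; names and the bfloat16 mapping are assumptions.
    """
    if not amp_enabled:
        return 32  # full precision when AMP is disabled
    if precision == "float16":
        return 16  # Trainer(precision=16) enables fp16 mixed precision
    if precision == "bfloat16":
        return "bf16"  # assumed mapping to Lightning's bf16 mode
    raise ValueError(f"Unsupported SOLVER.AMP.PRECISION: {precision!r}")
```

The returned value would then be passed straight to `pl.Trainer(precision=...)`.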