Use non-experimental mixed precision API for official models.

For all modified calls to set_mixed_precision_policy(), the loss_scale argument was removed, since it cannot be passed when the non-experimental API is used. In every such caller, the loss_scale is later used to explicitly create a LossScaleOptimizer, so removing the argument has no impact. Switching to the non-experimental LossScaleOptimizer also has no effect: its behavior is near identical, and all isinstance checks within the official models target the non-experimental version.

PiperOrigin-RevId: 368101975
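The "near identical behavior" of the two LossScaleOptimizer variants is dynamic loss scaling: the loss is multiplied by a scale factor before the backward pass so small float16 gradients do not underflow, gradients are unscaled before being applied, overflowing steps are skipped with the scale halved, and the scale is grown after a run of stable steps. A minimal pure-Python sketch of that mechanism, for illustration only; the class and parameter names here (DynamicLossScaler, growth_interval) are assumptions, not TensorFlow's actual implementation:

```python
import math


class DynamicLossScaler:
    """Sketch of dynamic loss scaling: double the scale after
    `growth_interval` finite steps, halve it on gradient overflow."""

    def __init__(self, initial_scale=2.0 ** 15, growth_interval=2000):
        self.scale = initial_scale
        self.growth_interval = growth_interval
        self._good_steps = 0

    def scale_loss(self, loss):
        # Multiply the loss by the scale so that small float16
        # gradients do not underflow to zero in the backward pass.
        return loss * self.scale

    def update(self, scaled_grads):
        # Called with the (still scaled) gradients after the backward
        # pass; returns unscaled gradients, or None to skip the step.
        if any(not math.isfinite(g) for g in scaled_grads):
            self.scale /= 2          # overflow: shrink scale, skip step
            self._good_steps = 0
            return None
        unscaled = [g / self.scale for g in scaled_grads]
        self._good_steps += 1
        if self._good_steps >= self.growth_interval:
            self.scale *= 2          # stable run: try a larger scale
            self._good_steps = 0
        return unscaled
```

In the real API the optimizer wrapper performs these steps internally, which is why the explicit loss_scale argument on the policy is redundant once the caller constructs a LossScaleOptimizer itself.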