`apex.amp` is a tool to enable mixed precision training by changing only 3 lines of your script.
Users can easily experiment with different pure and mixed precision training modes by supplying
different flags to `amp.initialize`.
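The three changed lines can be sketched as follows. This is a minimal illustration assuming a standard PyTorch training loop on a CUDA device; the model, optimizer, and loss shown here are placeholders, not part of the Amp API.

```python
import torch
from apex import amp

# Standard FP32 model and optimizer, as in any PyTorch script
model = torch.nn.Linear(128, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Change 1: let Amp patch the model and optimizer for the chosen opt_level
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

inputs = torch.randn(32, 128).cuda()
targets = torch.randint(0, 10, (32,)).cuda()

loss = torch.nn.functional.cross_entropy(model(inputs), targets)

# Changes 2 and 3: replace loss.backward() with a scaled backward pass,
# so gradients are computed on the (loss-scaled) value Amp manages
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()
```

The rest of the training loop is unchanged; switching between precision modes is then just a matter of editing the `opt_level` string.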
[DCGAN example coming soon...](https://github.com/NVIDIA/apex/tree/master/examples/dcgan)
[Moving to the new Amp API](https://nvidia.github.io/apex/amp.html#transition-guide-for-old-api-users) (for users of the deprecated tools formerly called "Amp" and "FP16_Optimizer")
- ``keep_batchnorm_fp32``: To enhance precision and enable cudnn batchnorm (which improves performance), it's often beneficial to keep batchnorm weights in FP32 even if the rest of the model is FP16.
- ``master_weights``: Maintain FP32 master weights to accompany any FP16 model weights. FP32 master weights are stepped by the optimizer to enhance precision and capture small gradients.
- ``loss_scale``: If ``loss_scale`` is a float value, use this value as the static (fixed) loss scale. If ``loss_scale`` is the string ``"dynamic"``, adaptively adjust the loss scale over time. Dynamic loss scale adjustments are performed by Amp automatically.
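As a concrete illustration of these properties, the sketch below passes them as keyword arguments to `amp.initialize` to override the defaults implied by an ``opt_level``. The particular combination of values here is an example only, not a recommendation; `model` and `optimizer` are assumed to be an ordinary CUDA model and PyTorch optimizer.

```python
from apex import amp

# Hypothetical override of individual Amp properties on top of opt_level O2:
# keep batchnorm weights in FP32 and use dynamic loss scaling.
model, optimizer = amp.initialize(
    model,
    optimizer,
    opt_level="O2",
    keep_batchnorm_fp32=True,
    loss_scale="dynamic",
)
```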
Again, you often don't need to specify these properties by hand. Instead, select an ``opt_level``,