Commit 1dca16cc authored by Michael Carilli

Making O1 the default opt level

parent b82c6bd7
@@ -195,7 +195,7 @@ def initialize(
     models,
     optimizers=None,
     enabled=True,
-    opt_level=None,
+    opt_level="O1",
     cast_model_type=None,
     patch_torch_functions=None,
     keep_batchnorm_fp32=None,
@@ -233,7 +233,7 @@ def initialize(
             REQUIRED for training, optional for inference.
         enabled (bool, optional, default=True): If False, renders all Amp calls no-ops, so your script
             should run as if Amp were not present.
-        opt_level (str, required): Pure or mixed precision optimization level. Accepted values are
+        opt_level (str, optional, default="O1"): Pure or mixed precision optimization level. Accepted values are
             "O0", "O1", "O2", and "O3", explained in detail above.
         cast_model_type (``torch.dtype``, optional, default=None): Optional property override, see
             above.
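For context, a minimal sketch of how a caller can now rely on the default rather than passing opt_level explicitly. The model, optimizer, and tensor shapes below are illustrative placeholders, not part of this commit.

# Sketch: amp.initialize without an explicit opt_level now uses "O1"
# (mixed precision via patched torch functions). Names here are examples only.
import torch
from apex import amp

model = torch.nn.Linear(10, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Before this commit, opt_level had to be supplied; after it, omitting the
# argument is equivalent to opt_level="O1".
model, optimizer = amp.initialize(model, optimizer)

loss = model(torch.randn(4, 10).cuda()).sum()
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()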