Unverified Commit 7b07e123 authored by Stas Bekman, committed by GitHub

doc fix (#651)


Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
parent e729a3fd
@@ -54,6 +54,7 @@ title: "DeepSpeed Configuration JSON"
}
```
The Adam optimizer also supports the following "params" keys/values in addition to the standard parameters from [torch.optim.Adam](https://pytorch.org/docs/stable/_modules/torch/optim/adam.html#Adam):

| "params" key | Description                                                                  | Default |
| ------------ | ---------------------------------------------------------------------------- | ------- |
| torch\_adam  | Use torch's implementation of Adam instead of our fused Adam implementation  | false   |
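For illustration, here is a minimal sketch of an ***optimizer*** block that opts into torch's Adam via this key; the `lr`, `betas`, `eps`, and `weight_decay` values are placeholder assumptions, not recommendations from this doc:

```json
{
  "optimizer": {
    "type": "Adam",
    "params": {
      "lr": 0.001,
      "betas": [0.9, 0.999],
      "eps": 1e-8,
      "weight_decay": 0,
      "torch_adam": true
    }
  }
}
```

Per the table above, setting `torch_adam` to `true` makes DeepSpeed use torch's Adam implementation instead of the fused one; the remaining `params` entries are the standard `torch.optim.Adam` arguments.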
@@ -97,7 +98,7 @@ Example of ***scheduler***
"warmup_max_lr": 0.001,
"warmup_num_steps": 1000
}
}
}
```
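Since the diff shows only the tail of the example, here is a sketch of the complete ***scheduler*** block it likely comes from, assuming DeepSpeed's `WarmupLR` scheduler; the `"type"` and `warmup_min_lr` lines are assumptions about the elided opening lines:

```json
{
  "scheduler": {
    "type": "WarmupLR",
    "params": {
      "warmup_min_lr": 0,
      "warmup_max_lr": 0.001,
      "warmup_num_steps": 1000
    }
  }
}
```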
### Communication options