Unverified commit c5e42641 authored by Stas Bekman, committed by GitHub

Add AdamW to the supported optimizers (#672)


Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
parent 6217a6c2
@@ -34,7 +34,7 @@ title: "DeepSpeed Configuration JSON"
 | Fields | Value | Example |
 | ------ | ------------------------------------------------------------ | ------------------------------ |
-| type | The optimizer name. DeepSpeed natively supports **Adam**, **OneBitAdam**, and **Lamb** optimizers and will import other optimizers from [torch](https://pytorch.org/docs/stable/optim.html). | `"Adam"` |
+| type | The optimizer name. DeepSpeed natively supports **Adam**, **AdamW**, **OneBitAdam**, and **Lamb** optimizers and will import other optimizers from [torch](https://pytorch.org/docs/stable/optim.html). | `"Adam"` |
 | params | Dictionary of parameters to instantiate optimizer. The parameter names must match the optimizer constructor signature (e.g., for [Adam](https://pytorch.org/docs/stable/optim.html#torch.optim.Adam)). | `{"lr": 0.001, "eps": 1e-8}` |
 Example of ***optimizer*** with Adam
......
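For context, a minimal sketch of a complete ***optimizer*** section using the newly supported AdamW. The parameter names follow the `torch.optim.AdamW` constructor signature; the values shown are illustrative defaults, not taken from this commit:

```json
{
  "optimizer": {
    "type": "AdamW",
    "params": {
      "lr": 0.001,
      "betas": [0.9, 0.999],
      "eps": 1e-8,
      "weight_decay": 0.01
    }
  }
}
```

Per the table above, DeepSpeed passes the `params` dictionary through to the optimizer constructor, so any keyword accepted by `torch.optim.AdamW` can appear here.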