Commit 3f4fc501 authored by Jerry Ma, committed by Facebook Github Bot

Miscellaneous documentation improvements: (#868)

Summary:
- More clearly document the correspondence between FairseqAdam and torch.optim.AdamW
- Add ResamplingDataset to Sphinx docs
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/868

Differential Revision: D17523244

Pulled By: jma127

fbshipit-source-id: 8e7b34b24889b2c8f70b09a52a625d2af135734b
parent 3b09b98b
@@ -30,6 +30,8 @@ provide additional functionality:
     :members:
 .. autoclass:: fairseq.data.ConcatDataset
     :members:
+.. autoclass:: fairseq.data.ResamplingDataset
+    :members:
 .. autoclass:: fairseq.data.RoundRobinZipDatasets
     :members:
 .. autoclass:: fairseq.data.TransformEosDataset
@@ -15,6 +15,12 @@ from . import FairseqOptimizer, register_optimizer
 @register_optimizer('adam')
 class FairseqAdam(FairseqOptimizer):
+    """Adam optimizer for fairseq.
+
+    Important note: this optimizer corresponds to the "AdamW" variant of
+    Adam in its weight decay behavior. As such, it is most closely
+    analogous to torch.optim.AdamW from PyTorch.
+    """
+
     def __init__(self, args, params):
         super().__init__(args)
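The distinction the new docstring draws can be made concrete with a small sketch. In classic Adam with L2 regularization, the weight-decay term is folded into the gradient and therefore gets rescaled by the adaptive denominator; in the AdamW variant, decay is applied to the weights directly. The sketch below is a deliberately simplified single step (no first-moment averaging, no bias correction) in plain Python; the function and variable names are illustrative, not fairseq's actual implementation.

```python
import math

def adam_l2_step(w, grad, v, lr, wd, eps=1e-8):
    # Classic Adam + L2: decay is folded into the gradient, so it is
    # rescaled by the adaptive term 1/sqrt(v) along with the gradient.
    g = grad + wd * w
    v = v + g * g  # stand-in for the second-moment accumulator
    return w - lr * g / (math.sqrt(v) + eps), v

def adamw_step(w, grad, v, lr, wd, eps=1e-8):
    # AdamW (decoupled decay): only the raw gradient is adaptively
    # rescaled; the decay term acts on the weights directly.
    v = v + grad * grad
    return w - lr * grad / (math.sqrt(v) + eps) - lr * wd * w, v

w, grad, v, lr, wd = 1.0, 0.5, 0.0, 0.1, 0.1
w_l2, _ = adam_l2_step(w, grad, v, lr, wd)
w_dec, _ = adamw_step(w, grad, v, lr, wd)
print(w_l2, w_dec)  # the two updates differ: ~0.90 vs ~0.89
```

Because the adaptive denominator differs between the two formulations, the updates diverge even on this single step, which is why FairseqAdam with nonzero `--weight-decay` matches torch.optim.AdamW rather than torch.optim.Adam.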