Commit 2a703773 authored by Sylvain Gugger's avatar Sylvain Gugger

Fix style

parent cd5565be
@@ -106,7 +106,7 @@ class TrainingArguments:
         learning_rate (:obj:`float`, `optional`, defaults to 5e-5):
             The initial learning rate for :class:`~transformers.AdamW` optimizer.
         weight_decay (:obj:`float`, `optional`, defaults to 0):
-            The weight decay to apply (if not zero) to all layers except all bias and LayerNorm weights in
+            The weight decay to apply (if not zero) to all layers except all bias and LayerNorm weights in
             :class:`~transformers.AdamW` optimizer.
         adam_beta1 (:obj:`float`, `optional`, defaults to 0.9):
             The beta1 hyperparameter for the :class:`~transformers.AdamW` optimizer.
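For context, a minimal sketch of how the hyperparameters documented above are set through the TrainingArguments API; the output_dir path is a placeholder, and the remaining values restate the documented defaults:

from transformers import TrainingArguments

# Sketch only: output_dir is an illustrative placeholder; the keyword
# arguments below correspond to the docstring entries shown in the diff.
args = TrainingArguments(
    output_dir="output",
    learning_rate=5e-5,   # initial learning rate for the AdamW optimizer
    weight_decay=0.0,     # applied to all layers except bias and LayerNorm weights
    adam_beta1=0.9,       # beta1 hyperparameter for the AdamW optimizer
)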