"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "1fa2d89a9bb98a15e9720190e07d272a42f03d28"
Unverified Commit 70d57118 authored by mathor, committed by GitHub

Fix a writing issue in the comments of trainer.py (#14202)

parent 33fb9833
@@ -244,7 +244,7 @@ class Trainer:
         detailed in :doc:`here <callback>`.
         If you want to remove one of the default callbacks used, use the :meth:`Trainer.remove_callback` method.
-        optimizers (:obj:`Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR`, `optional`): A tuple
+        optimizers (:obj:`Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR]`, `optional`): A tuple
             containing the optimizer and the scheduler to use. Will default to an instance of
             :class:`~transformers.AdamW` on your model and a scheduler given by
             :func:`~transformers.get_linear_schedule_with_warmup` controlled by :obj:`args`.
...
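For context, the corrected annotation simply describes a plain `(optimizer, scheduler)` pair that can be passed to the `Trainer` via its `optimizers` argument. A minimal sketch using only `torch` (the model, learning rate, and warmup length here are illustrative placeholders, not values from the source):

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

# Illustrative model; any nn.Module with parameters works.
model = nn.Linear(4, 2)

# An optimizer, as in the first element of the annotated tuple.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# A LambdaLR schedule (linear warmup over 10 steps, then constant),
# matching the second element: torch.optim.lr_scheduler.LambdaLR.
scheduler = LambdaLR(optimizer, lr_lambda=lambda step: min(1.0, (step + 1) / 10))

# The Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR]
# described by the fixed docstring; pass as Trainer(..., optimizers=optimizers).
optimizers = (optimizer, scheduler)
```

If `optimizers` is left as its default, `Trainer` builds `AdamW` plus a linear-warmup schedule itself, which is what the surrounding docstring lines describe.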