Unverified Commit acfaad74 authored by Stas Bekman, committed by GitHub

[docstring] missing arg (#6933)



* [docstring] missing arg

Add the missing `tie_word_embeddings` entry to the docstring.

* cleanup

* Update src/transformers/configuration_reformer.py
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
parent c3317e1f
@@ -115,6 +115,8 @@ class ReformerConfig(PretrainedConfig):
         vocab_size (:obj:`int`, optional, defaults to 320):
             Vocabulary size of the Reformer model. Defines the different tokens that
             can be represented by the `inputs_ids` passed to the forward method of :class:`~transformers.ReformerModel`.
+        tie_word_embeddings (:obj:`bool`, `optional`, defaults to :obj:`False`):
+            Whether to tie input and output embeddings.
         Example::
...
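For context on what the new docstring entry describes: "tying" input and output embeddings means the output projection reuses the very same weight matrix as the input embedding, so one set of parameters serves both roles. A minimal sketch of that idea, using plain Python lists in place of real tensors (the class and attribute names here are illustrative, not the actual transformers implementation):

```python
class TinyLM:
    """Toy model illustrating the effect of tie_word_embeddings."""

    def __init__(self, vocab_size, hidden_size, tie_word_embeddings=False):
        # Input embedding "matrix": vocab_size x hidden_size.
        self.input_embeddings = [[0.0] * hidden_size for _ in range(vocab_size)]
        if tie_word_embeddings:
            # Tied: the output projection is the *same object*, so any
            # update to one is immediately visible through the other.
            self.output_embeddings = self.input_embeddings
        else:
            # Untied: an independent matrix of the same shape.
            self.output_embeddings = [[0.0] * hidden_size for _ in range(vocab_size)]


tied = TinyLM(vocab_size=4, hidden_size=2, tie_word_embeddings=True)
tied.input_embeddings[0][0] = 1.5
print(tied.output_embeddings[0][0])    # 1.5 — shared weights

untied = TinyLM(vocab_size=4, hidden_size=2, tie_word_embeddings=False)
untied.input_embeddings[0][0] = 1.5
print(untied.output_embeddings[0][0])  # 0.0 — independent weights
```

In a real framework the same effect is achieved by pointing the output layer's weight parameter at the embedding layer's weight tensor, halving the vocabulary-related parameter count.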