Unverified Commit a99f7f5c authored by cronoik, committed by GitHub

Minor typos fixed (#11182)

parent 26212c14
@@ -52,7 +52,7 @@ class ReformerConfig(PretrainedConfig):
             The standard deviation of the normal_initializer for initializing the weight matrices of the axial
             positional encodings.
         axial_pos_shape (:obj:`List[int]`, `optional`, defaults to :obj:`[64, 64]`):
-            The position dims of the axial position encodings. During training the product of the position dims has to
+            The position dims of the axial position encodings. During training, the product of the position dims has to
             be equal to the sequence length.
 
             For more information on how axial position embeddings work, see `Axial Position Encodings
@@ -88,7 +88,7 @@ class ReformerConfig(PretrainedConfig):
         initializer_range (:obj:`float`, `optional`, defaults to 0.02):
             The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
         is_decoder (:obj:`bool`, `optional`, defaults to :obj:`False`):
-            Whether ot not to use a causal mask in addition to the :obj:`attention_mask` passed to
+            Whether or not to use a causal mask in addition to the :obj:`attention_mask` passed to
             :class:`~transformers.ReformerModel`. When using the Reformer for causal language modeling, this argument
             should be set to :obj:`True`.
         layer_norm_eps (:obj:`float`, `optional`, defaults to 1e-12):
@@ -134,7 +134,7 @@ class ReformerConfig(PretrainedConfig):
         pad_token_id (:obj:`int`, `optional`, defaults to 0):
             The token id for the padding token.
         vocab_size (:obj:`int`, `optional`, defaults to 320):\
-            Vocabulary size of the BERT model. Defines the number of different tokens that can be represented by the
+            Vocabulary size of the Reformer model. Defines the number of different tokens that can be represented by the
             :obj:`inputs_ids` passed when calling :class:`~transformers.ReformerModel`.
         tie_word_embeddings (:obj:`bool`, `optional`, defaults to :obj:`False`):
             Whether to tie input and output embeddings.
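For context, a minimal sketch of how the documented fields fit together when building a config. This is illustrative only and not part of the commit; the values below are assumptions chosen so that the product of `axial_pos_shape` (64 * 64 = 4096) matches a 4096-token training sequence, as the docstring requires.

```python
from transformers import ReformerConfig, ReformerModel

# Illustrative values only: during training, the product of axial_pos_shape
# (here 64 * 64 = 4096) must equal the sequence length.
config = ReformerConfig(
    vocab_size=320,             # number of distinct token ids the model can represent
    axial_pos_shape=[64, 64],   # position dims of the axial position encodings
    is_decoder=True,            # adds a causal mask on top of attention_mask for causal LM
    pad_token_id=0,             # token id used for padding
    tie_word_embeddings=False,  # keep input and output embeddings untied
)

model = ReformerModel(config)
print(config.axial_pos_shape, model.config.is_decoder)
```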