Unverified commit 58022e41, authored May 30, 2023 by Vijeth Moudgalya, committed by GitHub on May 30, 2023
#23388 Issue: Update RoBERTa configuration (#23863)
parent 6fc0454b
Showing 2 changed files, with 4 additions and 4 deletions:
  src/transformers/models/roberta/configuration_roberta.py (+2 -2)
  src/transformers/models/roberta_prelayernorm/configuration_roberta_prelayernorm.py (+2 -2)
src/transformers/models/roberta/configuration_roberta.py
@@ -46,7 +46,7 @@ class RobertaConfig(PretrainedConfig):
     Args:
-        vocab_size (`int`, *optional*, defaults to 30522):
+        vocab_size (`int`, *optional*, defaults to 50265):
             Vocabulary size of the RoBERTa model. Defines the number of different tokens that can be represented by the
             `inputs_ids` passed when calling [`RobertaModel`] or [`TFRobertaModel`].
         hidden_size (`int`, *optional*, defaults to 768):
@@ -105,7 +105,7 @@ class RobertaConfig(PretrainedConfig):
     def __init__(
         self,
-        vocab_size=30522,
+        vocab_size=50265,
         hidden_size=768,
         num_hidden_layers=12,
         num_attention_heads=12,
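A minimal sketch (not part of the commit) of what the corrected default means in practice, assuming this change is applied: constructing `RobertaConfig` with no arguments now reports RoBERTa's BPE vocabulary size rather than BERT's 30522. The checkpoint name `roberta-base` is used purely for illustration.

from transformers import RobertaConfig, RobertaTokenizer

# With the updated default, a bare config matches RoBERTa's BPE vocabulary (50265 tokens)
# instead of BERT's WordPiece vocabulary (30522 tokens).
config = RobertaConfig()
print(config.vocab_size)  # 50265

# The default now agrees with the pretrained tokenizer's vocabulary size,
# so a model built from a bare config can consume roberta-base token ids directly.
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
print(len(tokenizer))  # 50265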
src/transformers/models/roberta_prelayernorm/configuration_roberta_prelayernorm.py
@@ -45,7 +45,7 @@ class RobertaPreLayerNormConfig(PretrainedConfig):
     Args:
-        vocab_size (`int`, *optional*, defaults to 30522):
+        vocab_size (`int`, *optional*, defaults to 50265):
             Vocabulary size of the RoBERTa-PreLayerNorm model. Defines the number of different tokens that can be
             represented by the `inputs_ids` passed when calling [`RobertaPreLayerNormModel`] or
             [`TFRobertaPreLayerNormModel`].
@@ -106,7 +106,7 @@ class RobertaPreLayerNormConfig(PretrainedConfig):
     def __init__(
         self,
-        vocab_size=30522,
+        vocab_size=50265,
         hidden_size=768,
         num_hidden_layers=12,
         num_attention_heads=12,
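Analogously, a brief illustrative sketch for the PreLayerNorm variant touched in the second file (again not part of the commit itself): a randomly initialized model built from the bare config allocates a word-embedding matrix sized to the new default.

from transformers import RobertaPreLayerNormConfig, RobertaPreLayerNormModel

# With the updated default, a freshly constructed config reports RoBERTa's vocabulary size.
config = RobertaPreLayerNormConfig()
assert config.vocab_size == 50265

# A model built from this config therefore allocates a 50265-row embedding matrix,
# consistent with RoBERTa's BPE tokenizer (hidden_size keeps its default of 768).
model = RobertaPreLayerNormModel(config)
print(model.get_input_embeddings().weight.shape)  # torch.Size([50265, 768])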