chenpangpang / transformers · Commit 71cced8a (unverified)

OPTForCausalLM lm_head input size should be config.word_embed_proj_dim (#17225)

Authored May 23, 2022 by vfbd; committed May 23, 2022 by GitHub.
Parent: 56f50590
Showing 1 changed file, with 1 addition and 1 deletion:

src/transformers/models/opt/modeling_opt.py (+1, -1)
```diff
@@ -786,7 +786,7 @@ class OPTForCausalLM(OPTPreTrainedModel):
         self.model = OPTModel(config)

         # the lm_head weight is automatically tied to the embed tokens weight
-        self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False)
+        self.lm_head = nn.Linear(config.word_embed_proj_dim, config.vocab_size, bias=False)

         # Initialize weights and apply final processing
         self.post_init()
```
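Why the one-line change matters: in modeling_opt.py the lm_head weight is tied to embed_tokens.weight, and embed_tokens is built with width config.word_embed_proj_dim, not config.hidden_size. For most OPT checkpoints the two are equal, but for facebook/opt-350m the decoder width (1024) differs from the embedding width (512), so constructing lm_head from hidden_size produces a weight whose shape cannot be tied to the embedding. A minimal sketch of the shape mismatch in plain PyTorch (the constants mirror opt-350m; the variable names are illustrative and not taken from the patched file):

```python
import torch.nn as nn

# Dimensions mirroring facebook/opt-350m, where the two widths differ
# (illustrative constants, not read from an actual OPTConfig):
hidden_size = 1024          # width of the decoder layers
word_embed_proj_dim = 512   # width of the token-embedding matrix
vocab_size = 50272

# The embedding that lm_head is tied to; weight shape is
# (vocab_size, word_embed_proj_dim)
embed_tokens = nn.Embedding(vocab_size, word_embed_proj_dim)

# Before the fix: weight shape (vocab_size, hidden_size) -- tying would
# be a shape mismatch
old_lm_head = nn.Linear(hidden_size, vocab_size, bias=False)
assert old_lm_head.weight.shape != embed_tokens.weight.shape

# After the fix: weight shape (vocab_size, word_embed_proj_dim) -- the
# two parameters line up and can share storage
new_lm_head = nn.Linear(word_embed_proj_dim, vocab_size, bias=False)
new_lm_head.weight = embed_tokens.weight
assert new_lm_head.weight.shape == embed_tokens.weight.shape
```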