chenpangpang / transformers · Commits

Commit 24b930ad (Unverified)
Authored Feb 22, 2023 by Younes Belkada; committed by GitHub on Feb 22, 2023
Parent: 5e8c8eb5

[`MBart`] Fix cross attention mask check (#21730)

fix typo

Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/models/mbart/modeling_mbart.py (+1, -1)

@@ -1055,7 +1055,7 @@ class MBartDecoder(MBartPreTrainedModel):
                 if attn_mask.size()[0] != len(self.layers):
                     raise ValueError(
                         f"The `{mask_name}` should be specified for {len(self.layers)} layers, but it is for"
-                        f" {head_mask.size()[0]}."
+                        f" {attn_mask.size()[0]}."
                     )
         for idx, decoder_layer in enumerate(self.layers):
             # add LayerDrop (see https://arxiv.org/abs/1909.11556 for description)
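For context, the hunk sits inside a validation loop in `MBartDecoder` that checks both `head_mask` and `cross_attn_head_mask` against the number of decoder layers (the loop variables `attn_mask`/`mask_name` in the context lines come from that loop). The typo meant the error message always interpolated `head_mask.size()[0]`, so when `cross_attn_head_mask` was the mis-sized one, the message reported the wrong tensor's size, and it crashed with an `AttributeError` whenever `head_mask` was `None`. Below is a minimal standalone sketch of the corrected check, reconstructed from the diff context; `NUM_LAYERS` and `check_masks` are illustrative names, not transformers API.

import torch

NUM_LAYERS = 12  # stand-in for len(self.layers) in MBartDecoder

def check_masks(head_mask, cross_attn_head_mask):
    # Mirrors the validation loop surrounding the fixed line: each per-layer
    # mask, when given, must have one entry per decoder layer.
    for attn_mask, mask_name in zip(
        [head_mask, cross_attn_head_mask], ["head_mask", "cross_attn_head_mask"]
    ):
        if attn_mask is not None and attn_mask.size()[0] != NUM_LAYERS:
            raise ValueError(
                f"The `{mask_name}` should be specified for {NUM_LAYERS} layers, but it is for"
                f" {attn_mask.size()[0]}."  # the fix: previously head_mask.size()[0]
            )

# head_mask is absent and cross_attn_head_mask covers 6 of 12 layers: the
# corrected code raises the intended ValueError reporting 6, whereas the old
# f-string would have hit an AttributeError on None.size() via head_mask.
check_masks(head_mask=None, cross_attn_head_mask=torch.ones(6, 16))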