chenpangpang / transformers · Commits

Unverified commit 612b2a1a, authored Jun 07, 2023 by Joao Gante, committed by GitHub on Jun 07, 2023.
Generate: increase left-padding test atol (#23448)
increase atol
parent f1660d7e
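The test touched by this commit compares a model's next-token logits with and without left padding; for models that accept `position_ids`, the test derives positions from the attention mask so padded slots don't shift the real tokens' positions. A minimal sketch of that mask-to-position-ids step (the batch values and shapes here are illustrative, not taken from the commit):

```python
import torch

# Left-padded batch: 0 marks pad tokens, 1 marks real tokens.
attention_mask = torch.tensor([[0, 0, 1, 1],
                               [0, 1, 1, 1]])

# Cumulative sum over real tokens gives 1-based positions; subtract 1
# so the first real token sits at position 0.
position_ids = attention_mask.cumsum(-1) - 1

# Pad slots get an arbitrary valid index (1), mirroring the test's
# position_ids.masked_fill_(padded_attention_mask == 0, 1).
position_ids.masked_fill_(attention_mask == 0, 1)

print(position_ids)
# tensor([[1, 1, 0, 1],
#         [1, 0, 1, 2]])
```

Because the pad positions are masked out by the attention mask, the index written into them does not affect the real tokens' outputs; any valid index works.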
Showing 1 changed file with 1 addition and 2 deletions:

tests/generation/test_utils.py (+1, −2)
diff --git a/tests/generation/test_utils.py b/tests/generation/test_utils.py
@@ -1608,7 +1608,6 @@ class GenerationTesterMixin:
                 attn_weights = out[attn_name] if attn_name == attention_names[0] else out[attn_name][-1]
                 self.assertEqual(sum([w.sum().item() for w in attn_weights]), 0.0)
 
-    @slow  # TODO (Joao): fix GPTBigCode
     def test_left_padding_compatibility(self):
         # The check done in this test is fairly difficult -- depending on the model architecture, passing the right
         # position index for the position embeddings can still result in a different output, due to numerical masking.
@@ -1648,7 +1647,7 @@ class GenerationTesterMixin:
                 position_ids.masked_fill_(padded_attention_mask == 0, 1)
                 model_kwargs["position_ids"] = position_ids
             next_logits_with_padding = model(**model_kwargs).logits[:, -1, :]
-            if not torch.allclose(next_logits_wo_padding, next_logits_with_padding):
+            if not torch.allclose(next_logits_wo_padding, next_logits_with_padding, atol=1e-7):
                 no_failures = False
                 break
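The one-line fix relaxes the comparison by passing an explicit `atol` to `torch.allclose`, which checks `|a − b| ≤ atol + rtol × |b|` elementwise (defaults: `atol=1e-8`, `rtol=1e-5`). Raising `atol` to `1e-7` tolerates slightly larger absolute differences near zero, which the commit attributes to numerical masking effects rather than real bugs. A small self-contained illustration of the tolerance change:

```python
import torch

# Two logit vectors that differ by 5e-8 in a near-zero entry.
a = torch.tensor([0.0, 1.0])
b = torch.tensor([5e-8, 1.0])

# With the default atol=1e-8 the difference exceeds the tolerance ...
print(torch.allclose(a, b))             # False
# ... but the commit's atol=1e-7 absorbs it.
print(torch.allclose(a, b, atol=1e-7))  # True
```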