chenpangpang / transformers — commit 6739e1d2 (unverified)
Authored May 23, 2024 by Poedator; committed by GitHub on May 23, 2024
test_custom_4d_attention_mask skip with sliding window attn (#30833)
parent 87a35181

Showing 1 changed file with 2 additions and 0 deletions (+2, -0)
tests/test_modeling_common.py (+2, -0)
...
@@ -4407,6 +4407,8 @@ class ModelTesterMixin:
         if not model_class._supports_static_cache:
             self.skipTest(f"{model_class.__name__} is not guaranteed to work with custom 4D attention masks")
         config, _ = self.model_tester.prepare_config_and_inputs_for_common()
+        if getattr(config, "sliding_window", 0) > 0:
+            self.skipTest(f"{model_class.__name__} with sliding window attention is not supported by this test")
         model = model_class(config).to(device=torch_device, dtype=torch.float32)
...
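The two added lines use a defensive `getattr` lookup so that configs without a `sliding_window` attribute fall through to the normal test path, while any config with a positive window size triggers `skipTest`. A minimal sketch of that gating pattern, using a hypothetical `DummyConfig` stand-in (the real test uses a config prepared by `self.model_tester`, not this class):

```python
class DummyConfig:
    """Hypothetical stand-in for a model config object.

    The real test operates on a transformers config returned by
    prepare_config_and_inputs_for_common(); this class only exists
    to illustrate the attribute lookup.
    """

    def __init__(self, sliding_window=None):
        if sliding_window is not None:
            self.sliding_window = sliding_window


def uses_sliding_window(config):
    # Same check as the diff: a missing attribute defaults to 0,
    # so only configs with a positive window size would be skipped.
    return getattr(config, "sliding_window", 0) > 0


print(uses_sliding_window(DummyConfig(sliding_window=4096)))  # True
print(uses_sliding_window(DummyConfig()))                     # False
```

In the test itself, a `True` result leads to `self.skipTest(...)` rather than a return value, so models with sliding window attention are reported as skipped instead of failing on custom 4D attention masks.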