chenpangpang / transformers · Commit deb72cb6 (Unverified)

Skip M4T `test_retain_grad_hidden_states_attentions` (#28060)

* skip test from SpeechInput
* refine description of skip

Authored by Yoach Lacombe on Dec 15, 2023; committed by GitHub on Dec 15, 2023.
Parent: d269c4b2
Changes: 1 changed file, with 5 additions and 3 deletions (+5 −3).

tests/models/seamless_m4t/test_modeling_seamless_m4t.py (view file @ deb72cb6)
@@ -20,7 +20,7 @@ import tempfile
 import unittest
 
 from transformers import SeamlessM4TConfig, is_speech_available, is_torch_available
-from transformers.testing_utils import is_flaky, require_torch, slow, torch_device
+from transformers.testing_utils import require_torch, slow, torch_device
 from transformers.trainer_utils import set_seed
 from transformers.utils import cached_property
 
@@ -610,9 +610,11 @@ class SeamlessM4TModelWithSpeechInputTest(ModelTesterMixin, unittest.TestCase):
             [self.model_tester.num_attention_heads, encoder_seq_length, encoder_key_length],
         )
 
-    @is_flaky()
+    @unittest.skip(
+        reason="In training model, the first speech encoder layer is sometimes skipped. Training is not supported yet, so the test is ignored."
+    )
     def test_retain_grad_hidden_states_attentions(self):
-        super().test_retain_grad_hidden_states_attentions()
+        pass
 
     @require_torch
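The replacement decorator, `unittest.skip`, is part of the Python standard library: the test body is never executed, and the runner records a skip with the given reason instead. A minimal sketch of the pattern the diff applies (the class and second test here are illustrative stand-ins, not the actual transformers test suite):

```python
import unittest


class SpeechEncoderTest(unittest.TestCase):
    """Illustrative stand-in for a model test class; names are hypothetical."""

    @unittest.skip(reason="Training is not supported yet, so the test is ignored.")
    def test_retain_grad_hidden_states_attentions(self):
        # Never executed: the runner records a skip instead of running the body.
        raise AssertionError("unreachable")

    def test_forward_pass(self):
        # A normal test that still runs.
        self.assertEqual(1 + 1, 2)


# Run the suite programmatically and inspect the result: the skipped test
# still counts toward testsRun, but produces no failure.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SpeechEncoderTest)
result = unittest.TestResult()
suite.run(result)
print(f"ran={result.testsRun} skipped={len(result.skipped)} failures={len(result.failures)}")
```

Skipping at the decorator level (rather than calling `self.skipTest()` inside the body) keeps the reason visible right next to the test definition, which is why the commit also replaces the inherited body with a bare `pass`.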
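The removed `@is_flaky()` decorator comes from `transformers.testing_utils` and reruns a failing test a few times before reporting it as failed; the commit drops it because a test that is flaky for an unsupported feature (training) is better skipped outright than retried. A minimal stand-in sketch of that retry pattern, assuming a `max_attempts` parameter (this is not the transformers implementation):

```python
import functools


def is_flaky(max_attempts: int = 5):
    """Rerun a test up to max_attempts times; fail only if every attempt fails.

    Minimal illustrative stand-in for a flaky-test retry decorator.
    """
    def decorator(test_func):
        @functools.wraps(test_func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(max_attempts):
                try:
                    return test_func(*args, **kwargs)
                except AssertionError as exc:
                    last_exc = exc  # remember the failure and retry
            raise last_exc  # every attempt failed: surface the last error
        return wrapper
    return decorator


calls = {"n": 0}


@is_flaky(max_attempts=3)
def flaky_check():
    calls["n"] += 1
    # Fails on the first two attempts, passes on the third.
    assert calls["n"] >= 3


flaky_check()
print(f"attempts={calls['n']}")
```

The trade-off the commit makes: retrying masks nondeterminism (here, the first speech encoder layer sometimes being skipped in training mode), while an explicit `unittest.skip` documents it until training support lands.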