chenpangpang / transformers · Commits · d9daeff2
Unverified commit d9daeff2, authored Jun 14, 2024 by Yoach Lacombe, committed by GitHub on Jun 14, 2024.

Set seed for M4T retain grad test (#31419)
Parent: 43ee5858

Showing 1 changed file with 4 additions and 4 deletions:

tests/models/seamless_m4t/test_modeling_seamless_m4t.py (+4, -4)
@@ -612,11 +612,11 @@ class SeamlessM4TModelWithSpeechInputTest(ModelTesterMixin, unittest.TestCase):
                 [
                     self.model_tester.num_attention_heads,
                     encoder_seq_length,
                     encoder_key_length,
                 ],
             )
 
-    @unittest.skip(
-        reason="In training model, the first speech encoder layer is sometimes skipped. Training is not supported yet, so the test is ignored."
-    )
     def test_retain_grad_hidden_states_attentions(self):
-        pass
+        # When training the model, the first speech encoder layer is sometimes skipped.
+        # Setting the seed to always have the first layer.
+        set_seed(0)
+        super().test_retain_grad_hidden_states_attentions()
 
     @require_torch
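For context on why the patch seeds the RNG instead of skipping the test: a minimal, self-contained sketch (not part of this commit) of how transformers.set_seed makes stochastic layer skipping (LayerDrop) in the speech encoder reproducible. The forward_with_layerdrop helper and its parameters below are illustrative assumptions; only set_seed is the real transformers utility used in the patched test.

# Minimal sketch, not part of this commit: shows why seeding makes the
# retain-grad test deterministic. `forward_with_layerdrop` is a hypothetical
# stand-in for an encoder that randomly skips layers during training (LayerDrop);
# only `set_seed` is the actual transformers helper called in the patched test.
import torch
from transformers import set_seed

def forward_with_layerdrop(num_layers=4, layerdrop=0.5):
    # Return the indices of the layers that were not skipped on this pass.
    kept = []
    for layer_idx in range(num_layers):
        if torch.rand(1).item() >= layerdrop:
            kept.append(layer_idx)
    return kept

set_seed(0)
print(forward_with_layerdrop())  # with this seed the same layers are kept every run
set_seed(0)
print(forward_with_layerdrop())  # identical output: the behaviour no longer flakes

With the seed fixed before calling the parent test, the first speech encoder layer is always executed, so retain_grad() on its hidden states and attentions has gradients to check, which is what the replaced unittest.skip had been working around.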