chenpangpang / transformers
Unverified commit 6a7b9da2, authored Dec 23, 2021 by Henrik Holm; committed by GitHub on Dec 23, 2021
Add 'with torch.no_grad()' to integration test forward pass (#14808)
Parent: d8c09c65
Showing 1 changed file with 2 additions and 1 deletion: tests/test_modeling_albert.py (+2, -1)
tests/test_modeling_albert.py (view file @ 6a7b9da2)

@@ -299,7 +299,8 @@ class AlbertModelIntegrationTest(unittest.TestCase):
         model = AlbertModel.from_pretrained("albert-base-v2")
         input_ids = torch.tensor([[0, 345, 232, 328, 740, 140, 1695, 69, 6078, 1588, 2]])
         attention_mask = torch.tensor([[0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]])
-        output = model(input_ids, attention_mask=attention_mask)[0]
+        with torch.no_grad():
+            output = model(input_ids, attention_mask=attention_mask)[0]
         expected_shape = torch.Size((1, 11, 768))
         self.assertEqual(output.shape, expected_shape)
         expected_slice = torch.tensor(
...
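For reference, the change wraps the inference-only forward pass in torch.no_grad() so autograd does not record operations or allocate gradient buffers during the integration test. Below is a minimal standalone sketch of the same idiom, using the checkpoint and inputs shown in the hunk above; the shape check mirrors the test, while the expected_slice values are omitted because they are not visible in the diff.

import torch
from transformers import AlbertModel

# Load the same pretrained checkpoint the integration test uses.
model = AlbertModel.from_pretrained("albert-base-v2")

input_ids = torch.tensor([[0, 345, 232, 328, 740, 140, 1695, 69, 6078, 1588, 2]])
attention_mask = torch.tensor([[0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]])

# torch.no_grad() disables gradient tracking for everything inside the block,
# which reduces memory use for a forward pass that will never call backward().
with torch.no_grad():
    output = model(input_ids, attention_mask=attention_mask)[0]

# The hidden-state output has shape (batch_size, sequence_length, hidden_size).
assert output.shape == torch.Size((1, 11, 768))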