chenpangpang / transformers / Commits / 619200cc
"docs/source/ms/index.md" did not exist on "49ab16239c74ccfca2298472868caadb1d2c3878"
Unverified commit 619200cc, authored May 06, 2021 by Stas Bekman, committed by GitHub

[cuda ext tests] fixing tests (#11619)

* fixing tests
* cleanup

parent 44c5621d
Showing 3 changed files with 8 additions and 5 deletions (+8 -5)
.github/workflows/self-scheduled.yml   +2 -0
tests/deepspeed/test_deepspeed.py      +4 -3
tests/extended/test_trainer_ext.py     +2 -2
.github/workflows/self-scheduled.yml

@@ -261,6 +261,7 @@ jobs:
       - name: Install dependencies
         run: |
+          apt -y update && apt install -y libaio-dev
           pip install --upgrade pip
           pip install .[testing,deepspeed]
@@ -301,6 +302,7 @@ jobs:
       - name: Install dependencies
         run: |
+          apt -y update && apt install -y libaio-dev
           pip install --upgrade pip
           pip install .[testing,deepspeed,fairscale]
tests/deepspeed/test_deepspeed.py
@@ -318,9 +318,10 @@ class TrainerIntegrationDeepSpeed(TestCasePlus, TrainerIntegrationCommon):
         yes_grad_accum_b = yes_grad_accum_trainer.model.b.item()
         self.assertNotEqual(yes_grad_accum_a, a)

-        # training with half the batch size but accumulation steps as 2 should give the same weights
-        self.assertEqual(no_grad_accum_a, yes_grad_accum_a)
-        self.assertEqual(no_grad_accum_b, yes_grad_accum_b)
+        # training with half the batch size but accumulation steps as 2 should give the same
+        # weights, but sometimes get a slight difference still of 1e-6
+        self.assertAlmostEqual(no_grad_accum_a, yes_grad_accum_a, places=5)
+        self.assertAlmostEqual(no_grad_accum_b, yes_grad_accum_b, places=5)

         # see the note above how to get identical loss on a small bs
         self.assertAlmostEqual(no_grad_accum_loss, yes_grad_accum_loss, places=5)
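For context on the tolerance this hunk introduces: `assertAlmostEqual(a, b, places=5)` passes when the difference, rounded to 5 decimal places, is zero, so the 1e-6 drift mentioned in the new comment is tolerated while genuinely different weights still fail. A minimal standalone sketch (not part of the commit):

```python
import unittest


class TolerantComparison(unittest.TestCase):
    # assertAlmostEqual(a, b, places=5) passes iff round(a - b, 5) == 0,
    # so a 1e-6 drift (as in the diff's comment) is within tolerance.
    def test_tiny_drift_passes(self):
        self.assertAlmostEqual(1.0, 1.0 + 1e-6, places=5)

    def test_real_difference_still_fails(self):
        with self.assertRaises(AssertionError):
            self.assertAlmostEqual(1.0, 1.0 + 1e-4, places=5)


if __name__ == "__main__":
    unittest.main()
```

This is why `places=5` is a reasonable choice here: it absorbs accumulation-order float noise without masking a real divergence between the two training runs.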
tests/extended/test_trainer_ext.py
@@ -167,8 +167,8 @@ class TestTrainerExt(TestCasePlus):
         # test if do_predict saves generations and metrics
         contents = os.listdir(output_dir)
         contents = {os.path.basename(p) for p in contents}
-        assert "test_generations.txt" in contents
-        assert "test_results.json" in contents
+        assert "generated_predictions.txt" in contents
+        assert "predict_results.json" in contents

     def run_trainer(
         self,
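A standalone sketch of the membership check these updated assertions perform, using a hypothetical temporary directory in place of the trainer's real `output_dir` (the file names mirror the diff; everything else is a stand-in):

```python
import os
import tempfile

# Pattern from the test above: list the output directory, reduce entries to
# basenames, then check for the expected prediction artifacts by name.
with tempfile.TemporaryDirectory() as output_dir:
    # Simulate a do_predict run having written its artifacts.
    for name in ("generated_predictions.txt", "predict_results.json"):
        open(os.path.join(output_dir, name), "w").close()

    contents = {os.path.basename(p) for p in os.listdir(output_dir)}
    assert "generated_predictions.txt" in contents
    assert "predict_results.json" in contents
```

Note that `os.listdir` already returns bare names, so the `os.path.basename` pass is a defensive no-op here; it keeps the check correct if the listing is ever switched to full paths.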