OpenDAS / fairscale / Commits / 4a63034e
Unverified commit 4a63034e, authored Jun 25, 2021 by Mehdi Mirzazadeh, committed via GitHub Jun 25, 2021.
checking number parameters in distributed pipeline test (#728)
parent: bcd4748d
1 changed file with 6 additions and 2 deletions:
tests/experimental/nn/test_multiprocess_pipe.py (+6, -2)
--- a/tests/experimental/nn/test_multiprocess_pipe.py
+++ b/tests/experimental/nn/test_multiprocess_pipe.py
@@ -243,7 +243,9 @@ def multi_input_multi_output_layers(devices):
     pipe = DistributedPipeline(graph, chunks=4)
     assert [[0, 1], [2], [3], [4]] == extract_partitions(graph, pipe)
-    opt = DistributedOptimizer(torch.optim.SGD, pipe.parameter_rrefs(), lr=0.05,)
+    parameter_rrefs = pipe.parameter_rrefs()
+    assert len(parameter_rrefs) == 6
+    opt = DistributedOptimizer(torch.optim.SGD, parameter_rrefs, lr=0.05,)
     losses = []
     for i in range(2):
         with dist_autograd.context() as context_id:
@@ -293,7 +295,9 @@ def auto_graph_extract(devices):
     pipe = DistributedPipeline(graph, chunks=4)
     partitions = extract_partitions(graph, pipe)
     assert [[0, 1], [2], [3], [4]] == partitions, f"partitions={partitions}"
-    opt = DistributedOptimizer(torch.optim.SGD, pipe.parameter_rrefs(), lr=0.05,)
+    parameter_rrefs = pipe.parameter_rrefs()
+    assert len(parameter_rrefs) == 6
+    opt = DistributedOptimizer(torch.optim.SGD, parameter_rrefs, lr=0.05,)
     losses = []
     for i in range(2):
         with dist_autograd.context() as context_id:
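Both hunks make the same change: instead of passing `pipe.parameter_rrefs()` straight to `DistributedOptimizer`, the result is first bound to a variable and its length asserted, so the test also verifies that the pipeline exposes the expected number of parameter handles (6) across all partitions. A minimal sketch of that counting idea, using a plain-Python stand-in (the `FakeLayer`/`FakePipeline` classes and the parameter breakdown below are illustrative assumptions, not fairscale's implementation):

```python
# Hypothetical stand-in for DistributedPipeline.parameter_rrefs():
# collect one handle per parameter across every partition, so a test
# can assert the total count, as this commit does.
from dataclasses import dataclass
from typing import List


@dataclass
class FakeLayer:
    params: List[str]  # parameter names, e.g. ["weight", "bias"]


@dataclass
class FakePipeline:
    partitions: List[List[FakeLayer]]

    def parameter_rrefs(self) -> List[str]:
        # One entry per parameter, walking partitions and layers in order.
        return [
            f"{i}.{j}.{name}"
            for i, part in enumerate(self.partitions)
            for j, layer in enumerate(part)
            for name in layer.params
        ]


# An assumed 6-parameter layout (the real tests' breakdown is not shown here).
pipe = FakePipeline(
    partitions=[
        [FakeLayer(["weight", "bias"]), FakeLayer(["weight", "bias"])],
        [FakeLayer(["weight"])],
        [FakeLayer(["bias"])],
    ]
)
parameter_rrefs = pipe.parameter_rrefs()
assert len(parameter_rrefs) == 6  # mirrors the assertion added in this commit
```

Checking the count before constructing the optimizer catches partitioning bugs that silently drop or duplicate parameters, which the downstream loss assertions alone might not surface.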