OpenDAS / ColossalAI · Commits

Commit 44178041 (unverified), authored Jul 25, 2022 by HELSON, committed by GitHub on Jul 25, 2022

[unit test] add megatron init test in zero_optim (#1358)
Parent: 7a065dc9

Showing 1 changed file with 3 additions and 2 deletions:

tests/test_tensor/test_zero_optim.py (+3, -2)
The two changed hunks of tests/test_tensor/test_zero_optim.py (final state after the commit; elided context shown as `...`):

```python
@@ -18,6 +18,7 @@ from colossalai.testing import parameterize
from colossalai.amp import convert_to_apex_amp
from colossalai.gemini.gemini_mgr import GeminiManager
from colossalai.tensor import ColoTensorSpec, ShardSpec, ComputePattern, ComputeSpec, ProcessGroup, ColoTensor
from tests.test_tensor.model.test_gpt2 import init_megatron_spec


def check_param_equal(model, torch_model, pg: ProcessGroup):
    ...

@@ -127,10 +128,10 @@ def run_dist(rank, world_size, port):
    config = {}
    colossalai.launch(config=config, rank=rank, world_size=world_size, host='localhost', port=port, backend='nccl')
    if world_size == 4:
        run_gpt(tp_init_spec_func=init_1d_col_spec)
        run_gpt(tp_init_spec_func=init_1d_row_spec)
        run_gpt(tp_init_spec_func=init_megatron_spec)
    else:
        run_gpt(tp_init_spec_func=init_1d_col_spec)
        run_gpt(tp_init_spec_func=init_1d_row_spec)


@pytest.mark.dist
...
```
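The pattern this commit extends, passing a tensor-init strategy into the test body as a function argument and gating extra strategies on the rank count, can be sketched in isolation. This is a minimal illustration, not ColossalAI code: all names and the string "specs" are hypothetical stand-ins, and no distributed launch is performed.

```python
# Hypothetical stand-ins for the init-spec functions: each tags parameters
# with a sharding strategy name instead of building real ColoTensor specs.
def init_1d_col_spec(params):
    return {name: "1d_col" for name in params}

def init_1d_row_spec(params):
    return {name: "1d_row" for name in params}

def init_megatron_spec(params):
    # Stand-in for the Megatron-style init imported by this commit.
    return {name: "megatron" for name in params}

def run_gpt(tp_init_spec_func=None):
    # The real test builds a GPT-2 model; here we just apply the strategy
    # to a fixed parameter list and return the resulting mapping.
    params = ["wte", "attn.qkv", "mlp.fc1"]
    return tp_init_spec_func(params) if tp_init_spec_func else {}

def run_dist(world_size):
    # Mirrors the dispatch in the diff: the megatron variant only runs
    # when four ranks are available.
    results = [run_gpt(tp_init_spec_func=init_1d_col_spec),
               run_gpt(tp_init_spec_func=init_1d_row_spec)]
    if world_size == 4:
        results.append(run_gpt(tp_init_spec_func=init_megatron_spec))
    return results
```

Keeping the strategy as a plain callable parameter is what lets a single test body cover column-sharded, row-sharded, and Megatron-style initialization without duplication.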