OpenDAS / ColossalAI · Commit e37f3db4 (unverified)
Authored Nov 30, 2022 by HELSON; committed by GitHub on Nov 30, 2022

[gemini] add arguments (#2046)

* [zero] fix testing parameters
* [gemini] add arguments
* add docstrings

Parent: 6a9158f1
Showing 1 changed file with 16 additions and 2 deletions.

colossalai/nn/parallel/gemini_parallel.py (+16 −2)
+from typing import Optional
+
 import torch

 from colossalai.gemini.chunk import init_chunk_manager
...
@@ -14,7 +16,9 @@ class GeminiDDP(ZeroDDP):
                  placement_policy: str = "cpu",
                  pin_memory: bool = False,
                  force_outputs_fp32: bool = False,
-                 search_range_mb: int = 32) -> None:
+                 search_range_mb: int = 32,
+                 hidden_dim: Optional[int] = None,
+                 min_chunk_size_mb: Optional[float] = None) -> None:
         """
         A torch.nn.Module wrapper using ZeRO-DP and Gemini.
         ZeRO is for parallelism. Gemini is for memory management.
...
@@ -34,7 +38,17 @@ class GeminiDDP(ZeroDDP):
         pin_memory (bool, optional): use pin memory on CPU. Defaults to False.
         force_outputs_fp32 (bool, optional): force outputs are fp32. Defaults to False.
         search_range_mb (int, optional): chunk size searching range in MegaByte. Defaults to 32.
+        hidden_dim (int, optional): the hidden dimension of the DNN.
+            Users can provide this argument to speed up searching.
+            If users do not know this argument before training, it is OK; a default value of 1024 will be used.
+        min_chunk_size_mb (float, optional): the minimum chunk size in MegaByte.
+            If the aggregate size of parameters is still smaller than the minimum chunk size,
+            all parameters will be compacted into one small chunk.
         """
-        chunk_manager = init_chunk_manager(model=module, init_device=device, search_range_mb=search_range_mb)
+        chunk_manager = init_chunk_manager(model=module,
+                                           init_device=device,
+                                           hidden_dim=hidden_dim,
+                                           search_range_mb=search_range_mb,
+                                           min_chunk_size_mb=min_chunk_size_mb)
         gemini_manager = GeminiManager(placement_policy, chunk_manager, module)
         super().__init__(module, gemini_manager, pin_memory, force_outputs_fp32)
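The new min_chunk_size_mb argument effectively puts a floor under the chunk size that the search inside init_chunk_manager may settle on. Below is a minimal sketch of that rule, written only from the docstring added in this commit; pick_chunk_size and its byte-based parameters are hypothetical helpers for illustration, not ColossalAI API.

from typing import Optional

MB = 1024 * 1024

def pick_chunk_size(total_param_bytes: int,
                    searched_chunk_bytes: int,
                    min_chunk_size_mb: Optional[float] = None) -> int:
    # Apply the min_chunk_size_mb floor described in the docstring above.
    if min_chunk_size_mb is None:
        return searched_chunk_bytes
    min_bytes = int(min_chunk_size_mb * MB)
    if total_param_bytes < min_bytes:
        # All parameters together are smaller than the minimum chunk size,
        # so they are compacted into one small chunk.
        return total_param_bytes
    return max(searched_chunk_bytes, min_bytes)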
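For context, a usage sketch of the extended constructor. It assumes GeminiDDP is importable from the changed file's module path, that the distributed environment is already initialized (e.g. via colossalai.launch), and that device is a keyword parameter of __init__ (inferred from the init_device=device call in the hunk above).

import torch
from colossalai.nn.parallel.gemini_parallel import GeminiDDP

# Any torch.nn.Module can be wrapped; Transformer is only an illustration.
module = torch.nn.Transformer(d_model=1024)

model = GeminiDDP(module,
                  device=torch.device("cuda"),
                  placement_policy="cpu",     # keep parameters on CPU until needed
                  pin_memory=True,
                  hidden_dim=1024,            # matches d_model; speeds up the chunk-size search
                  min_chunk_size_mb=32.0)     # floor for the searched chunk size

Passing hidden_dim is optional: per the new docstring, the search falls back to a default of 1024 when it is not given, so supplying the model's true hidden size mainly saves search time.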