OpenDAS / ColossalAI · Commit 1b491ad7

[doc] update docstring in ProcessGroup (#1468)

Unverified commit 1b491ad7, authored Aug 19, 2022 by Jiarui Fang and committed via GitHub on Aug 19, 2022. Parent: b73fb7a0.
Showing 1 changed file with 3 additions and 3 deletions.

colossalai/tensor/process_group.py (+3, -3):
@@ -31,7 +31,7 @@ PYTORCHPGDICT_ = PyTorchProcessGroupDict()

 class ProcessGroup:
     """ProcessGroup
-    Process Group contains group partition for Tensor Parallel and Data Parallel.
+    Process Group indicates how processes are organized in groups for parallel execution using Tensor Parallelism and Data Parallelism.

     NOTE, the ProcessGroup must be used after `torch.distributed.initialize()`

@@ -40,8 +40,8 @@ class ProcessGroup:
         rank: the global rank of the current process.
         ranks: List[int], a list of rank id belongings to this process group.
         backend: str, the backend of the process group.
-        tp_degree: Optional[int], tensor parallelism degree, default None means 1
-        dp_degree: Optional[int], data parallelism degree, default None means len(ranks)
+        tp_degree: Optional[int], tensor parallelism degree. How many processes are inside a tp process group. default None means 1.
+        dp_degree: Optional[int], data parallelism degree. How many processes are inside a dp process group. default None means len(ranks).
     """

     def __init__(self,
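For context, here is a minimal usage sketch (not part of this commit) illustrating the arguments described in the updated docstring. The import path follows the file location shown above; the 8-rank world size, the degree values, and the exact keyword-argument form of the constructor are assumptions made for illustration only.

# Hypothetical usage sketch of ProcessGroup based on the docstring above;
# values are illustrative, not taken from the commit.
import torch.distributed as dist

from colossalai.tensor.process_group import ProcessGroup

# Per the docstring's NOTE, torch.distributed must already be initialized
# (e.g. via torch.distributed.init_process_group inside a launcher) before
# a ProcessGroup is created.
assert dist.is_initialized()

world_ranks = list(range(8))      # assume 8 global ranks for this sketch
pg = ProcessGroup(
    rank=dist.get_rank(),         # the global rank of the current process
    ranks=world_ranks,            # rank ids belonging to this process group
    tp_degree=2,                  # 2 processes inside each tp process group
    dp_degree=4,                  # 4 processes inside each dp group (8 = 2 x 4)
)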