OpenDAS / ColossalAI · Commit d35bd7d0

[shardformer] fix type hint

Authored Jul 05, 2023 by ver217; committed Aug 15, 2023 by Hongxin Liu.
Parent: 1ed3f8a2
Showing 1 changed file with 2 additions and 2 deletions:

colossalai/shardformer/shard/shard_config.py (+2, -2)
colossalai/shardformer/shard/shard_config.py

@@ -15,8 +15,8 @@ class ShardConfig:
     The config for sharding the huggingface model

     Args:
-        tensor_parallel_process_group (int): The process group for tensor parallelism, defaults to None, which is the global process group.
-        pipeline_stage_manager (PipelineStageManager): The pipeline stage manager, defaults to None, which means no pipeline.
+        tensor_parallel_process_group (Optional[ProcessGroup]): The process group for tensor parallelism, defaults to None, which is the global process group.
+        pipeline_stage_manager (Optional[PipelineStageManager]): The pipeline stage manager, defaults to None, which means no pipeline.
         enable_tensor_parallelism (bool): Whether to turn on tensor parallelism, default is True.
         enable_fused_normalization (bool): Whether to use fused layernorm, default is False.
         enable_all_optimization (bool): Whether to turn on all optimization, default is False.
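The corrected docstring hints can be mirrored in a minimal, self-contained sketch. This is not the real `ShardConfig` (which has more fields and depends on torch); `ProcessGroup` and `PipelineStageManager` are stub stand-ins here so the example runs without ColossalAI or torch installed. It shows why `Optional[ProcessGroup]` is the accurate hint: both fields default to `None`, which the old `int` annotation contradicted.

```python
from dataclasses import dataclass
from typing import Optional


class ProcessGroup:
    """Stub standing in for torch.distributed.ProcessGroup (assumed; the real class lives in torch)."""


class PipelineStageManager:
    """Stub standing in for ColossalAI's pipeline stage manager."""


@dataclass
class ShardConfig:
    """Sketch mirroring the corrected hints: both handles are Optional and default to None."""
    tensor_parallel_process_group: Optional[ProcessGroup] = None
    pipeline_stage_manager: Optional[PipelineStageManager] = None
    enable_tensor_parallelism: bool = True
    enable_fused_normalization: bool = False
    enable_all_optimization: bool = False


# Passing nothing uses the documented defaults: None means "use the global
# process group" and "no pipeline". With Optional[...] hints this type-checks;
# the old `int` hint would have flagged both None and a real ProcessGroup.
cfg = ShardConfig()
assert cfg.tensor_parallel_process_group is None
assert cfg.pipeline_stage_manager is None
assert cfg.enable_tensor_parallelism is True
```

A static checker such as mypy is the practical beneficiary of this two-line change: with the old `int` hint, every caller constructing the config with a real process group (or the default `None`) would be reported as a type error.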