OpenDAS / ColossalAI — Commits

Commit 73a4144b — [shardformer] fix embedding
Authored Aug 15, 2023 by ver217; committed by Hongxin Liu, Aug 15, 2023
Parent: 92230226
Showing 1 changed file with 3 additions and 0 deletions.

colossalai/shardformer/layer/embedding.py (+3, -0)
@@ -214,6 +214,9 @@ class VocabParallelEmbedding1D(ParallelModule):
         self.vocab_start_index = tensor_parallel_rank * self.num_embeddings_per_partition
         self.vocab_end_index = self.vocab_start_index + self.num_embeddings_per_partition
         # padding index
         self.padding_idx = self._select_padding_idx(padding_idx)
+        # offset the seed with randomizer index and rank
+        seed = torch.random.initial_seed()
+        self.randomizer = create_randomizer_with_offset(seed, process_group=self.process_group)
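The three added lines give each `VocabParallelEmbedding1D` instance a randomizer derived from the process's initial seed, offset by rank, so every tensor-parallel rank draws a distinct but reproducible random stream instead of sharing one. A minimal sketch of that seeding idea, using Python's stdlib `random` (the helper name `rank_offset_rng` is hypothetical; ColossalAI's `create_randomizer_with_offset` additionally tracks a per-randomizer index and the process group):

```python
import random

def rank_offset_rng(base_seed: int, rank: int) -> random.Random:
    # Hypothetical illustration: derive a per-rank generator by offsetting
    # a shared base seed with the rank. Each rank gets its own stream,
    # and re-seeding with the same (base_seed, rank) reproduces it exactly.
    return random.Random(base_seed + rank)

# Two ranks sharing the same base seed see different random values,
# which is what the shardformer randomizer relies on for sharded layers.
g0 = rank_offset_rng(1234, rank=0)
g1 = rank_offset_rng(1234, rank=1)
```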