OpenDAS / ColossalAI · Commits

Commit f155ae89
[shardformer] ChatGLM support layernorm sharding
Authored Jul 17, 2023 by klhhhhh; committed by Hongxin Liu on Aug 15, 2023
Parent: 00f6ef15
Showing 1 changed file with 1 addition and 1 deletion:
tests/kit/model_zoo/transformers/chatglm2_6b/modeling_chatglm.py (+1, -1)
tests/kit/model_zoo/transformers/chatglm2_6b/modeling_chatglm.py

@@ -417,7 +417,7 @@ class SelfAttention(torch.nn.Module):
         )
-        self.dense = nn.Linear(self.projection_size, self.hidden_size, bias=config.add_bias_linear,
+        self.dense = nn.Linear(self.projection_size, config.hidden_size, bias=config.add_bias_linear,
                                device=device, **_config_to_kwargs(config))
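Why this one-character-class change matters for sharding can be sketched without torch: under tensor parallelism, attributes derived from the attention projection size are divided per rank, while the config value stays global, so the output dimension of the dense projection must be read from the config. The class and function names below are hypothetical illustrations, not ColossalAI APIs; the field names mirror the ChatGLM2 config.

```python
# Minimal sketch (hypothetical names): under tensor parallelism the per-rank
# projection_size shrinks with tp_size, but config.hidden_size stays global.
# Taking the dense layer's output dimension from the config (as in the patched
# line) keeps its shape correct on every rank.

class ChatGLMConfig:
    def __init__(self, hidden_size, num_attention_heads, kv_channels):
        self.hidden_size = hidden_size
        self.num_attention_heads = num_attention_heads
        self.kv_channels = kv_channels

def dense_shape(config, tp_size):
    # Per-rank projection size after sharding attention heads across tp_size ranks.
    projection_size = config.kv_channels * config.num_attention_heads // tp_size
    # Output dimension read from the unsharded config, as in the patched line.
    return (projection_size, config.hidden_size)

cfg = ChatGLMConfig(hidden_size=4096, num_attention_heads=32, kv_channels=128)
print(dense_shape(cfg, tp_size=1))  # (4096, 4096)
print(dense_shape(cfg, tp_size=2))  # (2048, 4096): input dim sharded, output dim global
```

Had the output dimension been `self.hidden_size` and that attribute been rebound to a per-rank value during sharding, the dense layer's output would shrink with `tp_size` instead of staying at the model's hidden size.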