OpenDAS / Megatron-LM · Commits

Commit 420eec74, authored Apr 03, 2023 by Jared Casper

Addressing comments.

Parent: 2cc3dac7
Showing 1 changed file with 2 additions and 2 deletions.
megatron/model/transformer.py (+2, -2)

@@ -140,7 +140,7 @@ class ParallelMLP(MegatronModule):
             assert self.activation_func == F.gelu
             intermediate_parallel = bias_gelu_impl(intermediate_parallel, bias_parallel)
         else:
-            if self.add_bias:
+            if bias_parallel is not None:
                 intermediate_parallel = intermediate_parallel + bias_parallel
             intermediate_parallel = self.activation_func(intermediate_parallel)

@@ -674,7 +674,7 @@ class ParallelTransformerLayer(MegatronModule):
             attention_type=AttnType.self_attn,
             attn_mask_type=self_attn_mask_type)
         self.hidden_dropout = args.hidden_dropout
-        self.bias_dropout_fusion = args.bias_dropout_fusion and args.add_bias_linear
+        self.bias_dropout_fusion = args.bias_dropout_fusion
         self.drop_path = DropPath(drop_path_rate) if drop_path_rate > 0.0 else None

         # Layernorm on the attention output
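For context, the first hunk changes the guard on the unfused-activation path from a module-level flag to a check on the bias tensor that the parallel linear layer actually returned. Below is a minimal sketch of that pattern outside Megatron-LM, assuming a linear layer that may return its bias separately (bias_parallel is None when no separate bias is produced); apply_activation is a hypothetical helper name introduced here for illustration, not part of the source:

import torch
import torch.nn.functional as F

def apply_activation(intermediate_parallel, bias_parallel, activation_func=F.gelu):
    # Unfused path: add the bias only if the linear layer returned one,
    # rather than keying off a module-level add-bias flag.
    if bias_parallel is not None:
        intermediate_parallel = intermediate_parallel + bias_parallel
    return activation_func(intermediate_parallel)

x = torch.randn(4, 8)
print(apply_activation(x, None).shape)             # no separate bias returned
print(apply_activation(x, torch.zeros(8)).shape)   # bias returned alongside the output

Checking the tensor itself keeps the branch correct whether or not a separate bias is produced, which is the behavior the changed line relies on.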