OpenDAS / Megatron-LM

Commit accbe59f, authored Nov 25, 2024 by wxj
Update transformer.py
Parent: 8ebbb6e3
Pipeline #1962 passed
Showing 1 changed file with 3 additions and 3 deletions.
megatron/legacy/model/transformer.py  (+3, -3)
@@ -1229,9 +1229,9 @@ class ParallelTransformerLayer(MegatronModule):
         # hidden_states: [s, b, h]

         # Layer norm at the beginning of the transformer layer.
-        from unsloth.kernels.rms_layernorm import fast_rms_layernorm
-        norm_output = self.input_norm(hidden_states) if not args.use_fast_rms_layernorm else fast_rms_layernorm(self.input_norm, hidden_states)
-        # norm_output = self.input_norm(hidden_states)
+        # from unsloth.kernels.rms_layernorm import fast_rms_layernorm
+        # norm_output = self.input_norm(hidden_states) if not args.use_fast_rms_layernorm else fast_rms_layernorm(self.input_norm, hidden_states)
+        norm_output = self.input_norm(hidden_states)

         # Self attention.
         attention_output, attention_bias = \
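For context, the lines being commented out follow a common dispatch pattern: route the pre-attention normalization through Unsloth's fused fast_rms_layernorm kernel when a flag is set, and fall back to the regular norm module call otherwise. Below is a minimal standalone sketch of that pattern, assuming only PyTorch; the use_fast_rms_layernorm flag and the SimpleRMSNorm module are illustrative stand-ins rather than Megatron-LM code, and the fast_rms_layernorm(norm_module, hidden_states) call simply mirrors the usage visible in the diff.

    # Sketch only: mirrors the dispatch pattern shown in the diff, not Megatron-LM itself.
    import torch
    import torch.nn as nn


    class SimpleRMSNorm(nn.Module):
        """Plain RMSNorm fallback (hypothetical stand-in for self.input_norm)."""

        def __init__(self, hidden_size, eps=1e-5):
            super().__init__()
            self.weight = nn.Parameter(torch.ones(hidden_size))
            self.eps = eps

        def forward(self, x):
            # Normalize by the root-mean-square over the hidden dimension, then scale.
            variance = x.float().pow(2).mean(-1, keepdim=True)
            return (x.float() * torch.rsqrt(variance + self.eps)).type_as(x) * self.weight


    def apply_input_norm(norm_module, hidden_states, use_fast_rms_layernorm=False):
        """Use Unsloth's fused kernel when requested, else the plain module call.

        The fused branch matches the call shape in the diff
        (fast_rms_layernorm(self.input_norm, hidden_states)); actually taking it
        requires unsloth and a CUDA device to be available.
        """
        if use_fast_rms_layernorm:
            from unsloth.kernels.rms_layernorm import fast_rms_layernorm
            return fast_rms_layernorm(norm_module, hidden_states)
        return norm_module(hidden_states)


    if __name__ == "__main__":
        norm = SimpleRMSNorm(hidden_size=16)
        x = torch.randn(4, 2, 16)  # [s, b, h], matching the diff's shape comment
        out = apply_input_norm(norm, x, use_fast_rms_layernorm=False)
        print(out.shape)

The commit effectively reverts to the fallback branch of this pattern: the unconditional norm_output = self.input_norm(hidden_states) call is restored and the Unsloth import and conditional dispatch are left commented out.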