OpenDAS / Megatron-LM · Commits

Commit 031a4157, "Update transformer.py"
Authored Jan 17, 2025 by wxj
Parent: a45f4c2a
Pipeline #2227 passed
Showing 1 changed file with 2 additions and 2 deletions.
megatron/legacy/model/transformer.py

...
@@ -165,7 +165,7 @@ class ParallelMLP(MegatronModule):
             is_expert=is_expert,
         )

-    @torch.compile(mode="max-autotune-no-cudagraphs")
+    # @torch.compile(mode="max-autotune-no-cudagraphs")
     def forward(self, hidden_states):
         # [s, b, 4hp]
...
@@ -1213,7 +1213,7 @@ class ParallelTransformerLayer(MegatronModule):
         return retriever_output, norm_input, norm_output

-    @torch.compile(mode="max-autotune-no-cudagraphs")
+    # @torch.compile(mode="max-autotune-no-cudagraphs")
     def forward(self, hidden_states, attention_mask,
                 encoder_output=None, enc_dec_attn_mask=None,
                 retriever_input=None, ...
...
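Both hunks toggle the same `@torch.compile(mode="max-autotune-no-cudagraphs")` decorator by commenting it in or out. A common alternative to editing source lines for each experiment is a small conditional-decorator helper that applies `torch.compile` only when a flag is set. The sketch below is not part of Megatron-LM; the helper name `maybe_compile` and the flag are assumptions for illustration, and `torch` is imported lazily so the off path runs without it.

```python
def maybe_compile(enabled=False, mode="max-autotune-no-cudagraphs"):
    """Return a decorator that compiles `fn` with torch.compile when
    `enabled` is True, and leaves it untouched otherwise.

    Hypothetical helper for illustration; not part of Megatron-LM.
    """
    if not enabled:
        # No-op decorator: return the function unchanged.
        return lambda fn: fn
    import torch  # lazy import: only needed when compilation is on
    return lambda fn: torch.compile(fn, mode=mode)


@maybe_compile(enabled=False)  # flip to True to opt in to compilation
def forward(hidden_states):
    # stand-in for the real forward pass
    return hidden_states
```

With `enabled=False` the decorated function behaves exactly like the undecorated one, so the switch can live in a config flag instead of a code comment.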