OpenDAS / Megatron-LM · Commits

Commit c5369391, authored Jan 16, 2025 by wxj

Update transformer.py: disable torch.compile for the whole model

Parent: 473449d8
Pipeline #2221 failed in 0 seconds
Showing 1 changed file with 2 additions and 2 deletions.
megatron/legacy/model/transformer.py (view file @ c5369391)
...
@@ -165,7 +165,7 @@ class ParallelMLP(MegatronModule):
             is_expert=is_expert,
         )

-    @torch.compile(mode="max-autotune-no-cudagraphs")
+    # @torch.compile(mode="max-autotune-no-cudagraphs")
     def forward(self, hidden_states):

         # [s, b, 4hp]
...
@@ -1213,7 +1213,7 @@ class ParallelTransformerLayer(MegatronModule):
         return retriever_output, norm_input, norm_output

-    @torch.compile(mode="max-autotune-no-cudagraphs")
+    # @torch.compile(mode="max-autotune-no-cudagraphs")
     def forward(self, hidden_states, attention_mask,
                 encoder_output=None, enc_dec_attn_mask=None,
                 retriever_input=None,
...
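
For context on what the change does: torch.compile, used as a decorator, compiles the decorated forward method with TorchDynamo/Inductor on its first call; commenting the decorator out at both sites (ParallelMLP.forward and ParallelTransformerLayer.forward) makes those methods run eagerly again, which disables compilation for the whole model as the commit message states. Below is a minimal, self-contained sketch of the pattern; TinyMLP and its layer sizes are made up for illustration and are not part of Megatron-LM.

# Minimal sketch: the decorator below mirrors the one this commit comments out.
# With the decorator, forward() is compiled on its first call; without it, it runs eagerly.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyMLP(nn.Module):  # hypothetical stand-in for ParallelMLP
    def __init__(self, hidden_size: int = 16):
        super().__init__()
        self.dense_h_to_4h = nn.Linear(hidden_size, 4 * hidden_size)
        self.dense_4h_to_h = nn.Linear(4 * hidden_size, hidden_size)

    # Comment out the next line, as the commit does, to fall back to eager execution.
    @torch.compile(mode="max-autotune-no-cudagraphs")
    def forward(self, hidden_states):
        return self.dense_4h_to_h(F.gelu(self.dense_h_to_4h(hidden_states)))


if __name__ == "__main__":
    mlp = TinyMLP()
    out = mlp(torch.randn(2, 16))  # first call triggers compilation when decorated
    print(out.shape)               # torch.Size([2, 16])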