OpenDAS / TransformerEngine · Commits · bd0873af

Commit bd0873af (unverified), authored Dec 13, 2023 by Marks101, committed by GitHub on Dec 13, 2023

[PyTorch] fix attn_mask_type for inter_attention (#565)

Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>

Parent: acd811aa
Changes: 1 changed file, 0 additions and 1 deletion

transformer_engine/pytorch/transformer.py (+0, -1)
transformer_engine/pytorch/transformer.py @ bd0873af

@@ -619,7 +619,6 @@ class TransformerLayer(torch.nn.Module):
             inter_attention_outputs = self.inter_attention(
                 hidden_states,
                 attention_mask=enc_dec_attn_mask,
-                attn_mask_type=self_attn_mask_type,
                 encoder_output=encoder_output,
                 is_first_microbatch=is_first_microbatch,
                 checkpoint_core_attention=checkpoint_core_attention,
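The deleted line had forwarded the decoder's own `self_attn_mask_type` into the cross-attention (`inter_attention`) call. A plausible reading of the fix (an assumption, not stated in the commit itself): a self-attention mask type such as "causal" only makes sense when queries and keys index the same sequence, so passing it to cross-attention, where queries come from the decoder and keys from the encoder, produces a wrong mask; removing the argument lets `inter_attention` use its own configured mask type. A minimal standalone sketch of that distinction, using hypothetical helper names (`causal_mask`, `padding_mask`) that are not part of TransformerEngine:

```python
# Hedged illustration, not TransformerEngine code. True means "masked out".

def causal_mask(q_len, kv_len):
    # Self-attention mask: position q may only attend to keys k <= q.
    # Only meaningful when queries and keys index the same sequence.
    return [[k > q for k in range(kv_len)] for q in range(q_len)]

def padding_mask(q_len, n_valid, kv_len):
    # Cross-attention mask: hide encoder padding (positions >= n_valid),
    # independent of the decoder query index.
    return [[k >= n_valid for k in range(kv_len)] for _ in range(q_len)]

# Decoder of length 2 attending to an encoder output of length 4
# (3 valid tokens, 1 padding token):
wrong = causal_mask(2, 4)       # first decoder token is cut off from
                                # encoder positions 1..3 for no reason
right = padding_mask(2, 3, 4)   # every decoder token sees all valid
                                # encoder positions, padding is hidden
```

Here `wrong[0]` blocks most of the encoder sequence from the very first decoder query, which is why reusing the self-attention mask type for cross-attention would corrupt encoder-decoder attention.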