OpenDAS / Uni-Core · Commit af4f9088

Authored Aug 12, 2022 by Guolin Ke
Parent: 49c9895b

    support softmax in non-inplace cases

Showing 2 changed files, with 5 additions and 2 deletions:

- unicore/modules/multihead_attention.py (+1, −1)
- unicore/modules/softmax_dropout.py (+4, −1)
unicore/modules/multihead_attention.py

```diff
@@ -99,7 +99,7 @@ class SelfMultiheadAttention(nn.Module):
         else:
             attn_weights += attn_bias
-        attn = softmax_dropout(attn_weights, self.dropout, self.training)
+        attn = softmax_dropout(attn_weights, self.dropout, self.training, inplace=False)
         o = torch.bmm(attn, v)
```
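With `inplace=False`, `attn_weights` still holds the raw pre-softmax scores after the call instead of being overwritten with the dropout-masked probabilities. A minimal NumPy sketch of that contract (a hypothetical stand-in for the real CUDA-backed `softmax_dropout`; dropout and the optional `mask`/`bias` arguments are omitted):

```python
import numpy as np

def softmax_sketch(x, inplace=True):
    # Hypothetical NumPy stand-in for unicore's softmax_dropout,
    # illustrating only the inplace contract added by this commit.
    if not inplace:
        x = x.copy()                      # mirrors input.clone()
    x -= x.max(axis=-1, keepdims=True)    # numerically stable softmax
    np.exp(x, out=x)
    x /= x.sum(axis=-1, keepdims=True)
    return x

logits = np.array([[1.0, 2.0, 3.0]])
probs = softmax_sketch(logits, inplace=False)
# probs rows sum to 1; logits still holds the raw scores
```

With `inplace=True`, the same call would mutate `logits` in place and return the same buffer.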
unicore/modules/softmax_dropout.py

```diff
@@ -81,7 +81,7 @@ def _check_bias(bias, input):
             prev_non_one = bias.shape[i] != 1
 
-def softmax_dropout(input, dropout_prob, is_training=True, mask=None, bias=None):
+def softmax_dropout(input, dropout_prob, is_training=True, mask=None, bias=None, inplace=True):
     """softmax dropout, and mask, bias are optional.
     Args:
         input (torch.Tensor): input tensor
@@ -103,6 +103,9 @@ def softmax_dropout(input, dropout_prob, is_training=True, mask=None, bias=None)
         _check_bias(bias, input)
         bias = bias.contiguous().view(-1, input_size[-2], input_size[-1])
     input = input.view(-1, input_size[-2], input_size[-1])
+    if not inplace:
+        # copy a input for non-inplace case
+        input = input.clone()
     if dropout_prob <= 0.0 or input_size[-1] <= 1024:
         return SoftmaxDropoutFast.apply(is_training, input, mask, bias, dropout_prob
```
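The `clone()` matters because `Tensor.view` returns a tensor that shares storage with the original, so an in-place kernel writing into the reshaped tensor would also clobber the caller's input. A NumPy sketch of that aliasing (`reshape` likewise returns a view for contiguous arrays; the shapes here are made up for illustration):

```python
import numpy as np

# hypothetical (batch, heads, seq, seq) attention scores
x = np.zeros((2, 2, 4, 4))

flat = x.reshape(-1, 4, 4)         # like input.view(-1, ...): shares memory with x
flat[0, 0, 0] = 1.0
assert x[0, 0, 0, 0] == 1.0        # writing through the view mutated x

safe = x.reshape(-1, 4, 4).copy()  # like input.clone(): an independent buffer
safe[0, 0, 1] = 2.0
assert x[0, 0, 0, 1] == 0.0        # the original stays untouched
```

After the clone, the fast kernel can scribble on `safe` freely without the caller ever observing the change, which is exactly what the non-inplace path needs.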