chenpangpang / transformers · Commits · 7751be7c
"test/git@developer.sourcefind.cn:gaoqiong/migraphx.git" did not exist on "4315a991502585bf17c6635b18c9bb514dea5199"
Unverified commit 7751be7c, authored May 11, 2020 by theblackcat102; committed by GitHub on May 11, 2020.
fix reformer apex scaling issue (#4242)
Parent: ac7d5f67
Changes: 1 changed file with 3 additions and 3 deletions.

src/transformers/modeling_reformer.py (+3, -3)
@@ -562,8 +562,8 @@ class LSHSelfAttention(nn.Module, EfficientAttentionMixin):
         # get correct mask values depending on precision
         if query_key_dots.dtype == torch.float16:
-            self_mask_value = self.self_mask_value_float16
-            mask_value = self.mask_value_float16
+            self_mask_value = self.self_mask_value_float16.half()
+            mask_value = self.mask_value_float16.half()
         else:
             self_mask_value = self.self_mask_value_float32
             mask_value = self.mask_value_float32
@@ -834,7 +834,7 @@ class LocalSelfAttention(nn.Module, EfficientAttentionMixin):
         if mask is not None:
             # get mask tensor depending on half precision or not
             if query_key_dots.dtype == torch.float16:
-                mask_value = self.mask_value_float16
+                mask_value = self.mask_value_float16.half()
             else:
                 mask_value = self.mask_value_float32
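Both hunks apply the same pattern: the fill value used for masking must be in the same dtype as the attention scores, and under apex mixed precision the stored fp16 mask value can end up as float32, so the commit re-casts it with `.half()`. A minimal standalone sketch of that pattern (the function and tensor names mirror the diff, but this is an illustration, not the actual model code):

```python
import torch

def apply_mask(query_key_dots, mask, mask_value_float16, mask_value_float32):
    # Mirror of the commit's fix: pick the fill value that matches the
    # scores' precision, and re-cast the fp16 value with .half() in case
    # apex has upcast the stored buffer to float32.
    if query_key_dots.dtype == torch.float16:
        mask_value = mask_value_float16.half()
    else:
        mask_value = mask_value_float32
    # Keep scores where mask is True, fill with mask_value elsewhere;
    # torch.where needs both branches in a compatible dtype.
    return torch.where(mask, query_key_dots, mask_value)

scores = torch.zeros(2, 3, dtype=torch.float16)
mask = torch.tensor([[True, True, False]]).expand(2, 3)
out = apply_mask(scores, mask, torch.tensor(-1e4), torch.tensor(-1e9))
print(out.dtype)  # torch.float16
```

Without the `.half()` cast, a float32 fill value flowing into fp16 attention scores can either error or silently promote the result to float32, which is what broke apex scaling here.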