ModelZoo / ResNet50_tensorflow · Commits · c1ac2bfc

Commit c1ac2bfc, authored Nov 18, 2019 by A. Unique TensorFlower

Internal change

PiperOrigin-RevId: 281086898

Parent: 5a3b762c
Changes: 1 changed file with 5 additions and 3 deletions (+5, -3).
official/transformer/model/model_utils.py (+5, -3)
@@ -89,7 +89,7 @@ def get_padding(x, padding_value=0, dtype=tf.float32):
   Args:
     x: int tensor with any shape
-    padding_value: int value that
+    padding_value: int which represents padded values in input
     dtype: The dtype of the return value.

   Returns:
@@ -100,7 +100,7 @@ def get_padding(x, padding_value=0, dtype=tf.float32):
   return tf.cast(tf.equal(x, padding_value), dtype)


-def get_padding_bias(x):
+def get_padding_bias(x, padding_value=0, dtype=tf.float32):
   """Calculate bias tensor from padding values in tensor.

   Bias tensor that is added to the pre-softmax multi-headed attention logits,
@@ -109,12 +109,14 @@ def get_padding_bias(x):
   Args:
     x: int tensor with shape [batch_size, length]
+    padding_value: int which represents padded values in input
+    dtype: The dtype of the return value

   Returns:
     Attention bias tensor of shape [batch_size, 1, 1, length].
   """
   with tf.name_scope("attention_bias"):
-    padding = get_padding(x)
+    padding = get_padding(x, padding_value, dtype)
     attention_bias = padding * _NEG_INF_FP32
     attention_bias = tf.expand_dims(
         tf.expand_dims(attention_bias, axis=1), axis=1)
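
For context, a minimal usage sketch of the updated functions (not part of the commit). It assumes the tensorflow/models repository is on PYTHONPATH so that official.transformer.model.model_utils is importable, and a TensorFlow 2.x-style eager setup; the token ids, padding value, and shapes are illustrative only.

# Usage sketch (illustration only, not part of this commit).
import tensorflow as tf

from official.transformer.model import model_utils

# Toy batch of token ids; here -1 marks padding instead of the default 0,
# which is what the new padding_value argument makes configurable.
x = tf.constant([[7, 4, -1, -1],
                 [3, -1, -1, -1]])

padding = model_utils.get_padding(x, padding_value=-1, dtype=tf.float32)
# 1.0 at padding positions, 0.0 elsewhere; same shape as x.

bias = model_utils.get_padding_bias(x, padding_value=-1, dtype=tf.float32)
# Shape [batch_size, 1, 1, length]; padding positions hold a large negative
# value so the attention softmax effectively ignores them.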