ModelZoo / ResNet50_tensorflow · Commits

Commit a6bbd3fa, authored Mar 21, 2021 by Le Hou, committed by A. Unique TensorFlower on Mar 21, 2021

Minor readability improvements.

PiperOrigin-RevId: 364219111

Parent: 1e9cbdce
Showing 1 changed file with 5 additions and 2 deletions.

official/nlp/modeling/layers/transformer_scaffold.py  (+5, -2)
official/nlp/modeling/layers/transformer_scaffold.py
@@ -112,8 +112,9 @@ class TransformerScaffold(tf.keras.layers.Layer):
     self._bias_constraint = tf.keras.constraints.get(bias_constraint)
 
   def build(self, input_shape):
-    input_tensor = input_shape[0] if len(input_shape) == 2 else input_shape
-    input_tensor_shape = tf.TensorShape(input_tensor)
+    input_tensor_shape = input_shape[0] if (
+        len(input_shape) == 2) else input_shape
+    input_tensor_shape = tf.TensorShape(input_tensor_shape)
     if len(input_tensor_shape.as_list()) != 3:
       raise ValueError("TransformerScaffold expects a three-dimensional input of "
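The rewritten conditional reads more easily but behaves the same: build() may receive either a single shape or a two-element list of shapes (for example, when the layer is called on a [data, attention_mask] pair). A minimal sketch of what the new lines do; normalize_input_shape is a hypothetical helper, not part of the commit.

# Hypothetical helper illustrating the rewritten lines; not from the commit.
import tensorflow as tf

def normalize_input_shape(input_shape):
  # A two-element input_shape means the layer was called with
  # [data, attention_mask]; keep only the data tensor's shape.
  input_tensor_shape = input_shape[0] if (
      len(input_shape) == 2) else input_shape
  # Coerce tuples, lists, or TensorShapes into a tf.TensorShape.
  return tf.TensorShape(input_tensor_shape)

print(normalize_input_shape([(2, 8, 16), (2, 8, 8)]))  # (2, 8, 16)
print(normalize_input_shape((2, 8, 16)))               # (2, 8, 16)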
@@ -170,6 +171,8 @@ class TransformerScaffold(tf.keras.layers.Layer):
     else:
       self._feedforward_block = None
 
+    # self._dropout_rate controls dropout rates at two places:
+    # after attention, and after FFN.
     self._attention_dropout = tf.keras.layers.Dropout(rate=self._dropout_rate)
     # Use float32 in layernorm for numeric stability.
     # It is probably safe in mixed_float16, but we haven't validated this yet.
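The two added comment lines document that a single constructor argument drives dropout at both sites in the layer. A hedged sketch of that pattern; TinyScaffold and its attribute names are assumptions standing in for the real layer, not the commit's code.

# Illustrative only; TinyScaffold is a made-up stand-in for the real layer.
import tensorflow as tf

class TinyScaffold(tf.keras.layers.Layer):

  def __init__(self, dropout_rate=0.1, **kwargs):
    super().__init__(**kwargs)
    self._dropout_rate = dropout_rate
    # One rate, two dropout sites: after attention and after the FFN,
    # matching the comment added in this commit.
    self._attention_dropout = tf.keras.layers.Dropout(rate=self._dropout_rate)
    self._output_dropout = tf.keras.layers.Dropout(rate=self._dropout_rate)

  def call(self, inputs, training=False):
    # Stand-in blocks: identity in place of attention/FFN, with dropout
    # applied at the two places the comment describes.
    x = self._attention_dropout(inputs, training=training)
    return self._output_dropout(x, training=training)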