ModelZoo / ResNet50_tensorflow · Commits

Commit 002b4ec4, authored Dec 03, 2021 by Yuexin Wu, committed by A. Unique TensorFlower on Dec 03, 2021

Fix activation hinting.

PiperOrigin-RevId: 414032298

parent a39f18f9
Showing 1 changed file with 3 additions and 1 deletion.

official/nlp/modeling/networks/bert_encoder.py (+3, -1), view file @ 002b4ec4
@@ -23,6 +23,8 @@ from official.nlp.modeling import layers
 _Initializer = Union[str, tf.keras.initializers.Initializer]
+_Activation = Union[str, Callable[..., Any]]
+
 _approx_gelu = lambda x: tf.keras.activations.gelu(x, approximate=True)
@@ -83,7 +85,7 @@ class BertEncoderV2(tf.keras.layers.Layer):
       max_sequence_length: int = 512,
       type_vocab_size: int = 16,
       inner_dim: int = 3072,
-      inner_activation: Callable[..., Any] = _approx_gelu,
+      inner_activation: _Activation = _approx_gelu,
       output_dropout: float = 0.1,
       attention_dropout: float = 0.1,
       initializer: _Initializer = tf.keras.initializers.TruncatedNormal(
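In effect, the commit widens the annotation on BertEncoderV2's inner_activation parameter from Callable[..., Any] to the new _Activation alias (Union[str, Callable[..., Any]]), so a Keras activation name given as a string satisfies the hint as well as a callable. Below is a minimal sketch of what the updated hint permits, assuming the tf-models package layout at this revision; the parameter values are illustrative and not part of the commit.

# A sketch only: assumes tensorflow and the tf-models package are installed.
import tensorflow as tf
from official.nlp.modeling.networks import bert_encoder

# Passing a callable, as the old Callable[..., Any] hint already described.
encoder_gelu = bert_encoder.BertEncoderV2(
    vocab_size=30522,
    inner_activation=lambda x: tf.keras.activations.gelu(x, approximate=True))

# Passing an activation name; the widened alias documents that a plain string
# is also acceptable to type checkers.
encoder_relu = bert_encoder.BertEncoderV2(
    vocab_size=30522,
    inner_activation="relu")

Note that the commit itself only touches the type annotation; whether a string resolves to an activation at runtime depends on the downstream Keras layers that consume inner_activation.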