Project: ModelZoo / ResNet50_tensorflow

Commit 5a1bce51, authored Aug 27, 2020 by Zhenyu Tan; committed by A. Unique TensorFlower on Aug 27, 2020.

Replace Transformer in seq2seq_transformer.

PiperOrigin-RevId: 328798553
Parent: dd0126f9

Changes: 1 changed file, with 7 additions and 6 deletions.

official/nlp/modeling/models/seq2seq_transformer.py (+7, -6)
@@ -20,6 +20,7 @@ import math
 import tensorflow as tf
 from official.modeling import tf_utils
+from official.nlp.keras_nlp.layers import transformer_encoder_block
 from official.nlp.modeling import layers
 from official.nlp.modeling.ops import beam_search
 from official.nlp.transformer import metrics
@@ -471,16 +472,16 @@ class TransformerEncoder(tf.keras.layers.Layer):
     self.encoder_layers = []
     for i in range(self.num_layers):
       self.encoder_layers.append(
-          layers.Transformer(
+          transformer_encoder_block.TransformerEncoderBlock(
               num_attention_heads=self.num_attention_heads,
-              intermediate_size=self._intermediate_size,
-              intermediate_activation=self._activation,
-              dropout_rate=self._dropout_rate,
-              attention_dropout_rate=self._attention_dropout_rate,
+              inner_dim=self._intermediate_size,
+              inner_activation=self._activation,
+              output_dropout=self._dropout_rate,
+              attention_dropout=self._attention_dropout_rate,
               use_bias=self._use_bias,
               norm_first=self._norm_first,
               norm_epsilon=self._norm_epsilon,
-              intermediate_dropout=self._intermediate_dropout,
+              inner_dropout=self._intermediate_dropout,
               attention_initializer=attention_initializer(input_shape[2]),
               name=("layer_%d" % i)))
     self.output_normalization = tf.keras.layers.LayerNormalization(
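For readers applying the same migration to their own code, this commit's keyword renames reduce to a fixed mapping: `intermediate_size` becomes `inner_dim`, `intermediate_activation` becomes `inner_activation`, `dropout_rate` becomes `output_dropout`, `attention_dropout_rate` becomes `attention_dropout`, and `intermediate_dropout` becomes `inner_dropout`. The helper below is a hypothetical illustration (it is not part of the repo) that translates old `layers.Transformer` keyword arguments to the names `TransformerEncoderBlock` expects:

```python
# Hypothetical helper, not part of the repo: translates keyword arguments
# from the old layers.Transformer names to the names expected by
# transformer_encoder_block.TransformerEncoderBlock, per this commit.
_ARG_RENAMES = {
    "intermediate_size": "inner_dim",
    "intermediate_activation": "inner_activation",
    "dropout_rate": "output_dropout",
    "attention_dropout_rate": "attention_dropout",
    "intermediate_dropout": "inner_dropout",
}

def to_encoder_block_kwargs(old_kwargs):
    """Return a copy of old_kwargs with renamed keys; shared names pass through."""
    return {_ARG_RENAMES.get(key, key): value for key, value in old_kwargs.items()}
```

Arguments such as `num_attention_heads`, `use_bias`, `norm_first`, and `norm_epsilon` keep the same names in both layers, so the helper passes them through unchanged.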