ModelZoo / ResNet50_tensorflow / Commits / 593dde9a

Commit 593dde9a, authored Sep 18, 2020 by Zhenyu Tan, committed by A. Unique TensorFlower on Sep 18, 2020.

Internal Cleanup.

PiperOrigin-RevId: 332557462
parent d4a6670a
Changes: 1 changed file (official/nlp/keras_nlp/README.md), with 2 additions and 2 deletions (+2 −2).
```diff
@@ -19,7 +19,7 @@ assemble new layers, networks, or models.
 *   [SelfAttentionMask](layers/self_attention_mask.py) creates a 3D attention
     mask from a 2D tensor mask.
-*   [`MaskedLM`](layers/masked_lm.py) implements a masked language model. It
+*   [MaskedLM](layers/masked_lm.py) implements a masked language model. It
     assumes the embedding table variable is passed to it.
```
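The hunk above describes `SelfAttentionMask` as turning a 2D tensor mask into a 3D attention mask. A rough NumPy sketch of that idea (not the library's TensorFlow implementation): broadcast a `[batch, seq_len]` padding mask across the query axis so every query position sees which key positions are valid.

```python
import numpy as np

def self_attention_mask(mask_2d):
    """Expand a [batch, seq_len] padding mask to [batch, seq_len, seq_len].

    Each query position may attend to exactly the key positions that are
    not padding, so the key-side mask is repeated along the query axis.
    Conceptual sketch only; the real layer operates on TF tensors.
    """
    mask_2d = np.asarray(mask_2d, dtype=np.float32)
    batch, seq_len = mask_2d.shape
    # Broadcast the key-side mask across a new query dimension.
    return np.broadcast_to(mask_2d[:, None, :], (batch, seq_len, seq_len)).copy()

# Two sequences of length 4; the second has one padded position.
mask = self_attention_mask([[1, 1, 1, 1],
                            [1, 1, 1, 0]])
print(mask.shape)   # (2, 4, 4)
print(mask[1, 0])   # [1. 1. 1. 0.] -- the padded key is masked for every query
```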
```diff
@@ -30,7 +30,7 @@ sub-units of models that would not be trained alone. It encapsulates common
 network structures like a classification head or a transformer encoder into an
 easily handled object with a standardized configuration.
-*   [`BertEncoder`](encoders/bert_encoder.py) implements a bi-directional
+*   [BertEncoder](encoders/bert_encoder.py) implements a bi-directional
     Transformer-based encoder as described in
     ["BERT: Pre-training of Deep Bidirectional Transformers for Language
     Understanding"](https://arxiv.org/abs/1810.04805). It includes the embedding
```
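The README text in these hunks notes that `MaskedLM` "assumes the embedding table variable is passed to it", i.e. the output projection is tied to the encoder's input embedding table. A hedged NumPy sketch of that weight-tying idea (dimensions and variable names are illustrative, not the library code):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden, num_masked = 1000, 8, 3  # toy sizes for the sketch

# The same embedding table the encoder uses for input token lookup.
embedding_table = rng.normal(size=(vocab, hidden)).astype(np.float32)

# Encoder outputs gathered at the masked positions: [num_masked, hidden].
masked_hidden = rng.normal(size=(num_masked, hidden)).astype(np.float32)

# Tie the output projection to the embedding table: scoring each hidden
# vector against every row of the table yields [num_masked, vocab] logits.
logits = masked_hidden @ embedding_table.T
print(logits.shape)  # (3, 1000)
```

Tying the projection this way avoids a second `vocab × hidden` weight matrix, which is why the layer takes the table as an argument rather than creating its own.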