ModelZoo / ResNet50_tensorflow

Commit 7e47cd7b, authored Jul 16, 2020 by Hongkun Yu, committed by A. Unique TensorFlower on Jul 16, 2020

Adds clear documentation: Functional/Subclass API used for each network/model.

PiperOrigin-RevId: 321591514
Parent: 982f457a

Showing 11 changed files with 31 additions and 1 deletion (+31 -1)
official/nlp/modeling/models/bert_classifier.py               +3 -0
official/nlp/modeling/models/bert_pretrainer.py               +3 -0
official/nlp/modeling/models/bert_span_labeler.py             +4 -1
official/nlp/modeling/models/bert_token_classifier.py         +3 -0
official/nlp/modeling/models/electra_pretrainer.py            +3 -0
official/nlp/modeling/networks/albert_transformer_encoder.py  +2 -0
official/nlp/modeling/networks/classification.py              +3 -0
official/nlp/modeling/networks/encoder_scaffold.py            +3 -0
official/nlp/modeling/networks/span_labeling.py               +2 -0
official/nlp/modeling/networks/token_classification.py        +2 -0
official/nlp/modeling/networks/transformer_encoder.py         +3 -0
official/nlp/modeling/models/bert_classifier.py  View file @ 7e47cd7b
...
@@ -37,6 +37,9 @@ class BertClassifier(tf.keras.Model):
instantiates a classification network based on the passed `num_classes`
argument. If `num_classes` is set to 1, a regression network is instantiated.
+*Note* that the model is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
network: A transformer network. This network should output a sequence output
and a classification output. Furthermore, it should expose its embedding
...
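Since `BertClassifier` is built with the Functional API, a user composes it by passing an encoder network into the constructor and then treats the result as an ordinary `tf.keras.Model`. The sketch below is a minimal illustration assuming the `official.nlp.modeling` package layout at this revision; the tiny hyperparameters are illustrative only, and the argument names should be checked against the actual modules.

```python
import tensorflow as tf
from official.nlp.modeling import models, networks

# A deliberately tiny encoder; real configs use BERT-sized values.
encoder = networks.TransformerEncoder(vocab_size=30522, num_layers=2)

# BertClassifier wires the encoder into a classification head with the
# Functional API, so the result is a plain tf.keras.Model.
classifier = models.BertClassifier(network=encoder, num_classes=3)

# Assumption: the classification head emits logits; adjust the loss if the
# head is configured to output probabilities instead.
classifier.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
classifier.summary()
```

From here the classifier can be trained, saved, or inspected like any other Keras model.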
official/nlp/modeling/models/bert_pretrainer.py  View file @ 7e47cd7b
...
@@ -41,6 +41,9 @@ class BertPretrainer(tf.keras.Model):
instantiates the masked language model and classification networks that are
used to create the training objectives.
+*Note* that the model is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
network: A transformer network. This network should output a sequence output
and a classification output.
...
official/nlp/modeling/models/bert_span_labeler.py  View file @ 7e47cd7b
...
@@ -32,9 +32,12 @@ class BertSpanLabeler(tf.keras.Model):
encoder as described in "BERT: Pre-training of Deep Bidirectional Transformers
for Language Understanding" (https://arxiv.org/abs/1810.04805).
-The BertSpanLabeler allows a user to pass in a transformer stack, and
+The BertSpanLabeler allows a user to pass in a transformer encoder, and
instantiates a span labeling network based on a single dense layer.
+*Note* that the model is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
network: A transformer network. This network should output a sequence output
and a classification output. Furthermore, it should expose its embedding
...
official/nlp/modeling/models/bert_token_classifier.py  View file @ 7e47cd7b
...
@@ -36,6 +36,9 @@ class BertTokenClassifier(tf.keras.Model):
instantiates a token classification network based on the passed `num_classes`
argument.
+*Note* that the model is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
network: A transformer network. This network should output a sequence output
and a classification output. Furthermore, it should expose its embedding
...
official/nlp/modeling/models/electra_pretrainer.py  View file @ 7e47cd7b
...
@@ -39,6 +39,9 @@ class ElectraPretrainer(tf.keras.Model):
model (at generator side) and classification networks (at discriminator side)
that are used to create the training objectives.
+*Note* that the model is constructed by Keras Subclass API, where layers are
+defined inside __init__ and call() implements the computation.
Arguments:
generator_network: A transformer network for generator, this network should
output a sequence output and an optional classification output.
...
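The Subclass-API note on `ElectraPretrainer` describes a construction style rather than anything specific to ELECTRA: layers are created in `__init__` and the computation is written imperatively in `call()`. The following generic sketch (not the actual `ElectraPretrainer` code) shows the pattern.

```python
import tensorflow as tf

class TinySubclassModel(tf.keras.Model):
  """Generic Subclass-API pattern: layers in __init__, computation in call()."""

  def __init__(self, hidden_size=64, num_classes=2, **kwargs):
    super().__init__(**kwargs)
    self.dense = tf.keras.layers.Dense(hidden_size, activation="relu")
    self.classifier = tf.keras.layers.Dense(num_classes)

  def call(self, inputs):
    # The forward pass is defined imperatively, so the graph of layers is not
    # known until the model is called on data.
    return self.classifier(self.dense(inputs))

model = TinySubclassModel()
logits = model(tf.random.uniform((4, 16)))  # weights are built on first call
```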
official/nlp/modeling/networks/albert_transformer_encoder.py  View file @ 7e47cd7b
...
@@ -40,6 +40,8 @@ class AlbertTransformerEncoder(tf.keras.Model):
The default values for this object are taken from the ALBERT-Base
implementation described in the paper.
+*Note* that the network is constructed by Keras Functional API.
Arguments:
vocab_size: The size of the token vocabulary.
embedding_width: The width of the word embeddings. If the embedding width is
...
official/nlp/modeling/networks/classification.py  View file @ 7e47cd7b
...
@@ -29,6 +29,9 @@ class Classification(tf.keras.Model):
This network implements a simple classifier head based on a dense layer. If
num_classes is one, it can be considered as a regression problem.
+*Note* that the network is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
input_width: The innermost dimension of the input tensor to this network.
num_classes: The number of classes that this network should classify to. If
...
official/nlp/modeling/networks/encoder_scaffold.py  View file @ 7e47cd7b
...
@@ -49,6 +49,9 @@ class EncoderScaffold(tf.keras.Model):
If the hidden_cls is not overridden, a default transformer layer will be
instantiated.
+*Note* that the network is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
pooled_output_dim: The dimension of pooled output.
pooler_layer_initializer: The initializer for the classification
...
official/nlp/modeling/networks/span_labeling.py  View file @ 7e47cd7b
...
@@ -27,6 +27,8 @@ class SpanLabeling(tf.keras.Model):
"""Span labeling network head for BERT modeling.
This network implements a simple single-span labeler based on a dense layer.
+*Note* that the network is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
input_width: The innermost dimension of the input tensor to this network.
...
official/nlp/modeling/networks/token_classification.py  View file @ 7e47cd7b
...
@@ -27,6 +27,8 @@ class TokenClassification(tf.keras.Model):
"""TokenClassification network head for BERT modeling.
This network implements a simple token classifier head based on a dense layer.
+*Note* that the network is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
input_width: The innermost dimension of the input tensor to this network.
...
official/nlp/modeling/networks/transformer_encoder.py  View file @ 7e47cd7b
...
@@ -39,6 +39,9 @@ class TransformerEncoder(tf.keras.Model):
in "BERT: Pre-training of Deep Bidirectional Transformers for Language
Understanding".
+*Note* that the network is constructed by
+[Keras Functional API](https://keras.io/guides/functional_api/).
Arguments:
vocab_size: The size of the token vocabulary.
hidden_size: The size of the transformer hidden layers.
...
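For contrast with the Subclass-API note on `ElectraPretrainer`, the Functional-API construction that the network notes refer to declares symbolic inputs first and derives the model from an explicit graph of layers. The sketch below is a generic illustration of that pattern, not the real `TransformerEncoder` implementation.

```python
import tensorflow as tf

# Generic Functional-API pattern: declare symbolic inputs, connect layers,
# then create the model from (inputs, outputs).
word_ids = tf.keras.Input(shape=(None,), dtype=tf.int32, name="input_word_ids")
mask = tf.keras.Input(shape=(None,), dtype=tf.int32, name="input_mask")

embeddings = tf.keras.layers.Embedding(30522, 128)(word_ids)
masked = embeddings * tf.cast(tf.expand_dims(mask, -1), tf.float32)
sequence_output = tf.keras.layers.Dense(128, activation="relu")(masked)

network = tf.keras.Model(inputs=[word_ids, mask], outputs=sequence_output)
network.summary()  # the layer graph is known up front, unlike the Subclass API
```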