"docs/vscode:/vscode.git/clone" did not exist on "2067862d1b874704ff5e88e65c515a7ff062f85e"
Commit e8140fa9 authored by Chen Chen, committed by A. Unique TensorFlower

Remove output_activation because it is not used anywhere.

PiperOrigin-RevId: 294813724
parent 35aa1f31
@@ -33,9 +33,9 @@ class BertPretrainer(tf.keras.Model):
   encoder as described in "BERT: Pre-training of Deep Bidirectional Transformers
   for Language Understanding" (https://arxiv.org/abs/1810.04805).
-  The BertTrainer allows a user to pass in a transformer stack, and instantiates
-  the masked language model and classification networks that are used to create
-  the training objectives.
+  The BertPretrainer allows a user to pass in a transformer stack, and
+  instantiates the masked language model and classification networks that are
+  used to create the training objectives.
   Attributes:
     network: A transformer network. This network should output a sequence output
@@ -56,7 +56,6 @@ class BertPretrainer(tf.keras.Model):
                num_classes,
                num_token_predictions,
                activation=None,
-               output_activation=None,
                initializer='glorot_uniform',
                output='logits',
                **kwargs):
@@ -66,7 +65,6 @@ class BertPretrainer(tf.keras.Model):
         'num_classes': num_classes,
         'num_token_predictions': num_token_predictions,
         'activation': activation,
-        'output_activation': output_activation,
         'initializer': initializer,
         'output': output,
       }
......
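The second hunk removes the argument from the constructor signature, and the third removes it from the config dict the model keeps for serialization; both must change together. A minimal sketch of that pattern, using a hypothetical class (`TinyPretrainer`) rather than the real `BertPretrainer`, and assuming the usual Keras-style `get_config()`/`from_config()` round-trip:

```python
# Hypothetical stand-in for the diffed class: a model that records its
# constructor arguments in a config dict for serialization.
class TinyPretrainer:
    def __init__(self, num_classes, num_token_predictions,
                 activation=None, initializer='glorot_uniform',
                 output='logits'):
        # Every accepted argument is mirrored here. A parameter dropped from
        # the signature (like output_activation in this commit) must also be
        # dropped from this dict, or from_config() would pass an unknown
        # keyword argument back into __init__.
        self._config = {
            'num_classes': num_classes,
            'num_token_predictions': num_token_predictions,
            'activation': activation,
            'initializer': initializer,
            'output': output,
        }

    def get_config(self):
        # Return a copy so callers cannot mutate internal state.
        return dict(self._config)

    @classmethod
    def from_config(cls, config):
        # Rebuild the model from a previously saved config.
        return cls(**config)
```

Keeping the signature and the config dict in sync is what makes the round-trip safe after the removal.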