Commit e8140fa9 authored by Chen Chen, committed by A. Unique TensorFlower

Remove output_activation because it is not used anywhere.

PiperOrigin-RevId: 294813724
parent 35aa1f31
@@ -33,9 +33,9 @@ class BertPretrainer(tf.keras.Model):
   encoder as described in "BERT: Pre-training of Deep Bidirectional Transformers
   for Language Understanding" (https://arxiv.org/abs/1810.04805).
 
-  The BertTrainer allows a user to pass in a transformer stack, and instantiates
-  the masked language model and classification networks that are used to create
-  the training objectives.
+  The BertPretrainer allows a user to pass in a transformer stack, and
+  instantiates the masked language model and classification networks that are
+  used to create the training objectives.
 
   Attributes:
     network: A transformer network. This network should output a sequence output
@@ -56,7 +56,6 @@ class BertPretrainer(tf.keras.Model):
                num_classes,
                num_token_predictions,
                activation=None,
-               output_activation=None,
                initializer='glorot_uniform',
                output='logits',
                **kwargs):
@@ -66,7 +65,6 @@ class BertPretrainer(tf.keras.Model):
         'num_classes': num_classes,
         'num_token_predictions': num_token_predictions,
         'activation': activation,
-        'output_activation': output_activation,
         'initializer': initializer,
         'output': output,
     }
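For context, a minimal sketch of how the class is constructed after this change. The import paths, the encoder configuration, and all argument values below are illustrative assumptions, not part of this commit:

# Assumed module paths; the exact location of BertPretrainer in the
# repository may differ.
from official.nlp.modeling import networks
from official.nlp.modeling.models import BertPretrainer

# Assumed encoder setup; any network emitting sequence and pooled
# outputs should work here.
encoder = networks.TransformerEncoder(vocab_size=30522, num_layers=2)

# After this commit the signature no longer includes output_activation;
# the remaining arguments are unchanged.
pretrainer = BertPretrainer(
    network=encoder,
    num_classes=2,                # e.g. a next-sentence-prediction head
    num_token_predictions=20,     # masked positions predicted per sequence
    activation=None,
    initializer='glorot_uniform',
    output='logits')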