"vscode:/vscode.git/clone" did not exist on "c789d64b88825be06d889c87ed8d3c2c70921f3c"
Commit a8ac58ea authored by Hongkun Yu, committed by A. Unique TensorFlower

Update comments; users should use BertPretrainerV2 in the future.

PiperOrigin-RevId: 344194437
parent d2aac5d7
@@ -28,11 +28,9 @@ from official.nlp.modeling import networks
 @tf.keras.utils.register_keras_serializable(package='Text')
 class BertPretrainer(tf.keras.Model):
-  """BERT network training model.
+  """BERT pretraining model.
 
   This is an implementation of the network structure surrounding a transformer
   encoder as described in "BERT: Pre-training of Deep Bidirectional Transformers
   for Language Understanding" (https://arxiv.org/abs/1810.04805).
 
+  [Note] Please use the new BertPretrainerV2 for your projects.
 
   The BertPretrainer allows a user to pass in a transformer stack, and
   instantiates the masked language model and classification networks that are
@@ -159,7 +157,6 @@ class BertPretrainer(tf.keras.Model):
     return cls(**config)
 
-# TODO(hongkuny): Migrate to BertPretrainerV2 for all usages.
 @tf.keras.utils.register_keras_serializable(package='Text')
 @gin.configurable
 class BertPretrainerV2(tf.keras.Model):
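The `@tf.keras.utils.register_keras_serializable` decorator seen on both pretrainer classes, together with the `from_config` classmethod in the second hunk, is what lets Keras round-trip these models through `get_config()`/`from_config()`. A minimal sketch of that pattern, using a hypothetical `TinyClassifier` model (not part of the repo) in place of the BERT pretrainers:

```python
import tensorflow as tf


# Hypothetical stand-in for BertPretrainer/BertPretrainerV2: registering the
# class under a package name allows Keras to locate it when deserializing.
@tf.keras.utils.register_keras_serializable(package='Demo')
class TinyClassifier(tf.keras.Model):

  def __init__(self, num_classes=2, **kwargs):
    super().__init__(**kwargs)
    self.num_classes = num_classes
    self.dense = tf.keras.layers.Dense(num_classes)

  def call(self, inputs):
    return self.dense(inputs)

  def get_config(self):
    # Only constructor arguments go in the config dict.
    return {'num_classes': self.num_classes}

  @classmethod
  def from_config(cls, config):
    # Mirrors the `return cls(**config)` line in the diff above.
    return cls(**config)


model = TinyClassifier(num_classes=3)
clone = TinyClassifier.from_config(model.get_config())
print(clone.num_classes)  # 3
```

The same round trip is what `tf.keras.models.clone_model` and SavedModel/Keras-file reloading rely on, which is why both pretrainer classes carry the decorator.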