Commit a8ac58ea authored by Hongkun Yu, committed by A. Unique TensorFlower

Update comments; users should use BertPretrainerV2 in the future.

PiperOrigin-RevId: 344194437
parent d2aac5d7
@@ -28,11 +28,9 @@ from official.nlp.modeling import networks
 @tf.keras.utils.register_keras_serializable(package='Text')
 class BertPretrainer(tf.keras.Model):
-  """BERT network training model.
+  """BERT pretraining model.
 
-  This is an implementation of the network structure surrounding a transformer
-  encoder as described in "BERT: Pre-training of Deep Bidirectional Transformers
-  for Language Understanding" (https://arxiv.org/abs/1810.04805).
+  [Note] Please use the new BertPretrainerV2 for your projects.
 
   The BertPretrainer allows a user to pass in a transformer stack, and
   instantiates the masked language model and classification networks that are
@@ -159,7 +157,6 @@ class BertPretrainer(tf.keras.Model):
     return cls(**config)
 
-# TODO(hongkuny): Migrate to BertPretrainerV2 for all usages.
 @tf.keras.utils.register_keras_serializable(package='Text')
 @gin.configurable
 class BertPretrainerV2(tf.keras.Model):
...
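For readers migrating off the deprecated class, a minimal usage sketch of the recommended replacement follows. It is not part of this commit: the `networks.BertEncoder` parameters and the `BertPretrainerV2` constructor arguments (`encoder_network`, `mlm_activation`, `mlm_initializer`) are assumptions based on the `official.nlp.modeling` API around this revision.

```python
# Minimal migration sketch (not from this commit). Argument names are
# best-effort assumptions about the official.nlp.modeling API at this
# point in time, not verified against revision a8ac58ea.
import tensorflow as tf
from official.nlp.modeling import models
from official.nlp.modeling import networks

# Build the transformer encoder stack that the pretrainer wraps;
# values below are an illustrative BERT-base-style configuration.
encoder = networks.BertEncoder(
    vocab_size=30522,
    num_layers=12,
    hidden_size=768,
    num_attention_heads=12)

# New style: hand the encoder to BertPretrainerV2, which attaches the
# masked-LM head (and any optional classification heads) itself.
pretrainer = models.BertPretrainerV2(
    encoder_network=encoder,
    mlm_activation=tf.nn.gelu,
    mlm_initializer=tf.keras.initializers.TruncatedNormal(stddev=0.02))
```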