Commit c9e4db9c authored by A. Unique TensorFlower's avatar A. Unique TensorFlower

Update BertEncoderV2 max_sequence_length docstring

PiperOrigin-RevId: 460912624
parent 535b1e7e
@@ -49,8 +49,7 @@ class BertEncoderV2(tf.keras.layers.Layer):
     num_attention_heads: The number of attention heads for each transformer. The
       hidden size must be divisible by the number of attention heads.
     max_sequence_length: The maximum sequence length that this encoder can
-      consume. If None, max_sequence_length uses the value from sequence length.
-      This determines the variable shape for positional embeddings.
+      consume. This determines the variable shape for positional embeddings.
     type_vocab_size: The number of types that the 'type_ids' input can take.
     inner_dim: The output dimension of the first Dense layer in a two-layer
       feedforward network for each transformer.
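The updated docstring says `max_sequence_length` determines the variable shape for positional embeddings. A minimal sketch of what that implies, using NumPy and illustrative numbers (the dimensions below are assumptions for the example, not BertEncoderV2 defaults):

```python
import numpy as np

# Illustrative dimensions; actual model configs may differ.
max_sequence_length = 512
hidden_size = 768

# The position-embedding table is a learned variable whose first dimension
# is fixed at construction time by max_sequence_length. Inputs longer than
# this cannot be assigned position embeddings without resizing the variable.
position_embeddings = np.zeros((max_sequence_length, hidden_size))

# Embedding a batch of sequences of length 128 slices the first 128 rows;
# the variable shape itself is unchanged.
seq_length = 128
embedded_positions = position_embeddings[:seq_length]
print(embedded_positions.shape)  # (128, 768)
```

This is why the docstring no longer suggests the value can be inferred from the input sequence length: the variable shape must be known when the layer is built.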