"docs/source/en/tf_xla.md" did not exist on "cf028d0c3d2e7e21cd56cc4958aee09798ee743e"
Unverified commit 2958b55f, authored by Gregory and committed by GitHub

Removing one of the two position_embeddings definitions in Longformer (#23343)

Remove the duplicated position_embeddings definition.

The attribute self.position_embeddings in LongformerEmbeddings is assigned twice in __init__. This change removes the first definition, which does not set padding_idx; the later definition that sets padding_idx is kept.
parent cf11493d
```diff
@@ -438,7 +438,6 @@ class LongformerEmbeddings(nn.Module):
     def __init__(self, config):
         super().__init__()
         self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=config.pad_token_id)
-        self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)
         self.token_type_embeddings = nn.Embedding(config.type_vocab_size, config.hidden_size)
         # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
```
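For context, here is a minimal sketch (not taken from the Longformer source; the class name, sizes, and padding index are made up for illustration) of why the removed line is dead code: when an nn.Module attribute is assigned a second module under the same name, the second assignment replaces the first in the module registry, so only the later definition, the one that sets padding_idx, is ever registered or used.

```python
import torch
from torch import nn


class DuplicateEmbeddingDemo(nn.Module):
    """Hypothetical module reproducing the duplicate-assignment pattern."""

    def __init__(self, max_position_embeddings=512, hidden_size=16, padding_idx=1):
        super().__init__()
        # First assignment: this embedding is created...
        self.position_embeddings = nn.Embedding(max_position_embeddings, hidden_size)
        # ...but the second assignment replaces it in the module's _modules registry,
        # so the first object is discarded and its parameters are never trained or saved.
        self.position_embeddings = nn.Embedding(
            max_position_embeddings, hidden_size, padding_idx=padding_idx
        )


demo = DuplicateEmbeddingDemo()
# Only one "position_embeddings" submodule is registered, and it is the second
# definition (padding_idx is set), so removing the first definition changes nothing.
print(list(dict(demo.named_children()).keys()))  # ['position_embeddings']
print(demo.position_embeddings.padding_idx)      # 1
```

Because only the second assignment ends up in the registered parameters, deleting the first definition does not change the model's parameters, its state dict keys, or checkpoint compatibility; it only avoids allocating an embedding table that is immediately thrown away.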