"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "f10b925015b03612877fc2213e118d9507dd3ff2"
Unverified Commit 6d02ca4b authored by marcmk6, committed by GitHub

Fix issue of canine forward requiring input_ids anyway (#26290)

* fix issue of canine forward requiring input_ids anyway

The current `forward` requires (the shape of) `input_ids` for deriving other variables, even when `inputs_embeds` is provided instead. Change this to use whichever of `input_ids` and `inputs_embeds` is actually given, instead of relying on `input_ids` in all cases (see the usage sketch below).

* fix format

* fix format
parent 7d77d7f7
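For context, a minimal usage sketch (not part of the commit): after this change, `CanineModel.forward` can be driven by `inputs_embeds` alone. The checkpoint name and shapes below are illustrative, not taken from the PR.

```python
import torch
from transformers import CanineModel

# Illustrative checkpoint; any CANINE checkpoint should behave the same way.
model = CanineModel.from_pretrained("google/canine-s")

batch_size, seq_len = 1, 16  # seq_len kept divisible by the downsampling rate
# Character-level embeddings fed directly, bypassing input_ids entirely.
inputs_embeds = torch.randn(batch_size, seq_len, model.config.hidden_size)
attention_mask = torch.ones(batch_size, seq_len, dtype=torch.long)

# Before this fix, the call below failed because the 3D attention mask
# was always built from input_ids, which is None here.
outputs = model(inputs_embeds=inputs_embeds, attention_mask=attention_mask)
print(outputs.last_hidden_state.shape)  # (1, 16, hidden_size)
```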
@@ -1169,7 +1169,9 @@ class CanineModel(CaninePreTrainedModel):
         # Contextualize character embeddings using shallow Transformer.
         # We use a 3D attention mask for the local attention.
         # `input_char_encoding`: shape (batch_size, char_seq_len, char_dim)
-        char_attention_mask = self._create_3d_attention_mask_from_input_mask(input_ids, attention_mask)
+        char_attention_mask = self._create_3d_attention_mask_from_input_mask(
+            input_ids if input_ids is not None else inputs_embeds, attention_mask
+        )
         init_chars_encoder_outputs = self.initial_char_encoder(
             input_char_embeddings,
             attention_mask=char_attention_mask,
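Why passing `inputs_embeds` in place of `input_ids` is safe here: as I read it, the mask helper only consumes the leading (batch, sequence) dimensions of its first argument, and `input_ids` (shape `(batch, seq)`) and `inputs_embeds` (shape `(batch, seq, dim)`) agree on those. A minimal sketch of that shape logic, assuming the real `_create_3d_attention_mask_from_input_mask` behaves this way (this mirrors only the shape handling, not the library's exact code):

```python
import torch

def create_3d_attention_mask_from_input_mask(from_tensor, to_mask):
    # Only dims 0 and 1 of from_tensor are read, so a 2D input_ids tensor
    # and a 3D inputs_embeds tensor are interchangeable here.
    batch_size, from_seq_len = from_tensor.shape[0], from_tensor.shape[1]
    to_seq_len = to_mask.shape[1]
    # Broadcast the 2D padding mask out to one mask row per query position.
    to_mask = to_mask.reshape(batch_size, 1, to_seq_len).float()
    ones = torch.ones(batch_size, from_seq_len, 1, device=to_mask.device)
    return ones * to_mask  # shape (batch_size, from_seq_len, to_seq_len)
```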