Unverified commit 026097b9 authored by Funtowicz Morgan, committed by GitHub

Ensure fast tokenizer can construct tensor without pad token if only one sample is provided. (#4201)
parent 0a6cbea0
```diff
@@ -2435,7 +2435,7 @@ class PreTrainedTokenizerFast(PreTrainedTokenizer):
         )
         # Needed if we have to return a tensor
-        pad_to_max_length = pad_to_max_length or (return_tensors is not None)
+        pad_to_max_length = pad_to_max_length or (return_tensors is not None and len(batch_text_or_text_pairs) > 1)
         # Throw an error if we can pad because there is no padding token
         if pad_to_max_length and self.pad_token_id is None:
```
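The effect of the change can be sketched in isolation: before the fix, any request for tensors forced padding, which then failed when the tokenizer had no pad token, even though a single sample never needs padding. A minimal standalone sketch of the new guard (the function name `needs_padding` and its parameters are hypothetical, for illustration only):

```python
def needs_padding(pad_to_max_length, return_tensors, batch_text_or_text_pairs):
    """Decide whether padding must be applied before building a tensor.

    Mirrors the guard introduced by the commit: padding is only forced
    when tensors are requested AND the batch holds more than one sample,
    since a single sample already has a uniform length.
    """
    return pad_to_max_length or (
        return_tensors is not None and len(batch_text_or_text_pairs) > 1
    )


# A single sample with return_tensors set no longer requires a pad token.
print(needs_padding(False, "pt", ["just one sample"]))   # single sample: no padding
print(needs_padding(False, "pt", ["sample a", "sample b"]))  # batch: padding required
```

With the old expression, the first call would have returned `True` and the subsequent `pad_token_id is None` check would have raised, even though no padding was actually needed.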