Tokenizer fast warnings (#2922)
* Remove warning when pad_to_max_length is not set.

Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>

* Move RoBERTa warning to RoBERTa and not the GPT2 base tokenizer.

Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>
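A minimal sketch of the two changes the commits describe, assuming a simplified tokenizer layout: the padding warning fires only when the caller actually sets pad_to_max_length, and the RoBERTa-specific warning lives in the RoBERTa subclass rather than the GPT2 base class. Class names mirror the real transformers classes, but the method bodies and warning messages below are illustrative, not the library's actual code.

```python
import logging

logger = logging.getLogger(__name__)


class GPT2TokenizerFast:
    """Base fast tokenizer: no RoBERTa-specific warning here anymore."""

    def encode_plus(self, text, max_length=None, pad_to_max_length=False, **kwargs):
        # Warn only when padding was requested but cannot be honored,
        # instead of warning whenever pad_to_max_length is left unset.
        if pad_to_max_length and max_length is None:
            logger.warning(
                "pad_to_max_length is set but no max_length was given; "
                "the input will not be padded."
            )
        # ... tokenization would happen here ...
        return {"input_ids": [], "attention_mask": []}


class RobertaTokenizerFast(GPT2TokenizerFast):
    """RoBERTa subclass: the RoBERTa-specific caveat is emitted here,
    so plain GPT-2 tokenizers no longer see it."""

    def __init__(self, *args, **kwargs):
        logger.warning("RoBERTa-specific tokenizer caveat (illustrative message).")
        super().__init__(*args, **kwargs)
```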