Unverified commit 5f2a3d72, authored by Poedator, committed by GitHub

fix deprecated ref to `tokenizer.max_len` (#10220)

This fixes a deprecated reference by replacing `tokenizer.max_len` with `tokenizer.model_max_length` - similar to [issue 8739](https://github.com/huggingface/transformers/issues/8739) and [PR 8604](https://github.com/huggingface/transformers/pull/8604).
Example [here](https://colab.research.google.com/gist/poedator/f8776349e5c625ce287fc6fcd312fa1e/tokenizer-max_len-error-in-transformers_glue.ipynb). The error occurs when `glue_convert_examples_to_features` is called without the `max_length` parameter; in that case line 119, which contains the outdated reference, is executed. This simple fix should do it.
parent cdcdd5f0
```diff
@@ -116,7 +116,7 @@ def _glue_convert_examples_to_features(
     output_mode=None,
 ):
     if max_length is None:
-        max_length = tokenizer.max_len
+        max_length = tokenizer.model_max_length
     if task is not None:
         processor = glue_processors[task]()
```