Unverified commit 1c9d1f4c authored by Nicolas Patry, committed by GitHub

Updating the docs for `max_seq_len` in QA pipeline (#17316)

parent 60ad7344
@@ -228,8 +228,8 @@ class QuestionAnsweringPipeline(ChunkPipeline):
         max_answer_len (`int`, *optional*, defaults to 15):
             The maximum length of predicted answers (e.g., only answers with a shorter length are considered).
         max_seq_len (`int`, *optional*, defaults to 384):
-            The maximum length of the total sentence (context + question) after tokenization. The context will be
-            split in several chunks (using `doc_stride`) if needed.
+            The maximum length of the total sentence (context + question) in tokens of each chunk passed to the
+            model. The context will be split in several chunks (using `doc_stride` as overlap) if needed.
         max_question_len (`int`, *optional*, defaults to 64):
             The maximum length of the question after tokenization. It will be truncated if needed.
         handle_impossible_answer (`bool`, *optional*, defaults to `False`):
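The chunking behavior the updated docstring describes can be sketched in plain Python. This is a simplified, hypothetical helper, not the pipeline's actual implementation (which also accounts for special tokens): each chunk holds at most `max_seq_len` minus the question-length tokens of context, and consecutive chunks overlap by `doc_stride` tokens.

```python
def chunk_spans(n_context_tokens, max_seq_len, n_question_tokens, doc_stride):
    """Return (start, end) token spans of context chunks.

    Each chunk carries at most `max_seq_len - n_question_tokens` context
    tokens, and consecutive chunks overlap by `doc_stride` tokens.
    (Sketch only: real tokenization also reserves room for special tokens.)
    """
    window = max_seq_len - n_question_tokens  # context tokens per chunk
    if window <= doc_stride:
        raise ValueError("max_seq_len - question length must exceed doc_stride")
    spans = []
    start = 0
    while True:
        end = min(start + window, n_context_tokens)
        spans.append((start, end))
        if end == n_context_tokens:
            break
        start = end - doc_stride  # step back so chunks overlap
    return spans


# With the defaults (max_seq_len=384, max_question_len=64, doc_stride=128),
# a 1000-token context is split into overlapping 320-token windows:
print(chunk_spans(1000, 384, 64, 128))
```

With these numbers, each chunk covers 320 context tokens and shares 128 tokens with its neighbor, so an answer cut off at one chunk boundary is seen whole in the next.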