Commit fe0f552e authored by Morgan Funtowicz's avatar Morgan Funtowicz

Use attention_mask everywhere.

parent 348e19aa
@@ -154,9 +154,6 @@ class QuestionAnsweringPipeline(Pipeline):
return_attention_masks=True, return_input_lengths=False
)
# TODO : Harmonize model arguments across all model
inputs['attention_mask'] = inputs.pop('encoder_attention_mask')
if is_tf_available():
# TODO trace model
start, end = self.model(inputs)
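
The hunk above renames the tokenizer's `encoder_attention_mask` key to `attention_mask` before the dict is passed to the model. A minimal sketch of that harmonization step, using a hypothetical helper name and example dict (not from the commit itself):

```python
def harmonize_inputs(inputs):
    """Rename the legacy `encoder_attention_mask` key to `attention_mask`.

    `inputs` is assumed to be a dict of model keyword arguments,
    e.g. the output of a tokenizer's batch-encoding call.
    """
    if "encoder_attention_mask" in inputs:
        # pop() removes the old key and returns its value in one step
        inputs["attention_mask"] = inputs.pop("encoder_attention_mask")
    return inputs

# Illustrative input: token ids plus the old-style mask key
example = {
    "input_ids": [[101, 2054, 102]],
    "encoder_attention_mask": [[1, 1, 1]],
}
print(harmonize_inputs(example))
```

After the rename, every model in the pipeline can be called with the same `attention_mask` keyword, which is the harmonization the in-code TODO refers to.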