chenpangpang / transformers · Commits

Commit fd41e2da (unverified)
Authored Jul 12, 2021 by Lysandre Debut, committed by GitHub on Jul 12, 2021
Pipeline should be agnostic (#12656)
Parent: 9b3aab2c

Changes: 1 changed file with 3 additions and 1 deletion
tests/test_pipelines_question_answering.py (+3, -1)

@@ -14,6 +14,7 @@
 import unittest
 
+from transformers import is_tf_available, is_torch_available
 from transformers.data.processors.squad import SquadExample
 from transformers.pipelines import Pipeline, QuestionAnsweringArgumentHandler, pipeline
 from transformers.testing_utils import slow
 
@@ -57,7 +58,7 @@ class QAPipelineTests(CustomInputPipelineCommonMixin, unittest.TestCase):
                 task=self.pipeline_task,
                 model=model,
                 tokenizer=model,
-                framework="pt",
+                framework="pt" if is_torch_available() else "tf",
                 **self.pipeline_loading_kwargs,
             )
             for model in self.small_models
@@ -65,6 +66,7 @@ class QAPipelineTests(CustomInputPipelineCommonMixin, unittest.TestCase):
         return question_answering_pipelines
 
     @slow
+    @unittest.skipIf(not is_torch_available() and not is_tf_available(), "Either torch or TF must be installed.")
     def test_high_topk_small_context(self):
         self.pipeline_running_kwargs.update({"topk": 20})
         valid_inputs = [
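In short, the patch stops hard-coding framework="pt" in the test fixture: the pipeline is built for whichever backend is installed, and the test is skipped when neither PyTorch nor TensorFlow is present. The sketch below illustrates the same pattern outside the test class; build_qa_pipelines, the checkpoint name, and the loading kwargs are illustrative stand-ins for self.small_models and self.pipeline_loading_kwargs, not code from the repository.

import unittest

from transformers import is_tf_available, is_torch_available
from transformers.pipelines import pipeline


def build_qa_pipelines(model_names, **loading_kwargs):
    # Mirror the ternary introduced by this commit: prefer PyTorch when it is
    # installed, otherwise fall back to TensorFlow.
    framework = "pt" if is_torch_available() else "tf"
    return [
        pipeline(
            task="question-answering",
            model=name,
            tokenizer=name,
            framework=framework,
            **loading_kwargs,
        )
        for name in model_names
    ]


class ExampleGuardedTest(unittest.TestCase):
    # Guarded the same way as test_high_topk_small_context in the diff above.
    @unittest.skipIf(
        not is_torch_available() and not is_tf_available(),
        "Either torch or TF must be installed.",
    )
    def test_builds_a_pipeline(self):
        # Hypothetical tiny checkpoint; the real test class supplies its own
        # self.small_models list.
        pipes = build_qa_pipelines(["sshleifer/tiny-distilbert-base-cased-distilled-squad"])
        self.assertEqual(len(pipes), 1)


if __name__ == "__main__":
    unittest.main()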