Unverified Commit e47765d8 authored by Patrick von Platen, committed by GitHub

Fix head masking generate tests (#12110)

* fix_torch_device_generate_test

* remove @

* fix tests
parent d2753dcb
```diff
@@ -1078,7 +1078,7 @@ class GenerationTesterMixin:
         attention_names = ["encoder_attentions", "decoder_attentions", "cross_attentions"]
         for model_class in self.all_generative_model_classes:
             config, input_ids, attention_mask, max_length = self._get_input_ids_and_config()
-            model = model_class(config)
+            model = model_class(config).to(torch_device)
             # We want to test only encoder-decoder models
             if not config.is_encoder_decoder:
                 continue
```
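For context, a minimal sketch of the failure mode this one-line change addresses: the test creates its input tensors on `torch_device` (the GPU on CUDA runners), so if the freshly constructed model stays on the CPU, `generate` fails with a device-mismatch error. The snippet below is not from the repository; it uses a tiny T5 model purely as a stand-in for the generic `model_class(config)` in the test, and redefines `torch_device` locally as an assumption about what the test helper provides.

```python
import torch
from transformers import T5Config, T5ForConditionalGeneration

# In the transformers test suite, `torch_device` resolves to "cuda" when a GPU
# is available and to "cpu" otherwise; we mirror that convention here.
torch_device = "cuda" if torch.cuda.is_available() else "cpu"

# Tiny, illustrative config so the snippet runs quickly; the sizes are
# assumptions, not values taken from the test suite.
config = T5Config(
    vocab_size=32,
    d_model=16,
    d_ff=32,
    d_kv=8,
    num_layers=2,
    num_heads=2,
    decoder_start_token_id=0,
)

# Inputs live on `torch_device`, mirroring what the generation tests do.
input_ids = torch.tensor([[3, 5, 7, 9]], device=torch_device)
attention_mask = torch.ones_like(input_ids)

# Without `.to(torch_device)` the randomly initialized weights stay on the CPU
# while the inputs live on the GPU, and `generate` raises a device-mismatch
# RuntimeError on CUDA runners; moving the model to the same device fixes it.
model = T5ForConditionalGeneration(config).to(torch_device)
model.eval()

with torch.no_grad():
    output_ids = model.generate(input_ids, attention_mask=attention_mask, max_length=8)

print(output_ids.shape)  # e.g. torch.Size([1, 8]); exact length may vary
```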