Unverified Commit 49572942 authored by Yih-Dar, committed by GitHub

Fix flaky `test_for_warning_if_padding_and_no_attention_mask` (#24706)



fix

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
parent fb78769b
@@ -584,6 +584,9 @@ class BertModelTest(ModelTesterMixin, GenerationTesterMixin, PipelineTesterMixin
         # Check for warnings if the attention_mask is missing.
         logger = logging.get_logger("transformers.modeling_utils")
+        # clear cache so we can test the warning is emitted (from `warning_once`).
+        logger.warning_once.cache_clear()
         with CaptureLogger(logger) as cl:
             model = BertModel(config=config)
             model.to(torch_device)
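The test was flaky because `warning_once` memoizes by message: once any earlier test triggers the padding warning, later calls hit the cache and nothing reaches `CaptureLogger`. Below is a minimal, self-contained sketch of that caching behavior; the `warning_once` function here is a hypothetical stand-in (an `functools.lru_cache` wrapper around `Logger.warning`), not the exact `transformers` implementation.

```python
import functools
import logging

logger = logging.getLogger("demo")
logger.propagate = False  # keep output out of the root logger

# Capture emitted messages so we can observe the caching effect.
records = []
handler = logging.Handler()
handler.emit = lambda record: records.append(record.getMessage())
logger.addHandler(handler)

# Hypothetical stand-in for `warning_once`: each distinct message is
# emitted at most once per process unless the cache is cleared.
@functools.lru_cache(None)
def warning_once(message):
    logger.warning(message)

warning_once("padding without attention_mask")
warning_once("padding without attention_mask")  # cache hit: not re-emitted
assert len(records) == 1

# A later test capturing logs would see nothing unless it clears the
# cache first, which is exactly what the fix above does.
warning_once.cache_clear()
warning_once("padding without attention_mask")
assert len(records) == 2
```

Clearing the cache before entering `CaptureLogger` makes the test independent of whatever warnings earlier tests already triggered.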