Unverified commit 325bf911, authored by Tim Moon, committed by GitHub

[PyTorch] Disable large test cases for Transformer layer (#508)



Disable large test cases for PyTorch Transformer layer
Signed-off-by: Tim Moon <tmoon@nvidia.com>
parent 30cad990
@@ -415,6 +415,12 @@ def test_transformer_layer(dtype, bs, model, bias_type, fused_qkv_params, RoPE):
     config = model_configs_lean[model]
     tols = dict(atol=5e-1, rtol=5e-2)
+    # TODO @cyanguwa: Handle test cases more cleanly
+    if config.hidden_size > 1024:
+        pytest.skip(
+            "Tolerances for test_transformer_layer are intended for small test cases"
+        )
     # Skip if only unfused backend is supported
     fused_attn_supported = _is_fused_attention_supported(
         config,
...
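The added guard skips configurations whose hidden size is too large for the test's loose tolerances, rather than loosening the tolerances further. A minimal standalone sketch of this `pytest.skip` pattern follows; the `ModelConfig` class, the config values, and the test body here are hypothetical stand-ins for the real test suite's machinery:

```python
import pytest
from dataclasses import dataclass


# Hypothetical stand-in for the model config objects used in the real test suite.
@dataclass
class ModelConfig:
    hidden_size: int


model_configs_lean = {
    "small": ModelConfig(hidden_size=768),
    "large": ModelConfig(hidden_size=4096),
}


@pytest.mark.parametrize("model", model_configs_lean.keys())
def test_transformer_layer(model):
    config = model_configs_lean[model]
    tols = dict(atol=5e-1, rtol=5e-2)
    # Large hidden sizes accumulate more numerical error than these loose
    # tolerances allow, so skip them instead of weakening the check.
    if config.hidden_size > 1024:
        pytest.skip(
            "Tolerances for test_transformer_layer are intended for small test cases"
        )
    # ... the real test would run the layer and compare outputs within tols ...
```

Calling `pytest.skip(...)` inside the test body marks just that parametrized case as skipped at runtime, so the remaining (small) configurations still run and report normally.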