Commit d6af14b2 authored by huangwb

fix HAS_FLASH_ATTN_V2_ROCM flag bug for DCU

parent 5a1cf2f0
@@ -45,7 +45,7 @@ if IS_CUDA_SYSTEM or IS_ROCM_SYSTEM:
                 "Use the official Docker image (ghcr.io/huggingface/text-generation-inference:latest) "
                 f"or install flash attention v2 with `cd server && make install install-flash-attention-v2{architecture_suffix}`"
             )
-        if not (is_sm8x or is_sm90):
+        if not (is_sm8x or is_sm90) and IS_CUDA_SYSTEM:
             raise ImportError(
                 f"GPU with CUDA capability {major} {minor} is not supported for "
                 "Flash Attention V2"
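For context: DCU is Hygon's ROCm-compatible GPU line, and on ROCm the (major, minor) pair reported by PyTorch does not encode an NVIDIA SM architecture. The SM 8.x/9.0 capability guard therefore must only apply on CUDA systems; without the `and IS_CUDA_SYSTEM` clause, the raise would reject DCU devices and the `HAS_FLASH_ATTN_V2_ROCM` flag from the commit message would never be set. Below is a minimal sketch of the guard after this commit; `IS_CUDA_SYSTEM`/`IS_ROCM_SYSTEM` appear in the hunk header, while the `is_sm8x`/`is_sm90` definitions and the flag assignments follow the upstream text-generation-inference source and are assumptions here, since this hunk does not show them.

```python
# Minimal sketch of the capability guard after this commit (assumed context,
# not the full text-generation-inference flash_attn module).
import torch

IS_CUDA_SYSTEM = torch.version.cuda is not None
IS_ROCM_SYSTEM = torch.version.hip is not None

# Requires a visible GPU. On ROCm/DCU this pair does not correspond to an
# NVIDIA SM version, which is why the check below must be CUDA-only.
major, minor = torch.cuda.get_device_capability()
is_sm8x = major == 8 and minor >= 0
is_sm90 = major == 9 and minor == 0

# The fix: restrict the SM architecture check to CUDA systems so the ROCm
# path can fall through to the flag assignments instead of raising.
if not (is_sm8x or is_sm90) and IS_CUDA_SYSTEM:
    raise ImportError(
        f"GPU with CUDA capability {major} {minor} is not supported for "
        "Flash Attention V2"
    )

HAS_FLASH_ATTN_V2_CUDA = IS_CUDA_SYSTEM
HAS_FLASH_ATTN_V2_ROCM = IS_ROCM_SYSTEM
```

Gating only the raise (rather than branching the whole block) keeps a single code path for CUDA and ROCm, which matches how the surrounding `if IS_CUDA_SYSTEM or IS_ROCM_SYSTEM:` block is structured.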