Commit 321c5fa2 authored by comfyanonymous

Enable pytorch attention by default on xpu.

parent 0665749b
@@ -165,6 +165,9 @@ try:
             ENABLE_PYTORCH_ATTENTION = True
         if torch.cuda.is_bf16_supported():
             VAE_DTYPE = torch.bfloat16
+    if is_intel_xpu():
+        if args.use_split_cross_attention == False and args.use_quad_cross_attention == False:
+            ENABLE_PYTORCH_ATTENTION = True
 except:
     pass
...
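For context, here is a minimal, self-contained sketch of how the new XPU branch fits into the attention-selection logic. The `Args` class and the `is_intel_xpu()` body are hypothetical stand-ins for ComfyUI's `comfy.cli_args` and the real helper in `comfy/model_management.py`; only the inner `if` block mirrors the committed change.

```python
import torch

class Args:
    # Stand-in for the parsed CLI flags in comfy.cli_args (assumption).
    use_split_cross_attention = False
    use_quad_cross_attention = False

args = Args()

def is_intel_xpu():
    # Hypothetical stand-in for the real helper in comfy/model_management.py;
    # here it just checks whether a torch XPU device is available.
    return hasattr(torch, "xpu") and torch.xpu.is_available()

ENABLE_PYTORCH_ATTENTION = False
try:
    if is_intel_xpu():
        # New in this commit: default to PyTorch's scaled-dot-product
        # attention on XPU unless the user explicitly requested the split
        # or quad cross-attention implementations on the command line.
        if args.use_split_cross_attention == False and args.use_quad_cross_attention == False:
            ENABLE_PYTORCH_ATTENTION = True
except:
    # Mirrors the source's bare except: detection failures leave the
    # default (disabled) in place rather than aborting startup.
    pass

print("pytorch attention enabled:", ENABLE_PYTORCH_ATTENTION)
```

Note the design choice carried over from the CUDA path above it: the user's explicit `--use-split-cross-attention` / `--use-quad-cross-attention` flags always win, and PyTorch attention is only enabled when neither was given.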