Commit 3661e106 authored by comfyanonymous's avatar comfyanonymous

Add a command line option to disable upcasting in some cross attention ops.

parent 50db297c
@@ -6,6 +6,10 @@ import threading
 import queue
 import traceback
+if '--dont-upcast-attention' in sys.argv:
+	print("disabling upcasting of attention")
+	os.environ['ATTN_PRECISION'] = "fp16"
 import torch
 import nodes
...
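A minimal, self-contained sketch of the pattern this commit uses: checking for a command-line flag and exporting an environment variable *before* importing modules that read it at import time. The flag and variable names come from the diff above; the downstream lookup with an `fp32` default is an assumption for illustration, not code from this commit.

```python
import os
import sys

# Set the environment variable before any module that reads it is imported.
if '--dont-upcast-attention' in sys.argv:
    print("disabling upcasting of attention")
    os.environ['ATTN_PRECISION'] = "fp16"

# A downstream module (e.g. a cross-attention implementation) would then
# read the variable at import time; 'fp32' as the default is assumed here.
ATTN_PRECISION = os.environ.get('ATTN_PRECISION', 'fp32')
```

Ordering matters: if `torch`-dependent modules were imported first, any module that caches `ATTN_PRECISION` at import time would miss the override.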