- 15 Dec, 2023 1 commit
comfyanonymous authored
- 12 Dec, 2023 1 commit
comfyanonymous authored
Rename comfy.ops -> comfy.ops.disable_weight_init. This should make it clearer what these ops actually do. Some unused code has also been removed.
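As a rough, pure-Python sketch of the pattern the new name describes (the stand-in `Linear` class here is hypothetical; the real module wraps torch.nn layers): overriding `reset_parameters` to a no-op skips random weight initialization that a checkpoint load would overwrite anyway.

```python
# Hypothetical stand-in for a framework layer; shown only to illustrate
# the no-init pattern, not ComfyUI's actual classes.
class Linear:
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features
        self.weight = None
        self.reset_parameters()  # base layers initialize weights on creation

    def reset_parameters(self):
        # Stands in for an expensive random initialization.
        self.weight = [[0.0] * self.in_features
                       for _ in range(self.out_features)]


class disable_weight_init:
    """Namespace grouping no-init variants, mirroring the new module name."""
    class Linear(Linear):
        def reset_parameters(self):
            return None  # skip init; weights are loaded from a checkpoint later
```

With the variant, `disable_weight_init.Linear(4, 2).weight` stays `None` until a checkpoint fills it in, so layer construction is essentially free.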
- 07 Dec, 2023 1 commit
comfyanonymous authored
Use a simple CLIP model implementation instead of the one from transformers. This will allow some interesting things that would be too hackish to implement using the transformers implementation.
- 05 Dec, 2023 1 commit
comfyanonymous authored
- 04 Dec, 2023 2 commits
comfyanonymous authored
--fp8_e4m3fn-unet and --fp8_e5m2-unet are the two fp8 formats supported by PyTorch.
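For intuition on what the two formats trade off, here is a small pure-Python sketch (no torch needed; the helper name is mine, not part of any API) computing the largest finite value each format can represent. The results match what `torch.finfo` reports for `torch.float8_e4m3fn` (448) and `torch.float8_e5m2` (57344): e4m3fn spends bits on precision, e5m2 on range.

```python
def max_finite(exp_bits, man_bits, ieee_top_exponent_reserved):
    """Largest finite value of a binary float format with an implicit leading 1."""
    bias = 2 ** (exp_bits - 1) - 1
    if ieee_top_exponent_reserved:
        # e5m2 behaves like IEEE 754: the all-ones exponent encodes inf/NaN,
        # so the largest usable exponent is one below it.
        exponent = (2 ** exp_bits - 2) - bias
        mantissa = 2 - 2.0 ** -man_bits
    else:
        # e4m3fn has no inf: only the all-ones bit pattern is NaN, so the top
        # exponent is usable, just not with an all-ones mantissa.
        exponent = (2 ** exp_bits - 1) - bias
        mantissa = 2 - 2.0 ** -man_bits - 2.0 ** -man_bits
    return mantissa * 2.0 ** exponent

print(max_finite(4, 3, False))  # e4m3fn: more mantissa bits, max 448.0
print(max_finite(5, 2, True))   # e5m2: more exponent bits, max 57344.0
```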
comfyanonymous authored
- 26 Nov, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
Now everything in transformer_options gets put in extra_options.
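The described flow can be sketched as follows (function and key names are assumptions, not ComfyUI's actual signatures): the caller-supplied transformer_options dict is copied wholesale into the extra_options each block receives, with per-block fields layered on top.

```python
# Hypothetical sketch: whatever callers put into transformer_options is
# forwarded into extra_options unchanged; per-block data is added on top.
def build_extra_options(transformer_options, block_index):
    extra_options = dict(transformer_options)   # everything passes through
    extra_options["block_index"] = block_index  # per-block additions
    return extra_options
```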
- 24 Nov, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 21 Nov, 2023 1 commit
comfyanonymous authored
- 16 Nov, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 14 Nov, 2023 1 commit
comfyanonymous authored
- 08 Nov, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 07 Nov, 2023 1 commit
comfyanonymous authored
- 31 Oct, 2023 1 commit
comfyanonymous authored
DDIM is the same as Euler except for a small difference in the inpaint code: DDIM uses randn_like, but I set a fixed seed instead. I'm keeping it because if I removed it, I'm sure people would complain.
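For context on the equivalence, here is a scalar toy version of a k-diffusion-style Euler step (an illustrative sketch, not the sampler's actual code): both samplers move x along the derivative toward the model's denoised prediction, and only the inpainting noise source differs.

```python
def euler_step(x, sigma, sigma_next, denoised):
    # derivative pointing from the current sample toward the denoised estimate
    d = (x - denoised) / sigma
    # one explicit Euler step from sigma to sigma_next
    return x + d * (sigma_next - sigma)

# stepping all the way to sigma_next = 0 lands exactly on the denoised value
print(euler_step(2.0, 1.0, 0.0, 0.5))  # 0.5
```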
- 30 Oct, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 27 Oct, 2023 1 commit
comfyanonymous authored
- 26 Oct, 2023 1 commit
comfyanonymous authored
- 22 Oct, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 21 Oct, 2023 1 commit
comfyanonymous authored
- 17 Oct, 2023 2 commits
comfyanonymous authored
comfyanonymous authored
- 16 Oct, 2023 1 commit
comfyanonymous authored
- 13 Oct, 2023 1 commit
comfyanonymous authored
- 12 Oct, 2023 3 commits
comfyanonymous authored
comfyanonymous authored
comfyanonymous authored
There's no reason for the whole CrossAttention object to be repeated when only the operation in the middle changes.
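The shape of that refactor can be sketched like this (names and toy math are hypothetical, not the actual module): one wrapper class keeps the shared plumbing, and only the inner attention operation is swapped per backend.

```python
class CrossAttention:
    """Single wrapper; the backend-specific kernel is the only moving part."""
    def __init__(self, attention_op):
        # the "operation in the middle" that used to force whole-class copies
        self.attention_op = attention_op

    def forward(self, q, k, v):
        # projections and reshaping would live here, shared by every backend
        return self.attention_op(q, k, v)


def toy_attention(q, k, v):
    # placeholder kernel standing in for e.g. a pytorch or xformers backend
    return [qi * ki * vi for qi, ki, vi in zip(q, k, v)]
```

Usage: `CrossAttention(toy_attention).forward([1, 2], [3, 4], [5, 6])` returns `[15, 48]`; swapping in a different kernel touches nothing else.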
- 27 Sep, 2023 1 commit
comfyanonymous authored
- 26 Sep, 2023 1 commit
comfyanonymous authored
- 23 Sep, 2023 1 commit
comfyanonymous authored
- 15 Sep, 2023 1 commit
comfyanonymous authored
- 04 Sep, 2023 1 commit
comfyanonymous authored
- 03 Sep, 2023 2 commits
- 01 Sep, 2023 1 commit
comfyanonymous authored