- 30 Jul, 2024 3 commits
  - comfyanonymous authored
  - comfyanonymous authored
  - comfyanonymous authored
    This breaks seeds for resolutions whose pixel dimensions are not a multiple of 16, because it uses circular padding instead of reflection padding, but it should reduce the amount of artifacts when doing img2img at those resolutions.
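The padding change described above can be illustrated with a minimal pure-Python sketch of the two 1-D padding modes. This mirrors the semantics of PyTorch's "reflect" and "circular" padding modes, not ComfyUI's actual code:

```python
def pad_reflect(xs, n):
    # Reflection padding mirrors values around each edge without
    # repeating the edge element: [1, 2, 3], n=2 -> [3, 2, 1, 2, 3, 2, 1]
    left = [xs[i] for i in range(n, 0, -1)]
    right = [xs[-2 - i] for i in range(n)]
    return left + xs + right

def pad_circular(xs, n):
    # Circular padding wraps around, so the padded border comes from the
    # opposite edge: [1, 2, 3], n=2 -> [2, 3, 1, 2, 3, 1, 2]
    return xs[-n:] + xs + xs[:n]

print(pad_reflect([1, 2, 3], 2))   # [3, 2, 1, 2, 3, 2, 1]
print(pad_circular([1, 2, 3], 2))  # [2, 3, 1, 2, 3, 1, 2]
```

Because the two modes produce different border values, any randomness consumed after the padded region (and hence the final image for a given seed) changes when the mode is switched, which is why the commit warns that seeds break at those resolutions.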
- 24 Jul, 2024 1 commit
  - comfyanonymous authored
- 27 Jun, 2024 1 commit
  - comfyanonymous authored
    The node needed to properly use it is still missing.
- 15 Jun, 2024 1 commit
  - comfyanonymous authored
- 12 Jun, 2024 1 commit
  - comfyanonymous authored
- 11 Jun, 2024 1 commit
  - Dango233 authored
- 10 Jun, 2024 1 commit
  - comfyanonymous authored
- 18 May, 2024 2 commits
  - comfyanonymous authored
  - comfyanonymous authored
- 14 May, 2024 1 commit
  - comfyanonymous authored
- 11 Mar, 2024 1 commit
  - comfyanonymous authored
- 29 Feb, 2024 1 commit
  - comfyanonymous authored
- 28 Feb, 2024 1 commit
  - comfyanonymous authored
    Use the UNET Loader node to load these unet files.
- 07 Feb, 2024 1 commit
  - comfyanonymous authored
- 26 Jan, 2024 1 commit
  - comfyanonymous authored
- 03 Jan, 2024 1 commit
  - comfyanonymous authored
- 02 Jan, 2024 1 commit
  - comfyanonymous authored
- 26 Dec, 2023 1 commit
  - comfyanonymous authored
- 24 Dec, 2023 1 commit
  - comfyanonymous authored
- 22 Dec, 2023 1 commit
  - comfyanonymous authored
- 12 Dec, 2023 1 commit
  - comfyanonymous authored
    comfy.ops -> comfy.ops.disable_weight_init. This should make it clearer what these ops actually do. Some unused code has also been removed.
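The idea behind a `disable_weight_init` namespace can be sketched in a few lines: layer subclasses whose initialization hook is a no-op, so constructing the model skips random weight init that a checkpoint load would overwrite anyway. This is an illustrative torch-free sketch of the pattern, not ComfyUI's actual classes:

```python
class Linear:
    """Stand-in for a framework layer that randomizes weights on construction."""

    def __init__(self, n):
        self.weight = None
        self.reset_parameters(n)

    def reset_parameters(self, n):
        # Stand-in for an expensive random initialization.
        self.weight = [0.5] * n

class LinearDisableWeightInit(Linear):
    def reset_parameters(self, n):
        # No-op: skip init entirely; weights get filled in later
        # when the checkpoint is loaded.
        pass
```

With the no-op subclass, `LinearDisableWeightInit(4).weight` stays `None` until a state dict is loaded, while `Linear(4).weight` is eagerly initialized.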
- 04 Dec, 2023 2 commits
  - comfyanonymous authored
    --fp8_e4m3fn-unet and --fp8_e5m2-unet correspond to the two fp8 formats supported by pytorch.
  - comfyanonymous authored
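The two flags above map to PyTorch's `torch.float8_e4m3fn` and `torch.float8_e5m2` dtypes: e4m3fn trades exponent range for mantissa precision (and has no infinities, only NaN), while e5m2 keeps an IEEE-style layout with a wider range. A small sketch, assuming the standard bit layouts, computes each format's largest finite value:

```python
def fp8_max_finite(ebits, mbits, ieee_inf):
    # Largest finite value of a 1-sign / ebits-exponent / mbits-mantissa float.
    bias = 2 ** (ebits - 1) - 1
    if ieee_inf:
        # IEEE-style (e5m2): the top exponent code is reserved for inf/NaN.
        max_exp = (2 ** ebits - 2) - bias
        max_mant = 2 - 2 ** -mbits
    else:
        # "fn" variant (e4m3fn): no infinities; only the all-ones
        # exponent+mantissa pattern is NaN, so one mantissa step is lost.
        max_exp = (2 ** ebits - 1) - bias
        max_mant = 2 - 2 * 2 ** -mbits
    return max_mant * 2 ** max_exp

print(fp8_max_finite(4, 3, False))  # e4m3fn -> 448.0
print(fp8_max_finite(5, 2, True))   # e5m2   -> 57344.0
```

The much smaller e4m3fn range (±448 vs ±57344) is why the two formats behave differently for weight storage.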
- 26 Nov, 2023 1 commit
  - comfyanonymous authored
    Now everything in transformer_options gets put in extra_options.
- 24 Nov, 2023 1 commit
  - comfyanonymous authored
- 21 Nov, 2023 1 commit
  - comfyanonymous authored
- 16 Nov, 2023 2 commits
  - comfyanonymous authored
  - comfyanonymous authored
- 14 Nov, 2023 1 commit
  - comfyanonymous authored
- 08 Nov, 2023 2 commits
  - comfyanonymous authored
  - comfyanonymous authored
- 07 Nov, 2023 1 commit
  - comfyanonymous authored
- 27 Oct, 2023 1 commit
  - comfyanonymous authored
- 17 Oct, 2023 2 commits
  - comfyanonymous authored
  - comfyanonymous authored
- 13 Oct, 2023 1 commit
  - comfyanonymous authored
- 12 Oct, 2023 2 commits
  - comfyanonymous authored
  - comfyanonymous authored
    There's no reason for the whole CrossAttention object to be repeated when only the operation in the middle changes.
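The deduplication described in that last commit can be sketched as a single class that takes the middle attention operation as a parameter, instead of one near-identical class per backend. The names here are illustrative stand-ins, not ComfyUI's actual API:

```python
def attention_basic(q, k, v):
    # Stand-in for a plain attention kernel.
    return [qi * ki + vi for qi, ki, vi in zip(q, k, v)]

def attention_optimized(q, k, v):
    # Stand-in for an alternative (e.g. memory-efficient) kernel.
    return attention_basic(q, k, v)

class CrossAttention:
    """One shared class; only the operation in the middle is swapped."""

    def __init__(self, attn_op=attention_basic):
        self.attn_op = attn_op  # the only part that differs per backend

    def forward(self, q, k, v):
        # ... shared input projections would go here ...
        out = self.attn_op(q, k, v)
        # ... shared output projection would go here ...
        return out
```

Selecting a backend then means constructing `CrossAttention(attention_optimized)` rather than maintaining a duplicated class whose surrounding projection code is identical.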