- 24 Jul, 2024 (1 commit)
  - comfyanonymous authored
- 11 Jul, 2024 (1 commit)
  - comfyanonymous authored: Text encoders can now return additional values to the CONDITIONING beyond the cond and pooled output.
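A minimal sketch of what that change implies, assuming ComfyUI's usual CONDITIONING layout of [tensor, options-dict] pairs; `encode_with_extras` and `to_conditioning` are illustrative helpers, not the project's actual API:

```python
# Minimal sketch (not ComfyUI's actual code): a text encoder that returns
# extra values alongside the cond tensor and pooled output, and how those
# extras could be folded into a CONDITIONING entry of [tensor, options] form.
import torch

def encode_with_extras(tokens):
    """Hypothetical encoder: returns (cond, pooled, extras)."""
    cond = torch.randn(1, 77, 768)    # per-token embeddings
    pooled = torch.randn(1, 768)      # pooled output
    extras = {"attention_mask": torch.ones(1, 77, dtype=torch.long)}
    return cond, pooled, extras

def to_conditioning(cond, pooled, extras):
    # The options dict carries the pooled output plus any extra values.
    options = {"pooled_output": pooled}
    options.update(extras)
    return [[cond, options]]

cond, pooled, extras = encode_with_extras(tokens=None)
conditioning = to_conditioning(cond, pooled, extras)
print(list(conditioning[0][1].keys()))  # ['pooled_output', 'attention_mask']
```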
- 10 Jul, 2024 (1 commit)
  - comfyanonymous authored
- 08 Jul, 2024 (1 commit)
  - comfyanonymous authored
- 06 Jul, 2024 (2 commits)
  - comfyanonymous authored
  - comfyanonymous authored
- 19 Jun, 2024 (1 commit)
  - Mario Klingemann authored: Made the token instance check more flexible so that it also works with integers coming from numpy arrays or long tensors.
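A minimal sketch of that kind of flexible check; `is_token_id` is an illustrative name, not the function actually touched by the commit:

```python
# Minimal sketch (illustrative, not the actual ComfyUI code): a token check
# that accepts plain Python ints as well as numpy integers and scalar values
# pulled out of long tensors.
import numbers
import numpy as np
import torch

def is_token_id(value):
    # Unwrap zero-dim torch tensors first; numbers.Integral then covers both
    # Python ints and numpy integer types such as np.int64.
    if isinstance(value, torch.Tensor) and value.ndim == 0:
        value = value.item()
    return isinstance(value, numbers.Integral)

print(is_token_id(42))                # True
print(is_token_id(np.int64(42)))      # True
print(is_token_id(torch.tensor(42)))  # True
print(is_token_id("42"))              # False
```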
- 11 Jun, 2024 (1 commit)
  - comfyanonymous authored
- 09 Jun, 2024 (1 commit)
  - comfyanonymous authored
- 07 Jun, 2024 (1 commit)
  - comfyanonymous authored
- 10 Mar, 2024 (1 commit)
  - comfyanonymous authored
- 27 Feb, 2024 (1 commit)
  - comfyanonymous authored
- 25 Feb, 2024 (3 commits)
  - comfyanonymous authored
  - comfyanonymous authored
  - comfyanonymous authored: Fix an issue where the SSD1B clip was not loaded correctly.
- 16 Feb, 2024 (1 commit)
  - comfyanonymous authored
- 22 Jan, 2024 (1 commit)
  - comfyanonymous authored
- 11 Dec, 2023 (1 commit)
  - comfyanonymous authored: Use fp16 text encoder weights for CPU inference to lower memory usage.
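A minimal sketch of one way this could work, under the assumption that the weights are stored in fp16 and upcast for compute; the helper below is illustrative, not ComfyUI's actual code:

```python
# Minimal sketch (an assumption about the approach, not ComfyUI's code):
# keep the text encoder weights stored in fp16 to roughly halve CPU RAM use,
# and upcast to fp32 only for the forward pass, since many CPU kernels are
# slow or unsupported in fp16.
import copy
import torch
import torch.nn as nn

# Stand-in for a text encoder; weights are stored in half precision.
encoder = nn.Sequential(nn.Linear(768, 3072), nn.GELU(), nn.Linear(3072, 768)).half()

def cpu_encode(module, x):
    # Upcast a throwaway copy for compute so the stored weights stay fp16.
    # (A real implementation would cast layer by layer to avoid the extra copy.)
    compute = copy.deepcopy(module).float()
    with torch.no_grad():
        return compute(x.float())

out = cpu_encode(encoder, torch.randn(1, 77, 768))
print(out.dtype)  # torch.float32
```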
- 08 Dec, 2023 (1 commit)
  - comfyanonymous authored
- 07 Dec, 2023 (1 commit)
  - comfyanonymous authored: Use a simple CLIP model implementation instead of the one from transformers. This allows some interesting things that would be too hackish to implement on top of the transformers implementation.
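A rough sketch of the kind of standalone CLIP text encoder this refers to; the hyperparameters follow the common CLIP ViT-L text tower, and the class is illustrative rather than the implementation added by the commit:

```python
# Rough sketch of a minimal standalone CLIP text encoder (illustrative only).
import torch
import torch.nn as nn

class SimpleCLIPTextModel(nn.Module):
    def __init__(self, vocab_size=49408, max_len=77, dim=768, heads=12, layers=12):
        super().__init__()
        self.token_embed = nn.Embedding(vocab_size, dim)
        self.pos_embed = nn.Parameter(torch.zeros(max_len, dim))
        block = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=dim * 4,
            activation="gelu", batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(block, num_layers=layers)
        self.final_norm = nn.LayerNorm(dim)

    def forward(self, tokens):
        seq_len = tokens.shape[1]
        x = self.token_embed(tokens) + self.pos_embed[:seq_len]
        # CLIP's text transformer uses causal attention (additive -inf mask).
        mask = torch.full((seq_len, seq_len), float("-inf")).triu(1)
        x = self.final_norm(self.blocks(x, mask=mask))
        # CLIP pools the feature at the end-of-text token, which has the
        # highest token id in the vocabulary.
        pooled = x[torch.arange(x.shape[0]), tokens.argmax(dim=-1)]
        return x, pooled

model = SimpleCLIPTextModel()
tokens = torch.randint(0, 49407, (1, 77))
hidden, pooled = model(tokens)
print(hidden.shape, pooled.shape)  # (1, 77, 768) and (1, 768)
```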
- 04 Dec, 2023 (1 commit)
  - comfyanonymous authored
- 14 Nov, 2023 (2 commits)
  - comfyanonymous authored
  - Jianqi Pan authored
- 06 Nov, 2023 (2 commits)
  - comfyanonymous authored: More generic clip model class that can be used with more types of text encoders. Don't apply the weighting algorithm when the weight is 1.0. Don't compute an empty-token output when it is not needed (see the sketch below).
  - comfyanonymous authored
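A minimal sketch of the 1.0 fast path described above, assuming a weighting scheme that interpolates between the empty-prompt output and the real output; the names and exact formula are illustrative, not the actual implementation:

```python
# Minimal sketch (illustrative): skip both the weighting math and the
# empty-prompt encode when every token weight is 1.0, since in that case the
# weighted result equals the plain encoder output.
import torch

def encode_weighted(encode_fn, tokens, weights):
    out = encode_fn(tokens)                 # (1, seq, dim)
    w = torch.tensor(weights).view(1, -1, 1)
    if torch.all(w == 1.0):
        return out                          # fast path: nothing to do
    # One common weighting scheme: interpolate between the output for an
    # empty prompt and the real output, scaled by each token's weight.
    empty = encode_fn(torch.zeros_like(tokens))
    return empty + (out - empty) * w

dummy = lambda t: t.float().unsqueeze(-1).repeat(1, 1, 4)  # stand-in encoder
print(encode_weighted(dummy, torch.ones(1, 3, dtype=torch.long), [1.0, 1.0, 1.0]).shape)
```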
- 28 Oct, 2023 (1 commit)
  - comfyanonymous authored
- 27 Oct, 2023 (2 commits)
  - comfyanonymous authored
  - comfyanonymous authored
- 15 Sep, 2023 (1 commit)
  - comfyanonymous authored
- 12 Sep, 2023 (1 commit)
  - comfyanonymous authored
- 25 Aug, 2023 (1 commit)
  - comfyanonymous authored
- 24 Aug, 2023 (2 commits)
  - comfyanonymous authored
  - comfyanonymous authored
- 23 Aug, 2023 (1 commit)
  - comfyanonymous authored: Do inference in fp32 to make sure quality stays the exact same.
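A minimal sketch of forcing an fp32 forward pass while keeping a lower-precision output dtype; this is an assumption about the approach, not the commit's actual code:

```python
# Minimal sketch (illustrative): run the text encoder forward pass in fp32 so
# the result matches the full-precision reference, downcasting only the output.
import torch
import torch.nn as nn

def encode_fp32(encoder, x, out_dtype=torch.float16):
    encoder = encoder.float()  # converts the module to fp32 (in place)
    with torch.no_grad():
        cond = encoder(x.float())
    return cond.to(out_dtype)

enc = nn.Linear(768, 768).half()
print(encode_fp32(enc, torch.randn(1, 77, 768)).dtype)  # torch.float16
```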
- 04 Aug, 2023 (1 commit)
  - comfyanonymous authored
- 15 Jul, 2023 (1 commit)
  - comfyanonymous authored
- 12 Jul, 2023 (1 commit)
  - comfyanonymous authored
- 10 Jul, 2023 (1 commit)
  - comfyanonymous authored
- 06 Jul, 2023 (1 commit)
  - comfyanonymous authored
- 01 Jul, 2023 (1 commit)
  - comfyanonymous authored