1. 04 Jul, 2025 2 commits
    • Fix Wan AccVideo/CausVid fuse_lora (#11856) · 425a715e
      Aryan authored
      * fix
      * actually, better fix
      * empty commit; trigger tests again
      * mark wanvace test as flaky
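      A minimal sketch of the fuse_lora call path this commit fixes, assuming a
      Wan text-to-video base checkpoint and a placeholder LoRA id (neither is
      named in the commit message itself):

          import torch
          from diffusers import WanPipeline

          # Assumed base checkpoint; any Diffusers-format Wan T2V checkpoint works.
          pipe = WanPipeline.from_pretrained(
              "Wan-AI/Wan2.1-T2V-1.3B-Diffusers", torch_dtype=torch.bfloat16
          )

          # Load an AccVideo/CausVid-style distillation LoRA (placeholder id), then
          # fuse it into the base weights -- the code path this commit repairs.
          pipe.load_lora_weights("path/to/accvideo-or-causvid-lora")
          pipe.fuse_lora(lora_scale=1.0)  # inference now runs on the fused weights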
    • FIX set_lora_device when target layers differ (#11844) · 25279175
      Benjamin Bossan authored

      * FIX set_lora_device when target layers differ

      Resolves #11833

      Fixes a bug that occurs when set_lora_device is called while multiple
      LoRA adapters that target different layers are loaded (a minimal sketch
      reproducing this setup follows the entry).

      Note: Technically, the accompanying test does not require a GPU, because
      the bug is triggered even if the parameters are already on the
      corresponding device, i.e. loading on CPU and then changing the device
      to CPU is sufficient to cause it. However, this may be optimized away in
      the future, so I decided to test with GPU.

      * Update docstring to warn about device mismatch
      * Extend docstring with an example
      * Fix docstring

      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
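      A minimal sketch reproducing the setup described above, assuming an SD 1.5
      base model and two placeholder LoRA ids whose adapters target different
      sets of layers:

          import torch
          from diffusers import DiffusionPipeline

          pipe = DiffusionPipeline.from_pretrained(
              "stable-diffusion-v1-5/stable-diffusion-v1-5", torch_dtype=torch.float16
          )

          # Placeholder LoRA ids; what matters is that the two adapters target
          # different sets of layers (e.g. UNet only vs. UNet + text encoder).
          pipe.load_lora_weights("user/lora-unet-only", adapter_name="a")
          pipe.load_lora_weights("user/lora-unet-and-text-encoder", adapter_name="b")

          # Before this fix, moving one adapter could fail on modules that only
          # the other adapter targets.
          pipe.set_lora_device(adapter_names=["a"], device="cuda")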
2. 14 Jun, 2025 1 commit
    • Chroma Pipeline (#11698) · 8adc6003
      Edna authored

      * working state from hameerabbasi and iddl
      * working state from hameerabbasi and iddl (transformer)
      * working state (normalization)
      * working state (embeddings)
      * add chroma loader
      * add chroma to mappings
      * add chroma to transformer init
      * take out variant stuff
      * get decently far in changing variant stuff
      * add chroma init
      * make chroma output class
      * add chroma transformer to dummy tp
      * add chroma to init
      * add chroma to init
      * fix single file
      * update
      * update
      * add chroma to auto pipeline
      * add chroma to pipeline init
      * change to chroma transformer
      * take out variant from blocks
      * swap embedder location
      * remove prompt_2
      * work on swapping text encoders
      * remove mask function
      * don't modify mask (for now)
      * wrap attn mask
      * no attn mask (can't get it to work)
      * remove pooled prompt embeds
      * change to my own unpooled embedder
      * fix load
      * take pooled projections out of transformer
      * ensure correct dtype for chroma embeddings
      * update
      * use dn6 attn mask + fix true_cfg_scale
      * use chroma pipeline output
      * use DN6 embeddings
      * remove guidance
      * remove guidance embed (pipeline)
      * remove guidance from embeddings
      * don't return length
      * don't change dtype
      * remove unused stuff, fix up docs
      * add chroma autodoc
      * add .md (oops)
      * initial chroma docs
      * undo don't change dtype
      * undo arxiv change (unsure why that happened)
      * fix hf papers regression in more places
      * Update docs/source/en/api/pipelines/chroma.md
        Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      * do_cfg -> self.do_classifier_free_guidance
      * Update docs/source/en/api/models/chroma_transformer.md
        Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      * Update chroma.md
      * Move chroma layers into transformer
      * Remove pruned AdaLayerNorms
      * Add chroma fast tests
      * (untested) batch cond and uncond
      * Add # Copied from for shift
      * Update # Copied from statements
      * update norm imports
      * Revert cond + uncond batching
      * Add transformer tests
      * move chroma test (oops)
      * chroma init
      * fix chroma pipeline fast tests
      * Update src/diffusers/models/transformers/transformer_chroma.py
        Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      * Move Approximator and Embeddings
      * Fix auto pipeline + make style, quality
      * make style
      * Apply style fixes
      * switch to new input ids
      * fix # Copied from error
      * remove # Copied from on protected members
      * try to fix import
      * fix import
      * make fix-copies
      * revert style fix
      * update chroma transformer params
      * update chroma transformer approximator init params
      * update to pad tokens
      * fix batch inference
      * Make more pipeline tests work
      * Make most transformer tests work
      * fix docs
      * make style, make quality
      * skip batch tests
      * fix test skipping
      * fix test skipping again
      * fix for tests
      * Fix all pipeline tests
      * update
      * push local changes, fix docs
      * add encoder test, remove pooled dim
      * default proj dim
      * fix tests
      * fix equal size list input
      * Revert "fix equal size list input"

        This reverts commit 3fe4ad67d58d83715bc238f8654f5e90bfc5653c.

      * update
      * update
      * update
      * update
      * update
      ---------
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
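      A minimal usage sketch for the pipeline this PR adds. The checkpoint id is
      an assumption, and the call mirrors the commit's design notes: Chroma drops
      the guidance embedding and pooled prompt embeds and relies on real
      classifier-free guidance (negative prompts / true_cfg_scale):

          import torch
          from diffusers import ChromaPipeline

          # Assumed checkpoint id; substitute a real Chroma checkpoint.
          pipe = ChromaPipeline.from_pretrained(
              "lodestones/Chroma", torch_dtype=torch.bfloat16
          )
          pipe.to("cuda")

          image = pipe(
              prompt="a photograph of a red fox in a snowy forest",
              negative_prompt="blurry, low quality",  # real CFG, per "fix true_cfg_scale"
              num_inference_steps=28,
          ).images[0]
          image.save("chroma_fox.png")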
3. 06 Jun, 2025 1 commit
    • Wan VACE (#11582) · 73a9d585
      Aryan authored
      * initial support
      * make fix-copies
      * fix no split modules
      * add conversion script
      * refactor
      * add pipeline test
      * refactor
      * fix bug with mask
      * fix for reference images
      * remove print
      * update docs
      * update slices
      * update
      * update
      * update example
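      A minimal text-to-video sketch for the new pipeline. The checkpoint id is
      an assumption; per the commit ("fix bug with mask", "fix for reference
      images"), the pipeline also accepts video, mask, and reference-image
      conditioning:

          import torch
          from diffusers import WanVACEPipeline
          from diffusers.utils import export_to_video

          # Assumed checkpoint id for a Diffusers-format Wan VACE variant.
          pipe = WanVACEPipeline.from_pretrained(
              "Wan-AI/Wan2.1-VACE-1.3B-diffusers", torch_dtype=torch.bfloat16
          )
          pipe.to("cuda")

          frames = pipe(
              prompt="a paper boat drifting down a rain-soaked street",
              num_frames=81,
              num_inference_steps=30,
          ).frames[0]
          export_to_video(frames, "wan_vace.mp4", fps=16)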