1. 10 Nov, 2025 2 commits
  2. 06 Nov, 2025 1 commit
  3. 04 Nov, 2025 2 commits
  4. 28 Oct, 2025 3 commits
• Bria fibo (#12545) · 84e16575
  galbria authored
      
      
      * Bria FIBO pipeline
      
* style fixes
      
      * fix CR
      
      * Refactor BriaFibo classes and update pipeline parameters
      
      - Updated BriaFiboAttnProcessor and BriaFiboAttention classes to reflect changes from Flux equivalents.
      - Modified the _unpack_latents method in BriaFiboPipeline to improve clarity.
      - Increased the default max_sequence_length to 3000 and added a new optional parameter do_patching.
      - Cleaned up test_pipeline_bria_fibo.py by removing unused imports and skipping unsupported tests.
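
The 2×2 patching that `_pack_latents`/`_unpack_latents` perform (and that the new `do_patching` flag toggles) can be sketched in plain Python — an illustrative round trip over nested lists, not the actual diffusers implementation, which operates on batched tensors:

```python
def pack_latents(latents, patch=2):
    """Flatten patch x patch spatial blocks across channels into tokens.

    latents: nested list indexed [channel][row][col].
    Returns (H/patch * W/patch) tokens, each of length C * patch * patch.
    """
    C, H, W = len(latents), len(latents[0]), len(latents[0][0])
    tokens = []
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            tokens.append([latents[c][i + di][j + dj]
                           for c in range(C)
                           for di in range(patch)
                           for dj in range(patch)])
    return tokens


def unpack_latents(tokens, C, H, W, patch=2):
    """Inverse of pack_latents: rebuild the [channel][row][col] grid."""
    out = [[[0.0] * W for _ in range(H)] for _ in range(C)]
    idx = 0
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            tok, idx = tokens[idx], idx + 1
            k = 0
            for c in range(C):
                for di in range(patch):
                    for dj in range(patch):
                        out[c][i + di][j + dj] = tok[k]
                        k += 1
    return out
```

Packing turns a spatial latent grid into the token sequence the transformer consumes; unpacking restores the grid before VAE decoding.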
      
      * edit the docs of FIBO
      
      * Remove unused BriaFibo imports and update CPU offload method in BriaFiboPipeline
      
      * Refactor FIBO classes to BriaFibo naming convention
      
      - Updated class names from FIBO to BriaFibo for consistency across the module.
      - Modified instances of FIBOEmbedND, FIBOTimesteps, TextProjection, and TimestepProjEmbeddings to reflect the new naming.
      - Ensured all references in the BriaFiboTransformer2DModel are updated accordingly.
      
      * Add BriaFiboTransformer2DModel import to transformers module
      
      * Remove unused BriaFibo imports from modular pipelines and add BriaFiboTransformer2DModel and BriaFiboPipeline classes to dummy objects for enhanced compatibility with torch and transformers.
      
      * Update BriaFibo classes with copied documentation and fix import typo in pipeline module
      
      - Added documentation comments indicating the source of copied code in BriaFiboTransformerBlock and _pack_latents methods.
      - Corrected the import statement for BriaFiboPipeline in the pipelines module.
      
      * Remove unused BriaFibo imports from __init__.py to streamline modular pipelines.
      
      * Refactor documentation comments in BriaFibo classes to indicate inspiration from existing implementations
      
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to reflect that the code is inspired by other modules rather than copied.
      - Enhanced clarity on the origins of the methods to maintain proper attribution.
      
      * change Inspired by to Based on
      
      * add reference link and fix trailing whitespace
      
      * Add BriaFiboTransformer2DModel documentation and update comments in BriaFibo classes
      
      - Introduced a new documentation file for BriaFiboTransformer2DModel.
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to clarify the origins of the code, indicating copied sources for better attribution.
      
      ---------
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
• Fix typos in kandinsky5 docs (#12552) · 40528e9a
  Meatfucker authored
      Update kandinsky5.md
      
      Fix typos
• Kandinsky 5 10 sec (NABLA support) (#12520) · 5afbcce1
  Lev Novitskiy authored
      
      
      * add transformer pipeline first version
      
      * updates
      
      * fix 5sec generation
      
      * rewrite Kandinsky5T2VPipeline to diffusers style
      
      * add multiprompt support
      
      * remove prints in pipeline
      
      * add nabla attention
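
NABLA-style attention keeps, for each query block, only the highest-scoring key/value blocks instead of attending densely over the whole video sequence. A toy pure-Python sketch of building such a block mask from per-block summary vectors (illustrative only, not this PR's implementation):

```python
def nabla_block_mask(q_blocks, k_blocks, keep):
    """For each query block, keep only the `keep` highest-scoring key blocks.

    q_blocks / k_blocks: per-block summary vectors (e.g. mean-pooled queries/keys).
    Returns a boolean mask: mask[q][k] is True if key block k is attended to.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    mask = []
    for q in q_blocks:
        scores = [dot(q, k) for k in k_blocks]
        top = set(sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:keep])
        mask.append([i in top for i in range(len(k_blocks))])
    return mask
```

In a real kernel this mask selects which key/value tiles each query tile loads, which is what makes long (10-second) sequences tractable.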
      
      * Wrap Transformer in Diffusers style
      
      * fix license
      
      * fix prompt type
      
      * add gradient checkpointing and peft support
      
      * add usage example
      
* Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>

* Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
      
      * remove unused imports
      
      * add 10 second models support
      
      * Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * remove no_grad and simplified prompt paddings
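
The "simplified prompt paddings" amount to right-padding a batch of tokenized prompts to a common length and tracking an attention mask — a minimal sketch (function and parameter names are assumptions, not the pipeline's API):

```python
def pad_prompts(token_ids, pad_id=0):
    """Right-pad a batch of tokenized prompts to the batch max length.

    token_ids: list of per-prompt token id lists of varying length.
    Returns (padded ids, attention mask) where mask is 1 for real tokens.
    """
    max_len = max(len(t) for t in token_ids)
    padded = [t + [pad_id] * (max_len - len(t)) for t in token_ids]
    mask = [[1] * len(t) + [0] * (max_len - len(t)) for t in token_ids]
    return padded, mask
```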
      
* Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * moved template to __init__
      
* Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
* moved sdpa inside processor
      
      * remove oneline function
      
      * remove reset_dtype methods
      
      * Transformer: move all methods to forward
      
      * separated prompt encoding
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * refactoring
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
* refactoring according to https://github.com/huggingface/diffusers/commit/acabbc0033d4b4933fc651766a4aa026db2e6dc1

* Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
* Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * fixed
      
      * style +copies
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: Charles <charles@huggingface.co>
      
      * more
      
      * Apply suggestions from code review
      
      * add lora loader doc
      
      * add compiled Nabla Attention
      
      * all needed changes for 10 sec models are added!
      
      * add docs
      
      * Apply style fixes
      
      * update docs
      
      * add kandinsky5 to toctree
      
      * add tests
      
      * fix tests
      
      * Apply style fixes
      
      * update tests
      
      ---------
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>
Co-authored-by: Charles <charles@huggingface.co>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
  5. 27 Oct, 2025 2 commits
  6. 24 Oct, 2025 1 commit
  7. 22 Oct, 2025 1 commit
• Prx (#12525) · dd07b19e
  David Bertoin authored
      * rename photon to prx
      
      * rename photon into prx
      
      * Revert .gitignore to state before commit b7fb0fe9d63bf766bbe3c42ac154a043796dd370
      
      * make fix-copies
  8. 21 Oct, 2025 3 commits
  9. 18 Oct, 2025 1 commit
  10. 17 Oct, 2025 1 commit
  11. 16 Oct, 2025 1 commit
  12. 15 Oct, 2025 2 commits
  13. 14 Oct, 2025 2 commits
  14. 13 Oct, 2025 1 commit
  15. 11 Oct, 2025 1 commit
  16. 06 Oct, 2025 1 commit
  17. 30 Sep, 2025 2 commits
  18. 29 Sep, 2025 3 commits
  19. 26 Sep, 2025 1 commit
  20. 24 Sep, 2025 2 commits
• Introduce cache-dit to community optimization (#12366) · 310fdaf5
  DefTruth authored
* docs: introduce cache-dit to diffusers

* misc: update examples link
      
      * Refine documentation for CacheDiT features
      
      Updated the wording for clarity and consistency in the documentation. Adjusted sections on cache acceleration, automatic block adapter, patch functor, and hybrid cache configuration.
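
The cache-acceleration idea can be illustrated with a toy cache that reuses a block's output while its input barely changes between denoising steps — a hypothetical sketch of the principle, not cache-dit's actual API:

```python
class SimpleBlockCache:
    """Reuse a transformer block's output when its input has barely changed.

    Illustrative only: real DiT caching operates on batched tensors and
    per-block residuals; here a plain list of floats stands in for the input.
    """

    def __init__(self, threshold=0.05):
        self.threshold = threshold  # relative input change below which we reuse
        self.prev_input = None
        self.prev_output = None
        self.hits = 0

    def __call__(self, block, x):
        if self.prev_input is not None:
            diff = sum((a - b) ** 2 for a, b in zip(x, self.prev_input)) ** 0.5
            norm = sum(b * b for b in self.prev_input) ** 0.5 or 1.0
            if diff / norm < self.threshold:
                self.hits += 1
                return self.prev_output  # skip the expensive block call
        y = block(x)
        self.prev_input, self.prev_output = list(x), y
        return y
```

Across adjacent diffusion timesteps the block inputs are often nearly identical, which is why this kind of cache can skip a large fraction of block evaluations.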
• Context Parallel w/ Ring & Ulysses & Unified Attention (#11941) · dcb6dd9b
  Aryan authored
      
      
* update
      * add coauthor
Co-Authored-By: Dhruv Nair <dhruv.nair@gmail.com>
      
      * improve test
      
      * handle ip adapter params correctly
      
      * fix chroma qkv fusion test
      
      * fix fastercache implementation
      
      * fix more tests
      
      * fight more tests
      
      * add back set_attention_backend
      
* update
      * make style
      
      * make fix-copies
      
      * make ip adapter processor compatible with attention dispatcher
      
      * refactor chroma as well
      
      * remove rmsnorm assert
      
      * minify and deprecate npu/xla processors
      
      * update
      
      * refactor
      
      * refactor; support flash attention 2 with cp
      
      * fix
      
      * support sage attention with cp
      
      * make torch compile compatible
      
      * update
      
      * refactor
      
      * update
      
      * refactor
      
      * refactor
      
      * add ulysses backward
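
Ulysses-style context parallelism re-shards activations with an all-to-all: each rank starts with a chunk of the sequence for all heads and ends with the full sequence for a chunk of heads, so attention stays local per rank. A single-process sketch of that re-layout (plain lists stand in for tensors and collectives; not the diffusers implementation):

```python
def ulysses_all_to_all(shards, num_heads):
    """Simulate the Ulysses sequence-shard -> head-shard all-to-all.

    shards[r]: rank r's sequence chunk; each token is a per-head value list.
    Returns, per rank, the FULL sequence restricted to that rank's heads.
    """
    world_size = len(shards)
    heads_per_rank = num_heads // world_size
    out = []
    for r in range(world_size):
        lo = r * heads_per_rank
        # Every rank contributes its sequence chunk (the all-to-all exchange),
        # but rank r only keeps its own slice of heads.
        full_seq = [token[lo:lo + heads_per_rank]
                    for chunk in shards
                    for token in chunk]
        out.append(full_seq)
    return out
```

After local attention, a second all-to-all reverses the layout back to sequence shards; ring attention instead keeps sequence shards and circulates K/V chunks around the ranks.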
      
      * try to make dreambooth script work; accelerator backward not playing well
      
      * Revert "try to make dreambooth script work; accelerator backward not playing well"
      
      This reverts commit 768d0ea6fa6a305d12df1feda2afae3ec80aa449.
      
      * workaround compilation problems with triton when doing all-to-all
      
      * support wan
      
      * handle backward correctly
      
      * support qwen
      
      * support ltx
      
      * make fix-copies
      
      * Update src/diffusers/models/modeling_utils.py
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * apply review suggestions
      
      * update docs
      
      * add explanation
      
      * make fix-copies
      
      * add docstrings
      
      * support passing parallel_config to from_pretrained
      
      * apply review suggestions
      
      * make style
      
      * update
      
      * Update docs/source/en/api/parallel.md
Co-authored-by: Aryan <aryan@huggingface.co>
      
      * up
      
      ---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
  21. 23 Sep, 2025 2 commits
  22. 22 Sep, 2025 1 commit
  23. 10 Sep, 2025 2 commits
  24. 08 Sep, 2025 1 commit
  25. 05 Sep, 2025 1 commit