1. 25 Nov, 2025 2 commits
    • 
      let's go Flux2 🚀 (#12711) · 5ffb73d4
      Sayak Paul authored
      
      
      * add vae
      
      * Initial commit for Flux 2 Transformer implementation
      
      * add pipeline part
      
      * small edits to the pipeline and conversion
      
      * update conversion script
      
      * fix
      
      * up up
      
      * finish pipeline
      
      * Remove Flux IP Adapter logic for now
      
      * Remove deprecated 3D id logic
      
      * Remove ControlNet logic for now
      
      * Add link to ViT-22B paper as reference for parallel transformer blocks such as the Flux 2 single stream block
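A parallel transformer block (as in ViT-22B) computes the attention and MLP branches from the same normalized input and sums them, instead of chaining two residual sub-blocks. A minimal, framework-free sketch (the callables below are placeholders, not the actual diffusers modules):

```python
def parallel_block(x, attn, mlp, norm):
    # A sequential pre-LN block would do:
    #   x = x + attn(norm(x)); x = x + mlp(norm(x))
    # The parallel variant feeds the SAME normalized input to both
    # branches, so their input projections can be fused or run
    # concurrently:
    h = norm(x)
    return x + attn(h) + mlp(h)
```

With scalar placeholders, e.g. `parallel_block(1.0, lambda h: 2 * h, lambda h: 3 * h, lambda v: v)` evaluates to 6.0.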
      
      * update pipeline
      
      * Don't use biases for input projs and output AdaNorm
      
      * up
      
      * Remove bias for double stream block text QKV projections
      
      * Add script to convert Flux 2 transformer to diffusers
      
      * make style and make quality
      
      * fix a few things.
      
      * allow sft files to go.
      
      * fix image processor
      
      * fix batch
      
      * style a bit
      
      * Fix some bugs in Flux 2 transformer implementation
      
      * Fix dummy input preparation and fix some test bugs
      
      * fix dtype casting in timestep guidance module.
      
* resolve conflicts.
      
      * remove ip adapter stuff.
      
      * Fix Flux 2 transformer consistency test
      
      * Fix bug in Flux2TransformerBlock (double stream block)
      
      * Get remaining Flux 2 transformer tests passing
      
      * make style; make quality; make fix-copies
      
      * remove stuff.
      
* fix type annotation.
      
      * remove unneeded stuff from tests
      
      * tests
      
      * up
      
      * up
      
      * add sf support
      
      * Remove unused IP Adapter and ControlNet logic from transformer (#9)
      
      * copied from
      
      * Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
Co-authored-by: apolinário <joaopaulo.passos@gmail.com>
      
      * up
      
      * up
      
      * up
      
      * up
      
      * up
      
      * Refactor Flux2Attention into separate classes for double stream and single stream attention
      
      * Add _supports_qkv_fusion to AttentionModuleMixin to allow subclasses to disable QKV fusion
      
      * Have Flux2ParallelSelfAttention inherit from AttentionModuleMixin with _supports_qkv_fusion=False
      
      * Log debug message when calling fuse_projections on a AttentionModuleMixin subclass that does not support QKV fusion
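The opt-out mechanism described in the three commits above can be sketched as follows; apart from the `_supports_qkv_fusion` flag and the `fuse_projections` name, everything here is a simplified stand-in for the actual diffusers classes:

```python
import logging

logger = logging.getLogger(__name__)


class AttentionModuleMixin:
    # Subclasses set this to False when their projections cannot
    # (or should not) be fused into a single QKV matmul.
    _supports_qkv_fusion = True

    def fuse_projections(self):
        if not self._supports_qkv_fusion:
            # Opt-out path: log at debug level and return without fusing.
            logger.debug(
                "%s does not support QKV fusion; skipping.",
                self.__class__.__name__,
            )
            return
        # Placeholder for the actual Q/K/V weight concatenation.
        self.fused = True


class Flux2ParallelSelfAttention(AttentionModuleMixin):
    # The parallel single-stream attention disables fusion.
    _supports_qkv_fusion = False
```

Calling `fuse_projections()` on the parallel attention is then a logged no-op, while subclasses that keep the default flag fuse as before.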
      
      * Address review comments
      
      * Update src/diffusers/pipelines/flux2/pipeline_flux2.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * up
      
      * Remove maybe_allow_in_graph decorators for Flux 2 transformer blocks (#12)
      
      * up
      
      * support ostris loras. (#13)
      
      * up
      
* update schedule
      
      * up
      
      * up (#17)
      
      * add training scripts (#16)
      
      * add training scripts
Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
      
      * model cpu offload in validation.
      
      * add flux.2 readme
      
      * add img2img and tests
      
      * cpu offload in log validation
      
      * Apply suggestions from code review
      
      * fix
      
      * up
      
      * fixes
      
      * remove i2i training tests for now.
      
      ---------
Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
Co-authored-by: linoytsaban <linoy@huggingface.co>
      
      * up
      
      ---------
Co-authored-by: yiyixuxu <yixu310@gmail.com>
Co-authored-by: Daniel Gu <dgu8957@gmail.com>
Co-authored-by: yiyi@huggingface.co <yiyi@ip-10-53-87-203.ec2.internal>
Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
Co-authored-by: apolinário <joaopaulo.passos@gmail.com>
Co-authored-by: yiyi@huggingface.co <yiyi@ip-26-0-160-103.ec2.internal>
Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
Co-authored-by: linoytsaban <linoy@huggingface.co>
    • 
      Add Support for Z-Image Series (#12703) · 4088e8a8
      Jerry Wu authored
      
      
      * Add Support for Z-Image.
      
      * Reformatting with make style, black & isort.
      
      * Remove init, Modify import utils, Merge forward in transformers block, Remove once func in pipeline.
      
      * modified main model forward, freqs_cis left
      
      * refactored to add B dim
      
      * fixed stack issue
      
      * fixed modulation bug
      
      * fixed modulation bug
      
      * fix bug
      
      * remove value_from_time_aware_config
      
      * styling
      
* Fix neg embed and divide (/) bug; Reuse pad zero tensor; Turn cat -> repeat; Add hint for attn processor.
      
      * Replace padding with pad_sequence; Add gradient checkpointing.
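The padding change refers to `torch.nn.utils.rnn.pad_sequence`. As a rough, dependency-free illustration of what that call does (this sketch mirrors only the `batch_first=True` shape convention, not the real torch API):

```python
def pad_to_batch(sequences, padding_value=0):
    # Right-pad variable-length sequences to the length of the longest
    # one, yielding a rectangular (batch, max_len) structure, analogous
    # to torch.nn.utils.rnn.pad_sequence(..., batch_first=True).
    max_len = max(len(s) for s in sequences)
    return [list(s) + [padding_value] * (max_len - len(s)) for s in sequences]
```

For example, `pad_to_batch([[1, 2, 3], [4]])` yields `[[1, 2, 3], [4, 0, 0]]`.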
      
* Fix flash_attn3 in the attention backend dispatcher via _flash_attn_forward, replacing its original implementation; add a docstring in the pipeline for that.
      
      * Fix Docstring and Make Style.
      
      * Revert "Fix flash_attn3 in the attention backend dispatcher via _flash_attn_forward, replacing its original implementation; add a docstring in the pipeline for that."
      
      This reverts commit fbf26b7ed11d55146103c97740bad4a5f91744e0.
      
      * update z-image docstring
      
      * Revert attention dispatcher
      
      * update z-image docstring
      
      * styling
      
* Restore attention_dispatch.py to its original implementation; a later commit will address fa3 compatibility.
      
* Fix previous bug, and support passing pre-encoded prompt_embeds as a list of torch Tensors.
      
      * Remove einop dependency.
      
      * remove redundant imports & make fix-copies
      
      * fix import
      
      ---------
Co-authored-by: liudongyang <liudongyang0114@gmail.com>
  2. 24 Nov, 2025 1 commit
  3. 19 Nov, 2025 2 commits
  4. 17 Nov, 2025 1 commit
  5. 13 Nov, 2025 3 commits
  6. 12 Nov, 2025 1 commit
    • 
      [modular] add tests for qwen modular (#12585) · f5e5f348
      Sayak Paul authored
      * add tests for qwenimage modular.
      
      * qwenimage edit.
      
      * qwenimage edit plus.
      
      * empty
      
      * align with the latest structure
      
      * up
      
      * up
      
      * reason
      
      * up
      
      * fix multiple issues.
      
      * up
      
      * up
      
      * fix
      
      * up
      
      * make it similar to the original pipeline.
  7. 11 Nov, 2025 1 commit
  8. 10 Nov, 2025 1 commit
  9. 07 Nov, 2025 1 commit
  10. 06 Nov, 2025 1 commit
  11. 02 Nov, 2025 1 commit
  12. 28 Oct, 2025 4 commits
    • 
      Bria fibo (#12545) · 84e16575
      galbria authored
      
      
      * Bria FIBO pipeline
      
* style fixes
      
      * fix CR
      
      * Refactor BriaFibo classes and update pipeline parameters
      
      - Updated BriaFiboAttnProcessor and BriaFiboAttention classes to reflect changes from Flux equivalents.
      - Modified the _unpack_latents method in BriaFiboPipeline to improve clarity.
      - Increased the default max_sequence_length to 3000 and added a new optional parameter do_patching.
      - Cleaned up test_pipeline_bria_fibo.py by removing unused imports and skipping unsupported tests.
      
      * edit the docs of FIBO
      
      * Remove unused BriaFibo imports and update CPU offload method in BriaFiboPipeline
      
      * Refactor FIBO classes to BriaFibo naming convention
      
      - Updated class names from FIBO to BriaFibo for consistency across the module.
      - Modified instances of FIBOEmbedND, FIBOTimesteps, TextProjection, and TimestepProjEmbeddings to reflect the new naming.
      - Ensured all references in the BriaFiboTransformer2DModel are updated accordingly.
      
      * Add BriaFiboTransformer2DModel import to transformers module
      
      * Remove unused BriaFibo imports from modular pipelines and add BriaFiboTransformer2DModel and BriaFiboPipeline classes to dummy objects for enhanced compatibility with torch and transformers.
      
      * Update BriaFibo classes with copied documentation and fix import typo in pipeline module
      
      - Added documentation comments indicating the source of copied code in BriaFiboTransformerBlock and _pack_latents methods.
      - Corrected the import statement for BriaFiboPipeline in the pipelines module.
      
      * Remove unused BriaFibo imports from __init__.py to streamline modular pipelines.
      
      * Refactor documentation comments in BriaFibo classes to indicate inspiration from existing implementations
      
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to reflect that the code is inspired by other modules rather than copied.
      - Enhanced clarity on the origins of the methods to maintain proper attribution.
      
      * change Inspired by to Based on
      
      * add reference link and fix trailing whitespace
      
      * Add BriaFiboTransformer2DModel documentation and update comments in BriaFibo classes
      
      - Introduced a new documentation file for BriaFiboTransformer2DModel.
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to clarify the origins of the code, indicating copied sources for better attribution.
      
      ---------
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
    • 
      [ci] don't run sana layerwise casting tests in CI. (#12551) · 55d49d43
      Sayak Paul authored
      * don't run sana layerwise casting tests in CI.
      
      * up
    • 
[Pipelines] Enable Wan VACE to run with a single transformer (#12428) · ecfbc8f9
      Dhruv Nair authored
      * update
      
      * update
      
      * update
      
      * update
      
      * update
    • 
Kandinsky 5 10 sec (NABLA support) (#12520) · 5afbcce1
      Lev Novitskiy authored
      
      
      * add transformer pipeline first version
      
      * updates
      
      * fix 5sec generation
      
      * rewrite Kandinsky5T2VPipeline to diffusers style
      
      * add multiprompt support
      
      * remove prints in pipeline
      
      * add nabla attention
      
      * Wrap Transformer in Diffusers style
      
      * fix license
      
      * fix prompt type
      
      * add gradient checkpointing and peft support
      
      * add usage example
      
* Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
      Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
      Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
      
      * remove unused imports
      
      * add 10 second models support
      
      * Update src/diffusers/pipelines/kandinsky5/pipeline_kandinsky.py
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
* remove no_grad and simplified prompt paddings
      
      * moved template to __init__
      
* Update src/diffusers/models/transformers/transformer_kandinsky.py
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * moved sdps inside processor
      
      * remove oneline function
      
      * remove reset_dtype methods
      
      * Transformer: move all methods to forward
      
* separated prompt encoding
      
* refactoring
      
* refactoring according to https://github.com/huggingface/diffusers/commit/acabbc0033d4b4933fc651766a4aa026db2e6dc1
      
      * fixed
      
      * style +copies
      
      * Update src/diffusers/models/transformers/transformer_kandinsky.py
Co-authored-by: Charles <charles@huggingface.co>
      
      * more
      
      * Apply suggestions from code review
      
      * add lora loader doc
      
      * add compiled Nabla Attention
      
      * all needed changes for 10 sec models are added!
      
      * add docs
      
      * Apply style fixes
      
      * update docs
      
      * add kandinsky5 to toctree
      
      * add tests
      
      * fix tests
      
      * Apply style fixes
      
      * update tests
      
      ---------
Co-authored-by: Álvaro Somoza <asomoza@users.noreply.github.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>
Co-authored-by: Charles <charles@huggingface.co>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
  13. 27 Oct, 2025 1 commit
  14. 24 Oct, 2025 2 commits
  15. 23 Oct, 2025 1 commit
  16. 22 Oct, 2025 3 commits
  17. 21 Oct, 2025 2 commits
  18. 17 Oct, 2025 3 commits
  19. 15 Oct, 2025 1 commit
  20. 02 Oct, 2025 2 commits
    • 
      FIX Test to ignore warning for enable_lora_hotswap (#12421) · 7242b5ff
      Benjamin Bossan authored
I noticed that the test should cover the option check_compiled="ignore"
      but it was using check_compiled="warn". This has been fixed; the
      correct argument is now passed.
      
      However, the fact that the test passed means it was incorrect to
      begin with: the way logs are collected does not capture the
      logger.warning call here (not sure why). To amend this, I'm now using
      assertNoLogs. With this change, the test correctly fails when the
      wrong argument is passed.
    • 
      [ci] xfail failing tests in CI. (#12418) · 9ae5b629
      Sayak Paul authored
      xfail failing tests in CI.
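The xfail mechanism used here can be illustrated generically; the test body and reason below are invented for the example, not the actual tests from the PR:

```python
import pytest


@pytest.mark.xfail(reason="known failure on CI runners", strict=False)
def test_known_ci_failure():
    # pytest reports this as XFAIL instead of a hard failure; with
    # strict=False an unexpected pass shows up as XPASS, not an error.
    raise AssertionError("fails on CI")
```

This keeps known-bad tests visible in reports without turning the CI run red while a proper fix is pending.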
  21. 01 Oct, 2025 1 commit
    • 
      [tests] cache non lora pipeline outputs. (#12298) · 814d710e
      Sayak Paul authored
      * cache non lora pipeline outputs.
      
      * up
      
      * up
      
      * up
      
      * up
      
      * Revert "up"
      
      This reverts commit 772c32e43397f25919c29bbbe8ef9dc7d581cfb8.
      
      * up
      
      * Revert "up"
      
      This reverts commit cca03df7fce55550ed28b59cadec12d1db188283.
      
      * up
      
      * up
      
      * add .
      
      * up
      
      * up
      
      * up
      
      * up
      
      * up
      
      * up
  22. 30 Sep, 2025 3 commits
  23. 29 Sep, 2025 2 commits