1. 10 Nov, 2025 1 commit
  2. 06 Nov, 2025 2 commits
  3. 31 Oct, 2025 1 commit
  4. 28 Oct, 2025 1 commit
    • Bria fibo (#12545) · 84e16575
      galbria authored
      
      
      * Bria FIBO pipeline
      
      * style fixes
      
      * fix CR
      
      * Refactor BriaFibo classes and update pipeline parameters
      
      - Updated BriaFiboAttnProcessor and BriaFiboAttention classes to reflect changes from Flux equivalents.
      - Modified the _unpack_latents method in BriaFiboPipeline to improve clarity.
      - Increased the default max_sequence_length to 3000 and added a new optional parameter do_patching.
      - Cleaned up test_pipeline_bria_fibo.py by removing unused imports and skipping unsupported tests.
      
      * edit the docs of FIBO
      
      * Remove unused BriaFibo imports and update CPU offload method in BriaFiboPipeline
      
      * Refactor FIBO classes to BriaFibo naming convention
      
      - Updated class names from FIBO to BriaFibo for consistency across the module.
      - Modified instances of FIBOEmbedND, FIBOTimesteps, TextProjection, and TimestepProjEmbeddings to reflect the new naming.
      - Ensured all references in the BriaFiboTransformer2DModel are updated accordingly.
      
      * Add BriaFiboTransformer2DModel import to transformers module
      
      * Remove unused BriaFibo imports from modular pipelines and add BriaFiboTransformer2DModel and BriaFiboPipeline classes to dummy objects for enhanced compatibility with torch and transformers.
      
      * Update BriaFibo classes with copied documentation and fix import typo in pipeline module
      
      - Added documentation comments indicating the source of copied code in BriaFiboTransformerBlock and _pack_latents methods.
      - Corrected the import statement for BriaFiboPipeline in the pipelines module.
      
      * Remove unused BriaFibo imports from __init__.py to streamline modular pipelines.
      
      * Refactor documentation comments in BriaFibo classes to indicate inspiration from existing implementations
      
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to reflect that the code is inspired by other modules rather than copied.
      - Enhanced clarity on the origins of the methods to maintain proper attribution.
      
      * change Inspired by to Based on
      
      * add reference link and fix trailing whitespace
      
      * Add BriaFiboTransformer2DModel documentation and update comments in BriaFibo classes
      
      - Introduced a new documentation file for BriaFiboTransformer2DModel.
      - Updated comments in BriaFiboAttnProcessor, BriaFiboAttention, and BriaFiboPipeline to clarify the origins of the code, indicating copied sources for better attribution.
      
      ---------
      Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
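
      A minimal usage sketch for the pipeline introduced in this PR, pieced together from the commit messages above: the checkpoint id and prompt are placeholders, and the call signature is assumed to follow the standard diffusers text-to-image pattern; only `max_sequence_length`, `do_patching`, and the CPU-offload update are taken from the commits.

      ```python
      # Hedged sketch of BriaFiboPipeline usage; "briaai/FIBO" and the prompt are
      # placeholders, and the call signature is assumed to match other diffusers
      # text-to-image pipelines.
      import torch
      from diffusers import BriaFiboPipeline

      pipe = BriaFiboPipeline.from_pretrained(
          "briaai/FIBO",               # hypothetical checkpoint id
          torch_dtype=torch.bfloat16,
      )
      pipe.enable_model_cpu_offload()  # the PR updates the CPU offload method

      image = pipe(
          prompt="a lighthouse on a cliff at dusk, long exposure",
          max_sequence_length=3000,    # new default named in the commit
          do_patching=False,           # new optional parameter named in the commit
      ).images[0]
      image.save("bria_fibo.png")
      ```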
  5. 27 Oct, 2025 1 commit
  6. 24 Oct, 2025 2 commits
  7. 23 Oct, 2025 1 commit
  8. 22 Oct, 2025 1 commit
    • Prx (#12525) · dd07b19e
      David Bertoin authored
      * rename photon to prx
      
      * rename photon into prx
      
      * Revert .gitignore to state before commit b7fb0fe9d63bf766bbe3c42ac154a043796dd370
      
      * make fix-copies
  9. 21 Oct, 2025 1 commit
  10. 20 Oct, 2025 1 commit
  11. 18 Oct, 2025 1 commit
  12. 17 Oct, 2025 1 commit
  13. 10 Oct, 2025 1 commit
  14. 06 Oct, 2025 1 commit
  15. 05 Oct, 2025 1 commit
  16. 30 Sep, 2025 1 commit
  17. 25 Sep, 2025 1 commit
  18. 24 Sep, 2025 1 commit
    • Context Parallel w/ Ring & Ulysses & Unified Attention (#11941) · dcb6dd9b
      Aryan authored
      
      
      * update
      
      * update
      
      * add coauthor
      Co-Authored-By: Dhruv Nair <dhruv.nair@gmail.com>
      
      * improve test
      
      * handle ip adapter params correctly
      
      * fix chroma qkv fusion test
      
      * fix fastercache implementation
      
      * fix more tests
      
      * fight more tests
      
      * add back set_attention_backend
      
      * update
      
      * update
      
      * make style
      
      * make fix-copies
      
      * make ip adapter processor compatible with attention dispatcher
      
      * refactor chroma as well
      
      * remove rmsnorm assert
      
      * minify and deprecate npu/xla processors
      
      * update
      
      * refactor
      
      * refactor; support flash attention 2 with cp
      
      * fix
      
      * support sage attention with cp
      
      * make torch compile compatible
      
      * update
      
      * refactor
      
      * update
      
      * refactor
      
      * refactor
      
      * add ulysses backward
      
      * try to make dreambooth script work; accelerator backward not playing well
      
      * Revert "try to make dreambooth script work; accelerator backward not playing well"
      
      This reverts commit 768d0ea6fa6a305d12df1feda2afae3ec80aa449.
      
      * workaround compilation problems with triton when doing all-to-all
      
      * support wan
      
      * handle backward correctly
      
      * support qwen
      
      * support ltx
      
      * make fix-copies
      
      * Update src/diffusers/models/modeling_utils.py
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * apply review suggestions
      
      * update docs
      
      * add explanation
      
      * make fix-copies
      
      * add docstrings
      
      * support passing parallel_config to from_pretrained
      
      * apply review suggestions
      
      * make style
      
      * update
      
      * Update docs/source/en/api/parallel.md
      Co-authored-by: Aryan <aryan@huggingface.co>
      
      * up
      
      ---------
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
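
      A hedged sketch of the context-parallel flow described above. The commits confirm that `parallel_config` can be passed to `from_pretrained` (wired through `modeling_utils.py`) and that `set_attention_backend` was added back; the class and argument names `ContextParallelConfig`, `ring_degree`, `ulysses_degree`, the backend string, and the Flux checkpoint are assumptions here, so treat `docs/source/en/api/parallel.md` from this PR as the authoritative reference.

      ```python
      # Assumed context-parallel usage; ContextParallelConfig, ring_degree/ulysses_degree,
      # the "flash" backend string, and the Flux checkpoint are illustrative guesses.
      # Launch with: torchrun --nproc-per-node=2 run_cp.py
      import torch
      import torch.distributed as dist
      from diffusers import ContextParallelConfig, FluxPipeline, FluxTransformer2DModel

      dist.init_process_group(backend="nccl")
      rank = dist.get_rank()
      torch.cuda.set_device(rank)

      # The PR routes parallel_config through ModelMixin.from_pretrained (modeling_utils.py).
      transformer = FluxTransformer2DModel.from_pretrained(
          "black-forest-labs/FLUX.1-dev",
          subfolder="transformer",
          torch_dtype=torch.bfloat16,
          parallel_config=ContextParallelConfig(ring_degree=2),  # or ulysses_degree=2
      )
      transformer.set_attention_backend("flash")  # "add back set_attention_backend"

      pipe = FluxPipeline.from_pretrained(
          "black-forest-labs/FLUX.1-dev",
          transformer=transformer,
          torch_dtype=torch.bfloat16,
      ).to(f"cuda:{rank}")

      image = pipe("a tiny robot watering a plant", num_inference_steps=28).images[0]
      if rank == 0:
          image.save("flux_cp.png")
      dist.destroy_process_group()
      ```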
  19. 23 Sep, 2025 1 commit
  20. 21 Sep, 2025 1 commit
  21. 16 Sep, 2025 1 commit
  22. 09 Sep, 2025 1 commit
  23. 08 Sep, 2025 1 commit
  24. 03 Sep, 2025 2 commits
  25. 31 Aug, 2025 1 commit
  26. 28 Aug, 2025 2 commits
  27. 26 Aug, 2025 1 commit
  28. 22 Aug, 2025 1 commit
  29. 20 Aug, 2025 1 commit
    • Bria 3 2 pipeline (#12010) · 7993be9e
      galbria authored
      
      
      * Add Bria model and pipeline to diffusers
      
      - Introduced `BriaTransformer2DModel` and `BriaPipeline` for enhanced image generation capabilities.
      - Updated import structures across various modules to include the new Bria components.
      - Added utility functions and output classes specific to the Bria pipeline.
      - Implemented tests for the Bria pipeline to ensure functionality and output integrity.
      
      * with working tests
      
      * style and quality pass
      
      * adding docs
      
      * add to overview
      
      * fixes from "make fix-copies"
      
      * Refactor transformer_bria.py and pipeline_bria.py: Introduce new EmbedND class for rotary position embedding, and enhance Timestep and TimestepProjEmbeddings classes. Add utility functions for handling negative prompts and generating original sigmas in pipeline_bria.py.
      
      * remove redundant and duplicate tests and fix bf16 slow test
      
      * style fixes
      
      * small doc update
      
      * Enhance Bria 3.2 documentation and implementation
      
      - Updated the GitHub repository link for Bria 3.2.
      - Added usage instructions for the gated model access.
      - Introduced the BriaTransformerBlock and BriaAttention classes to the model architecture.
      - Refactored existing classes to integrate Bria-specific components, including BriaEmbedND and BriaPipeline.
      - Updated the pipeline output class to reflect Bria-specific functionality.
      - Adjusted test cases to align with the new Bria model structure.
      
      * Refactor Bria model components and update documentation
      
      - Removed outdated inference example from Bria 3.2 documentation.
      - Introduced the BriaTransformerBlock class to enhance model architecture.
      - Updated attention handling to use `attention_kwargs` instead of `joint_attention_kwargs`.
      - Improved import structure in the Bria pipeline to handle optional dependencies.
      - Adjusted test cases to reflect changes in model dtype assertions.
      
      * Update Bria model reference in documentation to reflect new file naming convention
      
      * Update docs/source/en/_toctree.yml
      
      * Refactor BriaPipeline to inherit from DiffusionPipeline instead of FluxPipeline, updating imports accordingly.
      
      * move the __call__ func to the end of file
      
      * Update BriaPipeline example to use bfloat16 for precision sensitivity for better result
      
      * make style && make quality && make fix-copies
      
      ---------
      Co-authored-by: Linoy Tsaban <57615435+linoytsaban@users.noreply.github.com>
      Co-authored-by: Aryan <contact.aryanvs@gmail.com>
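
      A short usage sketch matching the commits above: the model is gated and the example was switched to bfloat16 for precision sensitivity. The repo id `briaai/BRIA-3.2` and the prompts are placeholders, and the call signature is assumed to follow the usual diffusers text-to-image pattern.

      ```python
      # Hedged sketch of BriaPipeline usage; "briaai/BRIA-3.2" is a placeholder for the
      # gated checkpoint (accept the license on the Hub and run `huggingface-cli login` first).
      import torch
      from diffusers import BriaPipeline

      pipe = BriaPipeline.from_pretrained(
          "briaai/BRIA-3.2",
          torch_dtype=torch.bfloat16,  # the PR updates the example to bfloat16
      )
      pipe.to("cuda")

      image = pipe(
          prompt="a studio portrait of a lighthouse keeper",
          negative_prompt="blurry, low quality",  # negative-prompt utilities are added in pipeline_bria.py
      ).images[0]
      image.save("bria_3_2.png")
      ```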
  30. 17 Aug, 2025 1 commit
    • Qwen Image Edit Support (#12164) · e682af20
      naykun authored
      * feat(qwen-image):
      add qwen-image-edit support
      
      * fix(qwen image):
      - compatible with torch.compile in new rope setting
      - fix init import
      - add prompt truncation in img2img and inpaint pipe
      - remove unused logic and comment
      - add copy statement
      - guard logic for rope video shape tuple
      
      * fix(qwen image):
      - make fix-copies
      - update doc
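
      A hedged sketch of the edit flow added here; the pipeline class name `QwenImageEditPipeline`, the repo id `Qwen/Qwen-Image-Edit`, and the input file are assumptions based on the commit messages, which also note that over-long prompts are truncated in the img2img and inpaint pipelines.

      ```python
      # Assumed Qwen image-edit usage; the class name, repo id, and input file are placeholders.
      import torch
      from diffusers import QwenImageEditPipeline
      from diffusers.utils import load_image

      pipe = QwenImageEditPipeline.from_pretrained(
          "Qwen/Qwen-Image-Edit",
          torch_dtype=torch.bfloat16,
      ).to("cuda")

      init_image = load_image("input.png")  # local path or URL of the image to edit

      edited = pipe(
          image=init_image,
          prompt="make the cat in the photo wear a tiny wizard hat",
          num_inference_steps=30,
      ).images[0]
      edited.save("qwen_image_edit.png")
      ```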
  31. 14 Aug, 2025 3 commits
  32. 13 Aug, 2025 2 commits
  33. 08 Aug, 2025 1 commit