1. 28 Oct, 2025 1 commit
  2. 27 Oct, 2025 2 commits
  3. 24 Oct, 2025 1 commit
  4. 22 Oct, 2025 1 commit
      Prx (#12525) · dd07b19e
      David Bertoin authored
      * rename photon to prx
      
      * rename photon into prx
      
      * Revert .gitignore to state before commit b7fb0fe9d63bf766bbe3c42ac154a043796dd370
      
      * rename photon to prx
      
      * rename photon into prx
      
      * Revert .gitignore to state before commit b7fb0fe9d63bf766bbe3c42ac154a043796dd370
      
      * make fix-copies
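      A minimal sketch of what the rename means for user code, assuming the model now loads under a "prx"-named pipeline class via the generic DiffusionPipeline entry point. The checkpoint id below is hypothetical; the commits only say "rename photon to prx", so it is an assumption used purely to illustrate the renamed namespace.

          import torch
          from diffusers import DiffusionPipeline

          # "Photoroom/prx" is a hypothetical repo id standing in for the
          # renamed checkpoint; the exact name is an assumption.
          pipe = DiffusionPipeline.from_pretrained(
              "Photoroom/prx", torch_dtype=torch.bfloat16
          ).to("cuda")

          # from_pretrained resolves the concrete pipeline class from the
          # checkpoint's model_index.json, so calling code is unaffected by
          # the photon -> prx class rename.
          image = pipe("a photo of a lighthouse at dusk").images[0]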
  5. 21 Oct, 2025 3 commits
  6. 18 Oct, 2025 1 commit
  7. 17 Oct, 2025 1 commit
  8. 16 Oct, 2025 1 commit
  9. 15 Oct, 2025 2 commits
  10. 14 Oct, 2025 2 commits
  11. 13 Oct, 2025 1 commit
  12. 11 Oct, 2025 1 commit
  13. 30 Sep, 2025 2 commits
  14. 29 Sep, 2025 3 commits
  15. 26 Sep, 2025 1 commit
  16. 24 Sep, 2025 2 commits
      Introduce cache-dit to community optimization (#12366) · 310fdaf5
      DefTruth authored
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * misc: update examples link
      
      * misc: update examples link
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * docs: introduce cache-dit to diffusers
      
      * Refine documentation for CacheDiT features
      
      Updated the wording for clarity and consistency in the documentation. Adjusted sections on cache acceleration, automatic block adapter, patch functor, and hybrid cache configuration.
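      A minimal sketch of the cache-dit integration these docs describe, assuming cache-dit's advertised one-line API. The package name, the enable_cache call, and the checkpoint are taken from cache-dit's own README rather than from this PR, so treat them as assumptions.

          import torch
          from diffusers import DiffusionPipeline

          import cache_dit  # pip install cache-dit (assumed package name)

          pipe = DiffusionPipeline.from_pretrained(
              "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
          ).to("cuda")

          # Enable training-free cache acceleration on the pipeline's DiT.
          # The doc sections named in the commit (automatic block adapter,
          # patch functor, hybrid cache configuration) cover finer-grained
          # knobs; the library defaults are used here.
          cache_dit.enable_cache(pipe)

          image = pipe("a cinematic photo of a fox", num_inference_steps=28).images[0]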
      Context Parallel w/ Ring & Ulysses & Unified Attention (#11941) · dcb6dd9b
      Aryan authored
      
      
      * update
      
      * update
      
      * add coauthor
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * improve test
      
      * handle ip adapter params correctly
      
      * fix chroma qkv fusion test
      
      * fix fastercache implementation
      
      * fix more tests
      
      * fight more tests
      
      * add back set_attention_backend
      
      * update
      
      * update
      
      * make style
      
      * make fix-copies
      
      * make ip adapter processor compatible with attention dispatcher
      
      * refactor chroma as well
      
      * remove rmsnorm assert
      
      * minify and deprecate npu/xla processors
      
      * update
      
      * refactor
      
      * refactor; support flash attention 2 with cp
      
      * fix
      
      * support sage attention with cp
      
      * make torch compile compatible
      
      * update
      
      * refactor
      
      * update
      
      * refactor
      
      * refactor
      
      * add ulysses backward
      
      * try to make dreambooth script work; accelerator backward not playing well
      
      * Revert "try to make dreambooth script work; accelerator backward not playing well"
      
      This reverts commit 768d0ea6fa6a305d12df1feda2afae3ec80aa449.
      
      * workaround compilation problems with triton when doing all-to-all
      
      * support wan
      
      * handle backward correctly
      
      * support qwen
      
      * support ltx
      
      * make fix-copies
      
      * Update src/diffusers/models/modeling_utils.py
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * apply review suggestions
      
      * update docs
      
      * add explanation
      
      * make fix-copies
      
      * add docstrings
      
      * support passing parallel_config to from_pretrained
      
      * apply review suggestions
      
      * make style
      
      * update
      
      * Update docs/source/en/api/parallel.md
      Co-authored-by: Aryan <aryan@huggingface.co>
      
      * up
      
      ---------
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
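      A condensed sketch of the context-parallel API this PR introduces, pieced together from the commit messages: ring/Ulysses degrees, a parallel_config accepted by from_pretrained ("support passing parallel_config to from_pretrained"), and the restored set_attention_backend. Exact class and argument spellings are assumptions and should be checked against docs/source/en/api/parallel.md.

          # Launch with: torchrun --nproc-per-node=2 infer.py
          import torch
          import torch.distributed as dist
          from diffusers import AutoModel, ContextParallelConfig

          dist.init_process_group("nccl")
          torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

          # Wan is among the models the commits list as supported; the
          # ring_degree/ulysses_degree arguments mirror the PR title's
          # Ring & Ulysses attention variants.
          transformer = AutoModel.from_pretrained(
              "Wan-AI/Wan2.1-T2V-1.3B-Diffusers",
              subfolder="transformer",
              torch_dtype=torch.bfloat16,
              parallel_config=ContextParallelConfig(ring_degree=2),  # or ulysses_degree=2
          )

          # Context parallelism runs through the attention dispatcher, which
          # per the commits also backs flash-attention-2 and sage attention;
          # the backend name "flash" is an assumption.
          transformer.set_attention_backend("flash")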
  17. 23 Sep, 2025 2 commits
  18. 22 Sep, 2025 1 commit
  19. 10 Sep, 2025 2 commits
  20. 08 Sep, 2025 1 commit
  21. 05 Sep, 2025 2 commits
  22. 04 Sep, 2025 1 commit
  23. 03 Sep, 2025 2 commits
      [Quantization] Add TRT-ModelOpt as a Backend (#11173) · 4acbfbf1
      Ishan Modi authored
      
      
      * initial commit
      
      * update
      
      * updates
      
      * update
      
      * update
      
      * update
      
      * update
      
      * update
      
      * update
      
      * addressed PR comments
      
      * update
      
      * addressed PR comments
      
      * update
      
      * update
      
      * update
      
      * update
      
      * update
      
      * update
      
      * updates
      
      * update
      
      * update
      
      * addressed PR comments
      
      * updates
      
      * code formatting
      
      * update
      
      * addressed PR comments
      
      * addressed PR comments
      
      * addressed PR comments
      
      * addressed PR comments
      
      * fix docs and dependencies
      
      * fixed dependency test
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
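      A minimal sketch of how a quantization backend such as TRT-ModelOpt plugs into diffusers, following the established quantization_config pattern used by the other backends. The config class name NVIDIAModelOptConfig and the quant_type value are assumptions, not verified against the merged PR.

          import torch
          from diffusers import AutoModel, NVIDIAModelOptConfig

          # Each diffusers quantization backend is driven by a config object
          # handed to from_pretrained; the backend then quantizes the model's
          # weights as they are loaded.
          quant_config = NVIDIAModelOptConfig(quant_type="FP8")  # assumed name/arg

          transformer = AutoModel.from_pretrained(
              "black-forest-labs/FLUX.1-dev",
              subfolder="transformer",
              quantization_config=quant_config,
              torch_dtype=torch.bfloat16,
          )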
      [docs] AutoPipeline (#12160) · 6549b04e
      Steven Liu authored
      * refresh
      
      * feedback
      
      * feedback
      
      * supported models
      
      * fix
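      The refreshed docs cover the task-based AutoPipeline entry points; a minimal example of the pattern, using the stock diffusers API:

          import torch
          from diffusers import AutoPipelineForText2Image

          # AutoPipelineForText2Image inspects the checkpoint and instantiates
          # the matching task-specific pipeline (the SDXL pipeline here), so
          # the same call works across model families.
          pipe = AutoPipelineForText2Image.from_pretrained(
              "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
          ).to("cuda")

          image = pipe("a watercolor painting of a fox in a forest").images[0]

      AutoPipelineForImage2Image and AutoPipelineForInpainting follow the same pattern for their respective tasks.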
  24. 02 Sep, 2025 1 commit
  25. 31 Aug, 2025 1 commit
  26. 28 Aug, 2025 1 commit
  27. 27 Aug, 2025 1 commit