1. 13 Sep, 2024 1 commit
  2. 03 Sep, 2024 1 commit
      Xlabs lora fix (#9348) · 1c1ccaa0
      Vishnu V Jaddipal authored
      
      
* Fix `from_single_file` for xl_inpaint
      
      * Add basic flux inpaint pipeline
      
      * style, quality, stray print
      
      * Fix stray changes
      
      * Add inpainting model support
      
      * Change lora conversion for xlabs
      
      * Fix stray changes
      
      * Apply suggestions from code review
      
      * style
      
      ---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
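The "Change lora conversion for xlabs" step above renames checkpoint keys into the naming the loader expects. A minimal, hypothetical sketch of such a key conversion is below; the source pattern mirrors XLabs-style Flux LoRA checkpoints, but the target names are illustrative assumptions, not the exact diffusers mapping.

```python
# Hypothetical sketch of an XLabs-style LoRA key conversion.
# The source key pattern mirrors XLabs Flux LoRA checkpoints; the
# target naming below is illustrative, not the exact diffusers mapping.
def convert_xlabs_lora_key(key: str) -> str:
    # "down"/"up" halves of a low-rank pair map to PEFT-style
    # "lora_A"/"lora_B" names.
    key = key.replace(".down.weight", ".lora_A.weight")
    key = key.replace(".up.weight", ".lora_B.weight")
    # XLabs prefixes transformer blocks as "double_blocks.N.";
    # rename to a "transformer_blocks.N." style prefix.
    key = key.replace("double_blocks.", "transformer_blocks.")
    return key


print(convert_xlabs_lora_key("double_blocks.0.processor.proj_lora1.down.weight"))
# → transformer_blocks.0.processor.proj_lora1.lora_A.weight
```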
  3. 30 Aug, 2024 1 commit
  4. 29 Aug, 2024 1 commit
  5. 26 Aug, 2024 1 commit
  6. 23 Aug, 2024 1 commit
  7. 22 Aug, 2024 2 commits
  8. 21 Aug, 2024 1 commit
  9. 20 Aug, 2024 1 commit
  10. 17 Aug, 2024 1 commit
  11. 07 Aug, 2024 2 commits
  12. 05 Aug, 2024 1 commit
      [FLUX] support LoRA (#9057) · fc6a91e3
      Sayak Paul authored
      * feat: lora support for Flux.
      
      add tests
      
      fix imports
      
      major fixes.
      
      * fix
      
      fixes
      
      final fixes?
      
      * fix
      
      * remove is_peft_available.
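The LoRA support added above ultimately rests on a low-rank weight update. A minimal numerical sketch of what fusing and unfusing a LoRA means is below; shapes and the scale convention are illustrative assumptions, not the diffusers implementation.

```python
import numpy as np

# Minimal sketch of the LoRA update that fuse/unfuse-style code works
# with: W' = W + scale * (B @ A), where A and B are low-rank factors.
# Shapes and the scale convention are illustrative, not the exact
# diffusers implementation.
rng = np.random.default_rng(0)
d_out, d_in, rank = 8, 8, 2
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((rank, d_in))   # "lora_A" / down projection
B = rng.standard_normal((d_out, rank))  # "lora_B" / up projection
scale = 0.5

W_fused = W + scale * (B @ A)           # fuse: fold the update in
W_unfused = W_fused - scale * (B @ A)   # unfuse: subtract it back out

print(np.allclose(W_unfused, W))  # → True
```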
  13. 30 Jul, 2024 1 commit
  14. 26 Jul, 2024 2 commits
      [Kolors] Add IP Adapter (#8901) · 73acebb8
      Álvaro Somoza authored
      * initial draft
      
      * apply suggestions
      
      * fix failing test
      
      * added ipa to img2img
      
      * add docs
      
      * apply suggestions
      [Chore] add `LoraLoaderMixin` to the inits (#8981) · d87fe95f
      Sayak Paul authored
      
      
* introduce LoraBaseMixin to promote reusability.
      
      * up
      
      * add more tests
      
      * up
      
      * remove comments.
      
      * fix fuse_nan test
      
      * clarify the scope of fuse_lora and unfuse_lora
      
      * remove space
      
      * rewrite fuse_lora a bit.
      
      * feedback
      
      * copy over load_lora_into_text_encoder.
      
      * address dhruv's feedback.
      
      * fix-copies
      
      * fix issubclass.
      
      * num_fused_loras
      
      * fix
      
      * fix
      
      * remove mapping
      
      * up
      
      * fix
      
      * style
      
      * fix-copies
      
      * change to SD3TransformerLoRALoadersMixin
      
      * Apply suggestions from code review
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * up
      
      * handle wuerstchen
      
      * up
      
      * move lora to lora_pipeline.py
      
      * up
      
      * fix-copies
      
      * fix documentation.
      
      * comment set_adapters().
      
      * fix-copies
      
      * fix set_adapters() at the model level.
      
      * fix?
      
      * fix
      
      * loraloadermixin.
      
      ---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
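The mixin work above centralises fuse/unfuse bookkeeping so pipelines can share it. A hedged sketch of that shape is below; the method names echo the commit messages (fuse_lora, unfuse_lora, num_fused_loras), but the body is an illustration, not the diffusers implementation.

```python
# Hedged sketch of a LoraBaseMixin-style helper. Method and attribute
# names echo the commit messages above; the bodies are illustrative,
# not the diffusers implementation.
class LoraBaseMixinSketch:
    def __init__(self):
        self.num_fused_loras = 0

    def fuse_lora(self, lora_scale: float = 1.0):
        # A real implementation would fold each adapter's low-rank
        # update into the base weights here.
        self.num_fused_loras += 1

    def unfuse_lora(self):
        if self.num_fused_loras == 0:
            raise ValueError("No LoRA is currently fused.")
        self.num_fused_loras -= 1


pipe = LoraBaseMixinSketch()
pipe.fuse_lora(lora_scale=0.8)
print(pipe.num_fused_loras)  # → 1
pipe.unfuse_lora()
print(pipe.num_fused_loras)  # → 0
```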
  15. 25 Jul, 2024 2 commits
      Revert "[LoRA] introduce LoraBaseMixin to promote reusability." (#8976) · 62863bb1
      YiYi Xu authored
      Revert "[LoRA] introduce LoraBaseMixin to promote reusability. (#8774)"
      
      This reverts commit 527430d0.
      [LoRA] introduce LoraBaseMixin to promote reusability. (#8774) · 527430d0
      Sayak Paul authored
      
      
* introduce LoraBaseMixin to promote reusability.
      
      * up
      
      * add more tests
      
      * up
      
      * remove comments.
      
      * fix fuse_nan test
      
      * clarify the scope of fuse_lora and unfuse_lora
      
      * remove space
      
      * rewrite fuse_lora a bit.
      
      * feedback
      
      * copy over load_lora_into_text_encoder.
      
      * address dhruv's feedback.
      
      * fix-copies
      
      * fix issubclass.
      
      * num_fused_loras
      
      * fix
      
      * fix
      
      * remove mapping
      
      * up
      
      * fix
      
      * style
      
      * fix-copies
      
      * change to SD3TransformerLoRALoadersMixin
      
      * Apply suggestions from code review
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * up
      
      * handle wuerstchen
      
      * up
      
      * move lora to lora_pipeline.py
      
      * up
      
      * fix-copies
      
      * fix documentation.
      
      * comment set_adapters().
      
      * fix-copies
      
      * fix set_adapters() at the model level.
      
      * fix?
      
      * fix
      
      ---------
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
  16. 18 Jul, 2024 1 commit
  17. 12 Jul, 2024 1 commit
  18. 05 Jul, 2024 1 commit
  19. 03 Jul, 2024 2 commits
  20. 01 Jul, 2024 1 commit
  21. 28 Jun, 2024 1 commit
  22. 27 Jun, 2024 1 commit
  23. 26 Jun, 2024 2 commits
  24. 25 Jun, 2024 2 commits
  25. 22 Jun, 2024 1 commit
  26. 21 Jun, 2024 1 commit
  27. 20 Jun, 2024 1 commit
  28. 19 Jun, 2024 1 commit
  29. 18 Jun, 2024 2 commits
  30. 13 Jun, 2024 1 commit
  31. 12 Jun, 2024 1 commit
  32. 05 Jun, 2024 1 commit
      [LoRA] Remove legacy LoRA code and related adjustments (#8316) · a0542c19
      Sayak Paul authored
      * remove legacy code from load_attn_procs.
      
      * finish first draft
      
      * fix more.
      
      * fix more
      
      * add test
      
      * add serialization support.
      
      * fix-copies
      
      * require peft backend for lora tests
      
      * style
      
      * fix test
      
      * fix loading.
      
      * empty
      
      * address benjamin's feedback.