1. 08 Mar, 2025 1 commit
  2. 04 Mar, 2025 2 commits
  3. 20 Feb, 2025 1 commit
  4. 15 Jan, 2025 1 commit
  5. 09 Jan, 2025 1 commit
  6. 07 Jan, 2025 1 commit
  7. 06 Jan, 2025 1 commit
  8. 02 Jan, 2025 1 commit
  9. 25 Dec, 2024 1 commit
  10. 20 Dec, 2024 1 commit
  11. 19 Dec, 2024 1 commit
  12. 18 Dec, 2024 1 commit
      [LoRA] feat: lora support for SANA. (#10234) · 9408aa2d
      Sayak Paul authored
      
      
      * feat: lora support for SANA.
      
      * make fix-copies
      
      * rename test class.
      
      * attention_kwargs -> cross_attention_kwargs.
      
      * Revert "attention_kwargs -> cross_attention_kwargs."
      
      This reverts commit 23433bf9bccc12e0f2f55df26bae58a894e8b43b.
      
      * exhaust 119 max line limit
      
      * sana lora fine-tuning script.
      
      * readme
      
      * add a note about the supported models.
      
      * Apply suggestions from code review
      Co-authored-by: Aryan <aryan@huggingface.co>
      
      * style
      
      * docs for attention_kwargs.
      
      * remove lora_scale from pag pipeline.
      
      * copy fix
      
      ---------
      Co-authored-by: Aryan <aryan@huggingface.co>
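The `attention_kwargs` / `lora_scale` items above concern how a runtime scale factor modulates the low-rank update at inference. A minimal, illustrative sketch of that mechanism in plain Python (not the diffusers implementation; `lora_forward` and its parameters are hypothetical names):

```python
# Sketch of the LoRA forward pass: y = (W + scale * (alpha / r) * B @ A) @ x,
# where `scale` plays the role of the lora_scale passed via attention_kwargs.
def matmul(X, Y):
    """Naive matrix product of two nested-list matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

def lora_forward(x, W, A, B, alpha, rank, scale=1.0):
    """Apply base weight W plus a scaled low-rank delta B @ A to vector x."""
    delta = matmul(B, A)              # (out, in) low-rank update
    s = scale * alpha / rank          # effective scaling factor
    W_eff = [[W[i][j] + s * delta[i][j] for j in range(len(W[0]))]
             for i in range(len(W))]
    return [sum(W_eff[i][j] * x[j] for j in range(len(x)))
            for i in range(len(W_eff))]
```

With `scale=0.0` the adapter contributes nothing and the base weight is recovered, which is exactly why a per-call scale is useful at inference.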
  13. 17 Dec, 2024 2 commits
  14. 15 Dec, 2024 1 commit
  15. 11 Dec, 2024 1 commit
  16. 10 Dec, 2024 1 commit
  17. 20 Nov, 2024 1 commit
  18. 19 Nov, 2024 1 commit
  19. 02 Nov, 2024 1 commit
  20. 16 Oct, 2024 1 commit
  21. 09 Oct, 2024 1 commit
  22. 08 Oct, 2024 1 commit
      [LoRA] Handle DoRA better (#9547) · 02eeb8e7
      Sayak Paul authored
      * handle dora.
      
      * print test
      
      * debug
      
      * fix
      
      * fix-copies
      
      * update logits
      
      * add warning in the test.
      
      * make is_dora check consistent.
      
      * fix-copies
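The `is_dora` check above distinguishes DoRA adapters, which recompose the weight from a learned magnitude vector and a normalized direction instead of adding a plain delta. A hedged, list-based sketch of that recomposition (the real handling lives in peft; `dora_weight` is an illustrative name):

```python
import math

# DoRA recomposition sketch: W' = m * (W + B @ A) / ||W + B @ A||_col,
# where m is a per-column magnitude vector and the norm is column-wise L2.
def dora_weight(W, delta, m):
    """Combine base weight W with low-rank delta, renormalize per column,
    then rescale each column by its learned magnitude m[j]."""
    rows, cols = len(W), len(W[0])
    combined = [[W[i][j] + delta[i][j] for j in range(cols)] for i in range(rows)]
    norms = [math.sqrt(sum(combined[i][j] ** 2 for i in range(rows)))
             for j in range(cols)]
    return [[m[j] * combined[i][j] / norms[j] for j in range(cols)]
            for i in range(rows)]
```

Because the fused weight depends on the norm of `W + B @ A`, DoRA cannot be handled by simply adding a delta, which is why it needs the dedicated code path the commit refers to.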
  23. 19 Sep, 2024 1 commit
      [training] CogVideoX Lora (#9302) · 2b443a5d
      Aryan authored
      
      
      * cogvideox lora training draft
      
      * update
      
      * update
      
      * update
      
      * update
      
      * update
      
      * make fix-copies
      
      * update
      
      * update
      
      * apply suggestions from review
      
      * apply suggestions from review
      
      * fix typo
      
      * Update examples/cogvideo/train_cogvideox_lora.py
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * fix lora alpha
      
      * use correct lora scaling for final test pipeline
      
      * Update examples/cogvideo/train_cogvideox_lora.py
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * apply suggestions from review; prodigy optimizer
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * add tests
      
      * make style
      
      * add README
      
      * update
      
      * update
      
      * make style
      
      * fix
      
      * update
      
      * add test skeleton
      
      * revert lora utils changes
      
      * add cleaner modifications to lora testing utils
      
      * update lora tests
      
      * deepspeed stuff
      
      * add requirements.txt
      
      * deepspeed refactor
      
      * add lora stuff to img2vid pipeline to fix tests
      
      * fight tests
      
      * add co-authors
      Co-Authored-By: Fu-Yun Wang <1697256461@qq.com>
      Co-Authored-By: zR <2448370773@qq.com>
      
      * fight lora runner tests
      
      * import Dummy optim and scheduler only when required
      
      * update docs
      
      * add coauthors
      Co-Authored-By: Fu-Yun Wang <1697256461@qq.com>
      
      * remove option to train text encoder
      Co-Authored-By: bghira <bghira@users.github.com>
      
      * update tests
      
      * fight more tests
      
      * update
      
      * fix vid2vid
      
      * fix typo
      
      * remove lora tests; todo in follow-up PR
      
      * undo img2vid changes
      
      * remove text encoder related changes in lora loader mixin
      
      * Revert "remove text encoder related changes in lora loader mixin"
      
      This reverts commit f8a8444487db27859be812866db4e8cec7f25691.
      
      * update
      
      * round 1 of fighting tests
      
      * round 2 of fighting tests
      
      * fix copied from comment
      
      * fix typo in lora test
      
      * update styling
      Co-Authored-By: YiYi Xu <yixu310@gmail.com>
      
      ---------
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      Co-authored-by: zR <2448370773@qq.com>
      Co-authored-by: Fu-Yun Wang <1697256461@qq.com>
      Co-authored-by: bghira <bghira@users.github.com>
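Several steps above (removing the text-encoder training option, reverting the loader changes) come down to which parameters the training script actually optimizes: base weights are frozen and only the injected adapter weights train. A hedged sketch of that partitioning, with hypothetical parameter names:

```python
# Illustrative LoRA fine-tuning setup: keep only adapter parameters
# trainable, identified here by a "lora" marker in the parameter name
# (an assumption for this sketch, not the script's actual convention).
def split_trainable(named_params):
    """Partition parameter names into (trainable LoRA params, frozen base params)."""
    trainable, frozen = [], []
    for name in named_params:
        (trainable if "lora" in name else frozen).append(name)
    return trainable, frozen
```

Only the `trainable` set would be handed to the optimizer, which is what makes LoRA fine-tuning so much cheaper than full fine-tuning.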
  24. 29 Aug, 2024 1 commit
  25. 22 Aug, 2024 2 commits
  26. 05 Aug, 2024 1 commit
      [FLUX] support LoRA (#9057) · fc6a91e3
      Sayak Paul authored
      * feat: lora support for Flux.
      
      add tests
      
      fix imports
      
      major fixes.
      
      * fix
      
      fixes
      
      final fixes?
      
      * fix
      
      * remove is_peft_available.
  27. 26 Jul, 2024 1 commit
      [Chore] add `LoraLoaderMixin` to the inits (#8981) · d87fe95f
      Sayak Paul authored
      
      
      * introduce LoraBaseMixin to promote reusability.
      
      * up
      
      * add more tests
      
      * up
      
      * remove comments.
      
      * fix fuse_nan test
      
      * clarify the scope of fuse_lora and unfuse_lora
      
      * remove space
      
      * rewrite fuse_lora a bit.
      
      * feedback
      
      * copy over load_lora_into_text_encoder.
      
      * address dhruv's feedback.
      
      * fix-copies
      
      * fix issubclass.
      
      * num_fused_loras
      
      * fix
      
      * fix
      
      * remove mapping
      
      * up
      
      * fix
      
      * style
      
      * fix-copies
      
      * change to SD3TransformerLoRALoadersMixin
      
      * Apply suggestions from code review
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * up
      
      * handle wuerstchen
      
      * up
      
      * move lora to lora_pipeline.py
      
      * up
      
      * fix-copies
      
      * fix documentation.
      
      * comment set_adapters().
      
      * fix-copies
      
      * fix set_adapters() at the model level.
      
      * fix?
      
      * fix
      
      * loraloadermixin.
      
      ---------
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
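The "clarify the scope of fuse_lora and unfuse_lora" and `num_fused_loras` items above concern merging the low-rank delta into the base weight for faster inference, and undoing that merge later. A list-based sketch of the idea (the actual mixin operates on torch modules; `FusableWeight` is an illustrative name):

```python
# Conceptual fuse/unfuse: fusing adds scale * delta into the base weight
# in place; unfusing subtracts it back out, and a counter tracks how many
# adapters are currently fused.
class FusableWeight:
    def __init__(self, W):
        self.W = [row[:] for row in W]   # copy the base weight
        self.num_fused_loras = 0

    def fuse_lora(self, delta, scale=1.0):
        """Merge a precomputed low-rank delta (B @ A) into the base weight."""
        for i, row in enumerate(delta):
            for j, d in enumerate(row):
                self.W[i][j] += scale * d
        self.num_fused_loras += 1

    def unfuse_lora(self, delta, scale=1.0):
        """Subtract the same delta to restore the original base weight."""
        for i, row in enumerate(delta):
            for j, d in enumerate(row):
                self.W[i][j] -= scale * d
        self.num_fused_loras -= 1
```

After fusing, the forward pass needs no extra matmul for the adapter; unfusing must use the same delta and scale, which is why the mixin has to track fused state carefully.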
  28. 25 Jul, 2024 2 commits
      Revert "[LoRA] introduce LoraBaseMixin to promote reusability." (#8976) · 62863bb1
      YiYi Xu authored
      Revert "[LoRA] introduce LoraBaseMixin to promote reusability. (#8774)"
      
      This reverts commit 527430d0.
      [LoRA] introduce LoraBaseMixin to promote reusability. (#8774) · 527430d0
      Sayak Paul authored
      
      
      * introduce LoraBaseMixin to promote reusability.
      
      * up
      
      * add more tests
      
      * up
      
      * remove comments.
      
      * fix fuse_nan test
      
      * clarify the scope of fuse_lora and unfuse_lora
      
      * remove space
      
      * rewrite fuse_lora a bit.
      
      * feedback
      
      * copy over load_lora_into_text_encoder.
      
      * address dhruv's feedback.
      
      * fix-copies
      
      * fix issubclass.
      
      * num_fused_loras
      
      * fix
      
      * fix
      
      * remove mapping
      
      * up
      
      * fix
      
      * style
      
      * fix-copies
      
      * change to SD3TransformerLoRALoadersMixin
      
      * Apply suggestions from code review
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
      
      * up
      
      * handle wuerstchen
      
      * up
      
      * move lora to lora_pipeline.py
      
      * up
      
      * fix-copies
      
      * fix documentation.
      
      * comment set_adapters().
      
      * fix-copies
      
      * fix set_adapters() at the model level.
      
      * fix?
      
      * fix
      
      ---------
      Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
  29. 18 Jul, 2024 1 commit
  30. 03 Jul, 2024 2 commits
  31. 26 Jun, 2024 1 commit
  32. 25 Jun, 2024 1 commit
  33. 22 Jun, 2024 1 commit
  34. 21 Jun, 2024 1 commit
  35. 20 Jun, 2024 1 commit