1. 14 Jul, 2023 1 commit
    • [Feat] add: utility for unloading lora. (#4034) · 692b7a90
      Sayak Paul authored
      * add: test for testing unloading lora.
      
      * add: reason to skipif.
      
      * initial implementation of lora unload().
      
      * apply styling.
      
      * add: doc.
      
      * change checkpoints.
      
      * reinit generator
      
      * finalize slow test.
      
      * add fast test for unloading lora.
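      A minimal sketch of how the unloading utility added here might be used: load a LoRA into a pipeline, generate, then call unload_lora_weights() to fall back to the base weights. The model ID and LoRA path below are placeholders, and exact behavior depends on the installed diffusers version.

      import torch
      from diffusers import StableDiffusionPipeline

      # Placeholder base checkpoint and LoRA location.
      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")
      pipe.load_lora_weights("path/to/lora")   # attach LoRA layers

      image_with_lora = pipe("a pixel-art cat").images[0]

      pipe.unload_lora_weights()               # drop the LoRA, restoring base weights
      image_without_lora = pipe("a pixel-art cat").images[0]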
  2. 14 Jun, 2023 1 commit
  3. 06 Jun, 2023 1 commit
    • Add draft for lora text encoder scale (#3626) · 74fd735e
      Patrick von Platen authored
      
      
      * Add draft for lora text encoder scale
      
      * Improve naming
      
      * fix: training dreambooth lora script.
      
      * Apply suggestions from code review
      
      * Update examples/dreambooth/train_dreambooth_lora.py
      
      * Apply suggestions from code review
      
      * Apply suggestions from code review
      
      * add lora mixin when fit
      
      * add lora mixin when fit
      
      * add lora mixin when fit
      
      * fix more
      
      * fix more
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
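      A rough sketch of how the LoRA scale drafted here is typically passed at inference time via cross_attention_kwargs. Whether the text-encoder LoRA layers honor the same scale depends on the diffusers version; the model ID and LoRA path are placeholders.

      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")
      pipe.load_lora_weights("path/to/dreambooth-lora")  # placeholder LoRA

      # scale=0.0 effectively disables the LoRA, scale=1.0 applies it in full.
      image = pipe(
          "a photo of sks dog in a bucket",
          cross_attention_kwargs={"scale": 0.5},
      ).images[0]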
  4. 02 Jun, 2023 1 commit
    • Support Kohya-ss style LoRA file format (in a limited capacity) (#3437) · 8e552bb4
      Takuma Mori authored
      
      
      * add _convert_kohya_lora_to_diffusers
      
      * make style
      
      * add scaffold
      
      * match result: unet attention only
      
      * fix monkey-patch for text_encoder
      
      * with CLIPAttention
      
      While the terrible images are no longer produced,
      the results do not match those from the hook version.
      This may be due to not setting the network_alpha value.
      
      * add to support network_alpha
      
      * generate diff image
      
      * fix monkey-patch for text_encoder
      
      * add test_text_encoder_lora_monkey_patch()
      
      * verify that it's okay to release the attn_procs
      
      * fix closure version
      
      * add comment
      
      * Revert "fix monkey-patch for text_encoder"
      
      This reverts commit bb9c61e6faecc1935c9c4319c77065837655d616.
      
      * Fix to reuse utility functions
      
      * make LoRAAttnProcessor targets to self_attn
      
      * fix LoRAAttnProcessor target
      
      * make style
      
      * fix split key
      
      * Update src/diffusers/loaders.py
      
      * remove TEXT_ENCODER_TARGET_MODULES loop
      
      * add print memory usage
      
      * remove test_kohya_loras_scaffold.py
      
      * add: doc on LoRA civitai
      
      * remove print statement and refactor in the doc.
      
      * fix state_dict test for kohya-ss style lora
      
      * Apply suggestions from code review
      Co-authored-by: Takuma Mori <takuma104@gmail.com>
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
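      A sketch of how a Kohya-ss / Civitai style .safetensors LoRA might be loaded once this support is in place; the file name is a placeholder, and load_lora_weights converts the Kohya key layout internally (the _convert_kohya_lora_to_diffusers step mentioned above).

      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      # Placeholder file downloaded from Civitai (Kohya-ss key layout).
      pipe.load_lora_weights(".", weight_name="some_kohya_lora.safetensors")
      image = pipe("masterpiece, best quality, 1girl, solo").images[0]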
  5. 21 May, 2023 1 commit
  6. 04 May, 2023 1 commit
  7. 03 May, 2023 1 commit
  8. 28 Apr, 2023 1 commit
  9. 21 Apr, 2023 1 commit
  10. 20 Apr, 2023 1 commit
  11. 06 Mar, 2023 1 commit
  12. 30 Jan, 2023 1 commit
  13. 25 Jan, 2023 1 commit