1. 14 Jul, 2023 1 commit
    • [Feat] add: utility for unloading lora. (#4034) · 692b7a90
      Sayak Paul authored
      * add: test for unloading lora.
      
      * add: reason to skipif.
      
      * initial implementation of lora unload().
      
      * apply styling.
      
      * add: doc.
      
      * change checkpoints.
      
      * reinit generator
      
      * finalize slow test.
      
      * add fast test for unloading lora.
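      A minimal usage sketch of the round trip this utility enables; the checkpoint ID and LoRA path below are placeholders:

      ```python
      import torch
      from diffusers import DiffusionPipeline

      pipe = DiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      pipe.load_lora_weights("path/to/lora")  # attach LoRA layers
      lora_image = pipe("a pokemon with blue eyes").images[0]

      pipe.unload_lora_weights()  # restore the original attention processors
      base_image = pipe("a pokemon with blue eyes").images[0]
      ```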
  2. 09 Jul, 2023 1 commit
    • Refactor LoRA (#3778) · c2a28c34
      Will Berman authored
      * refactor to support patching LoRA into T5
      
      instantiate the lora linear layer on the same device as the regular linear layer
      
      get lora rank from state dict
      
      tests
      
      fmt
      
      can create lora layer in float32 even when rest of model is float16
      
      fix loading model hook
      
      remove load_lora_weights_ and T5 dispatching
      
      remove Unet#attn_processors_state_dict
      
      docstrings
      
      * text encoder monkeypatch class method
      
      * fix test
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
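      A rough sketch of two ideas from the message above: read the LoRA rank off the checkpoint's down-projection weight, and create the LoRA layer on the host layer's device while keeping it in float32 even when the rest of the model is float16. The class and helper names are illustrative, not the PR's actual code:

      ```python
      import torch
      import torch.nn as nn

      class LoRALinearLayer(nn.Module):
          """Minimal LoRA pair: out = up(down(x))."""
          def __init__(self, in_features, out_features, rank, device=None, dtype=None):
              super().__init__()
              self.down = nn.Linear(in_features, rank, bias=False, device=device, dtype=dtype)
              self.up = nn.Linear(rank, out_features, bias=False, device=device, dtype=dtype)
              nn.init.normal_(self.down.weight, std=1 / rank)
              nn.init.zeros_(self.up.weight)

          def forward(self, x):
              orig_dtype = x.dtype
              # compute in the LoRA layer's dtype (e.g. float32) even if x is float16
              y = self.up(self.down(x.to(self.down.weight.dtype)))
              return y.to(orig_dtype)

      def lora_for(linear: nn.Linear, state_dict, prefix):
          # the rank is the output dim of the "down" projection in the checkpoint
          rank = state_dict[f"{prefix}.down.weight"].shape[0]
          return LoRALinearLayer(
              linear.in_features, linear.out_features, rank,
              device=linear.weight.device, dtype=torch.float32,
          )
      ```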
  3. 06 Jun, 2023 3 commits
    • Add draft for lora text encoder scale (#3626) · 74fd735e
      Patrick von Platen authored
      * Add draft for lora text encoder scale
      
      * Improve naming
      
      * fix: training dreambooth lora script.
      
      * Apply suggestions from code review
      
      * Update examples/dreambooth/train_dreambooth_lora.py
      
      * Apply suggestions from code review
      
      * add lora mixin when fit

      * fix more
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
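      At the time, LoRA strength was routed through cross_attention_kwargs at inference, and this draft extends that same scale to the text encoder LoRA. A usage sketch; the checkpoint ID, LoRA path, and prompt are placeholders:

      ```python
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")
      pipe.load_lora_weights("path/to/lora")  # placeholder path

      # the scale now applies to both the UNet and the text encoder LoRA
      image = pipe(
          "a photo of sks dog",
          num_inference_steps=25,
          cross_attention_kwargs={"scale": 0.5},  # 0.0 = base model, 1.0 = full LoRA
      ).images[0]
      ```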
    • [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
      Sayak Paul authored
      * feat: add lora attention processor for pt 2.0.
      
      * explicit context manager for SDPA.
      
      * switch to flash attention
      
      * make shapes compatible to work optimally with SDPA.
      
      * fix: circular import problem.
      
      * explicitly specify the flash attention kernel in sdpa
      
      * fall back to efficient attention context manager.
      
      * remove explicit dispatch.
      
      * fix: removed processor.
      
      * fix: remove optional from type annotation.
      
      * feat: make changes regarding LoRAAttnProcessor2_0.
      
      * remove confusing warning.
      
      * formatting.
      
      * relax tolerance for PT 2.0
      
      * fix: loading message.
      
      * remove unnecessary logging.
      
      * add: entry to the docs.
      
      * add: network_alpha argument.
      
      * relax tolerance.
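      The SDPA back-and-forth in these commits revolves around PyTorch 2.0's kernel dispatcher. A sketch of the explicit kernel pinning that was tried and then removed in favor of the default dispatch; tensor shapes are illustrative:

      ```python
      import torch
      import torch.nn.functional as F

      q = k = v = torch.randn(2, 8, 64, 40, device="cuda", dtype=torch.float16)

      # pin the flash-attention kernel; dropping the context manager lets
      # PyTorch choose between flash, memory-efficient, and math attention
      with torch.backends.cuda.sdp_kernel(
          enable_flash=True, enable_math=False, enable_mem_efficient=False
      ):
          out = F.scaled_dot_product_attention(q, k, v)
      ```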
    • Add function to remove monkey-patch for text encoder LoRA (#3649) · b45204ea
      Takuma Mori authored
      * merge undoable-monkeypatch
      
      * remove TEXT_ENCODER_TARGET_MODULES, refactoring
      
      * move create_lora_weight_file
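      The general shape of an undoable monkey-patch, reduced to its essentials; this is a sketch of the technique with hypothetical helper names, not the PR's actual implementation:

      ```python
      def apply_lora_patch(linear, lora_layer):
          # stash the original forward so the patch can be reversed later
          linear._original_forward = linear.forward
          linear.forward = lambda x: linear._original_forward(x) + lora_layer(x)

      def remove_lora_patch(linear):
          # restore the stashed forward and drop the reference
          linear.forward = linear._original_forward
          del linear._original_forward
      ```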
  4. 02 Jun, 2023 1 commit
    • Support Kohya-ss style LoRA file format (in a limited capacity) (#3437) · 8e552bb4
      Takuma Mori authored
      * add _convert_kohya_lora_to_diffusers
      
      * make style
      
      * add scaffold
      
      * match result: unet attention only
      
      * fix monkey-patch for text_encoder
      
      * with CLIPAttention
      
      While the terrible images are no longer produced,
      the results do not match those from the hook version.
      This may be due to not setting the network_alpha value.
      
      * add to support network_alpha
      
      * generate diff image
      
      * fix monkey-patch for text_encoder
      
      * add test_text_encoder_lora_monkey_patch()
      
      * verify that it's okay to release the attn_procs
      
      * fix closure version
      
      * add comment
      
      * Revert "fix monkey-patch for text_encoder"
      
      This reverts commit bb9c61e6faecc1935c9c4319c77065837655d616.
      
      * Fix to reuse utility functions
      
      * make LoRAAttnProcessor target self_attn
      
      * fix LoRAAttnProcessor target
      
      * make style
      
      * fix split key
      
      * Update src/diffusers/loaders.py
      
      * remove TEXT_ENCODER_TARGET_MODULES loop
      
      * add print memory usage
      
      * remove test_kohya_loras_scaffold.py
      
      * add: doc on LoRA civitai
      
      * remove print statement and refactor in the doc.
      
      * fix state_dict test for kohya-ss style lora
      
      * Apply suggestions from code review
      Co-authored-by: Takuma Mori <takuma104@gmail.com>
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
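      Kohya-ss style checkpoints go through the same load_lora_weights entrypoint, selected by file name. A usage sketch; the directory, file name, and prompt are placeholders for a LoRA downloaded from e.g. civitai:

      ```python
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      # weight_name points at a Kohya-ss style .safetensors file
      pipe.load_lora_weights("path/to/dir", weight_name="kohya_lora.safetensors")
      image = pipe("masterpiece, best quality, 1girl", num_inference_steps=25).images[0]
      ```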
  5. 12 Apr, 2023 1 commit
    • [LoRA] Enabling limited LoRA support for text encoder (#2918) · a89a14fa
      Sayak Paul authored
      * add: first draft for a better LoRA enabler.
      
      * make fix-copies.
      
      * feat: backward compatibility.
      
      * add: entry to the docs.
      
      * add: tests.
      
      * fix: docs.
      
      * fix: norm group test for UNet3D.
      
      * feat: add support for flat dicts.
      
      * add deprecation message instead of warning.
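      What "limited LoRA support for text encoder" means in practice: one entrypoint covering both models, while the older UNet-only path keeps working. A sketch with placeholder paths:

      ```python
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      # the older UNet-only path keeps working (backward compatibility,
      # including flat state dicts):
      pipe.unet.load_attn_procs("path/to/unet_only_lora")

      # the new entrypoint also patches the text encoder when the
      # checkpoint provides text-encoder LoRA weights:
      pipe.load_lora_weights("path/to/lora")
      ```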