1. 08 Jun, 2023 2 commits
  2. 06 Jun, 2023 1 commit
      [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
      Sayak Paul authored
      * feat: add lora attention processor for pt 2.0.
      
      * explicit context manager for SDPA.
      
      * switch to flash attention
      
      * make shapes compatible to work optimally with SDPA.
      
      * fix: circular import problem.
      
      * explicitly specify the flash attention kernel in sdpa
      
      * fall back to efficient attention context manager.
      
      * remove explicit dispatch.
      
      * fix: removed processor.
      
      * fix: remove optional from type annotation.
      
      * feat: make changes regarding LoRAAttnProcessor2_0.
      
      * remove confusing warning.
      
      * formatting.
      
      * relax tolerance for PT 2.0
      
      * fix: loading message.
      
      * remove unnecessary logging.
      
      * add: entry to the docs.
      
      * add: network_alpha argument.
      
      * relax tolerance.
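The bullets above revolve around PyTorch 2.0's fused scaled dot-product attention (SDPA) and making tensor shapes compatible with it. As a point of reference only (not the diffusers implementation), here is a minimal pure-Python sketch of the computation SDPA fuses, `softmax(q @ kᵀ / √d) @ v`, on tiny list-of-lists matrices:

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sdpa(q, k, v):
    """Naive scaled dot-product attention: softmax(q @ k^T / sqrt(d)) @ v.

    q, k, v are lists of row vectors (lists of floats)."""
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)
        out.append([sum(w * vj[c] for w, vj in zip(weights, v))
                    for c in range(len(v[0]))])
    return out

# With identical keys the attention weights are uniform, so each output
# row is the mean of the value rows.
q = [[1.0, 0.0]]
k = [[0.5, 0.5], [0.5, 0.5]]
v = [[2.0, 0.0], [0.0, 2.0]]
print(sdpa(q, k, v))  # [[1.0, 1.0]]
```

PyTorch 2.0 exposes this as `torch.nn.functional.scaled_dot_product_attention` and picks a fused kernel (flash or memory-efficient attention) at dispatch time, which is why the commit ultimately removed the explicit kernel selection.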
  3. 05 Jun, 2023 2 commits
  4. 02 Jun, 2023 2 commits
      dreambooth if docs - stage II, more info (#3628) · 5911a3aa
      Will Berman authored
      
      
      * dreambooth if docs - stage II, more info
      
* Update docs/source/en/training/dreambooth.mdx
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update docs/source/en/training/dreambooth.mdx
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* Update docs/source/en/training/dreambooth.mdx
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      
      * download instructions for downsized images
      
      * update source README to match docs
      
      ---------
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      Support Kohya-ss style LoRA file format (in a limited capacity) (#3437) · 8e552bb4
      Takuma Mori authored
      
      
      * add _convert_kohya_lora_to_diffusers
      
      * make style
      
      * add scaffold
      
      * match result: unet attention only
      
      * fix monkey-patch for text_encoder
      
      * with CLIPAttention
      
While the terrible images are no longer produced,
the results do not match those from the hook version.
This may be due to the network_alpha value not being set.
      
      * add to support network_alpha
      
      * generate diff image
      
      * fix monkey-patch for text_encoder
      
      * add test_text_encoder_lora_monkey_patch()
      
      * verify that it's okay to release the attn_procs
      
      * fix closure version
      
      * add comment
      
      * Revert "fix monkey-patch for text_encoder"
      
      This reverts commit bb9c61e6faecc1935c9c4319c77065837655d616.
      
      * Fix to reuse utility functions
      
      * make LoRAAttnProcessor targets to self_attn
      
      * fix LoRAAttnProcessor target
      
      * make style
      
      * fix split key
      
      * Update src/diffusers/loaders.py
      
      * remove TEXT_ENCODER_TARGET_MODULES loop
      
      * add print memory usage
      
      * remove test_kohya_loras_scaffold.py
      
      * add: doc on LoRA civitai
      
      * remove print statement and refactor in the doc.
      
      * fix state_dict test for kohya-ss style lora
      
      * Apply suggestions from code review
Co-authored-by: Takuma Mori <takuma104@gmail.com>
      
      ---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
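Several bullets above concern the `network_alpha` value that Kohya-ss style LoRA files carry. In that format the low-rank update is scaled by `alpha / rank` before being added to the base weight, i.e. `W' = W + (alpha / rank) * up @ down`. A minimal sketch of just that scaling, with plain list-of-lists matrices standing in for the real weight tensors:

```python
def matmul(a, b):
    # plain list-of-lists matrix multiply
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def lora_delta(up, down, network_alpha):
    """Kohya-ss style LoRA weight update: (alpha / rank) * up @ down.

    `rank` is the inner dimension shared by `up` (out x rank) and
    `down` (rank x in); `network_alpha` comes from the LoRA file."""
    rank = len(down)
    scale = network_alpha / rank
    delta = matmul(up, down)
    return [[scale * x for x in row] for row in delta]

# rank-2 example with alpha=1 -> scale 0.5
up = [[1.0, 0.0],
      [0.0, 1.0]]
down = [[2.0, 0.0],
        [0.0, 4.0]]
print(lora_delta(up, down, network_alpha=1.0))
# [[1.0, 0.0], [0.0, 2.0]]
```

Without this scaling a checkpoint trained with `alpha != rank` produces outputs that are uniformly too strong or too weak, which is consistent with the mismatched results mentioned in the commit message.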
  5. 31 May, 2023 2 commits
  6. 26 May, 2023 1 commit
  7. 24 May, 2023 1 commit
  8. 22 May, 2023 2 commits
      make style · 2b56e8ca
      Patrick von Platen authored
      DataLoader respecting EXIF data in Training Images (#3465) · b8b5daae
      Ambrosiussen authored
* DataLoader will now bake in any transforms or image manipulations contained in the EXIF data

Images may have rotations stored in their EXIF metadata. Training on such images previously ignored those transforms and thus produced unexpected results
      
      * Fixed the Dataloading EXIF issue in main DreamBooth training as well
      
      * Run make style (black & isort)
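The fix above is about honoring the EXIF orientation tag before training (in PIL-based pipelines this is typically done with `ImageOps.exif_transpose`). As an illustration of what that tag encodes, here is a hedged pure-Python sketch that applies the pure-rotation orientation values to a tiny list-of-lists "image"; the function name and the grid representation are invented for this example:

```python
def rotate90_cw(grid):
    # rotate a list-of-lists "image" 90 degrees clockwise
    return [list(row) for row in zip(*grid[::-1])]

def apply_exif_orientation(grid, orientation):
    """Apply the rotation recorded in the EXIF orientation tag so the
    pixel grid matches what the viewer sees.

    Covers the pure-rotation cases of the EXIF spec:
    1 = normal, 3 = rotated 180, 6 = rotate 90 CW to correct,
    8 = rotate 270 CW to correct. (Mirrored variants 2, 4, 5, 7 omitted.)"""
    turns = {1: 0, 3: 2, 6: 1, 8: 3}[orientation]
    for _ in range(turns):
        grid = rotate90_cw(grid)
    return grid

img = [[1, 2],
       [3, 4]]
print(apply_exif_orientation(img, 6))  # [[3, 1], [4, 2]]
```

Skipping this step means the model trains on the raw, un-rotated pixels while the captions describe the upright image, which is exactly the mismatch the commit fixes.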
  9. 19 May, 2023 1 commit
  10. 17 May, 2023 3 commits
  11. 11 May, 2023 1 commit
  12. 09 May, 2023 1 commit
      if dreambooth lora (#3360) · a757b2db
      Will Berman authored
      * update IF stage I pipelines
      
      add fixed variance schedulers and lora loading
      
      * added kv lora attn processor
      
      * allow loading into alternative lora attn processor
      
      * make vae optional
      
      * throw away predicted variance
      
      * allow loading into added kv lora layer
      
      * allow load T5
      
      * allow pre compute text embeddings
      
      * set new variance type in schedulers
      
      * fix copies
      
      * refactor all prompt embedding code
      
      class prompts are now included in pre-encoding code
      max tokenizer length is now configurable
      embedding attention mask is now configurable
      
      * fix for when variance type is not defined on scheduler
      
      * do not pre compute validation prompt if not present
      
      * add example test for if lora dreambooth
      
      * add check for train text encoder and pre compute text embeddings
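The "allow pre compute text embeddings" bullet above lets the (large) T5 text encoder run once up front and then be dropped from memory before the training loop. A minimal sketch of that caching pattern, with a toy stand-in for the real encoder (the helper names here are illustrative, not the script's API):

```python
def precompute_embeddings(prompts, encode):
    """Encode each unique prompt once and cache the result, so the text
    encoder can be released before the training loop starts."""
    cache = {}
    for p in prompts:
        if p not in cache:
            cache[p] = encode(p)
    return cache

calls = []
def toy_encoder(prompt):
    # stand-in for a real text encoder (e.g. T5); records invocations
    calls.append(prompt)
    return [float(len(prompt))]  # fake "embedding"

prompts = ["a photo of sks dog", "a photo of a dog", "a photo of sks dog"]
cache = precompute_embeddings(prompts, toy_encoder)
# the training loop would now look up cache[prompt] instead of re-encoding
print(len(calls))  # 2 unique prompts -> encoder invoked twice
```

This also explains the companion bullets: instance and class prompts must both be included in the pre-encoding pass, and validation prompts are only pre-computed when present, since nothing can be encoded after the encoder is released.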
  13. 03 May, 2023 1 commit
  14. 28 Apr, 2023 2 commits
  15. 26 Apr, 2023 2 commits
  16. 22 Apr, 2023 1 commit
  17. 20 Apr, 2023 1 commit
  18. 12 Apr, 2023 2 commits
  19. 11 Apr, 2023 1 commit
  20. 04 Apr, 2023 1 commit
  21. 29 Mar, 2023 1 commit
  22. 23 Mar, 2023 1 commit
  23. 15 Mar, 2023 1 commit
  24. 14 Mar, 2023 1 commit
  25. 10 Mar, 2023 1 commit
  26. 06 Mar, 2023 1 commit
  27. 03 Mar, 2023 2 commits
  28. 01 Mar, 2023 1 commit
  29. 17 Feb, 2023 1 commit