1. 11 Aug, 2023 1 commit
  2. 10 Aug, 2023 3 commits
  3. 09 Aug, 2023 2 commits
  4. 08 Aug, 2023 2 commits
      Copy lora functions to XLPipelines (#4512) · c7c0b575
      Wooyeol Baek authored
      * add load_lora_weights and save_lora_weights to StableDiffusionXLImg2ImgPipeline
      
      * add load_lora_weights and save_lora_weights to StableDiffusionXLInpaintPipeline
      
      * apply black format
      
      * add copy statement
      
      * fix statements
      
      * run `make fix-copies`
      c7c0b575
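The `make fix-copies` step in this commit enforces diffusers' "Copied from" convention: a helper duplicated into another pipeline must stay identical to its source. A toy sketch of that idea (function names are hypothetical stand-ins, not the actual diffusers tooling):

```python
def same_body(fn_a, fn_b):
    """Toy check in the spirit of `make fix-copies`: two functions that are
    declared copies of each other should compile to the same bytecode."""
    code_a, code_b = fn_a.__code__, fn_b.__code__
    return (code_a.co_code, code_a.co_consts) == (code_b.co_code, code_b.co_consts)

# hypothetical stand-ins for the copied LoRA helpers
def load_lora_weights(self, path):
    return path

def load_lora_weights_xl(self, path):  # the copy added to the XL pipelines
    return path

def save_lora_weights(self, path):
    return None
```

A drifted copy compiles to different bytecode, so the check flags it and the copy has to be regenerated from its source.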
      Fix misc typos (#4479) · f0725c58
      George He authored
      Fix typos
      f0725c58
  5. 07 Aug, 2023 1 commit
  6. 04 Aug, 2023 3 commits
  7. 03 Aug, 2023 10 commits
  8. 02 Aug, 2023 4 commits
  9. 01 Aug, 2023 4 commits
  10. 31 Jul, 2023 1 commit
      Update docs of unet_1d.py (#4394) · 6c49d542
      Nishant Rajadhyaksha authored
      Update unet_1d.py
      
      clarifying the order in which the modules are actually invoked in the main code; the order matters because some blocks attach time embeddings while others do not
      6c49d542
  11. 30 Jul, 2023 2 commits
  12. 28 Jul, 2023 5 commits
      [Feat] Support SDXL Kohya-style LoRA (#4287) · 4a4cdd6b
      Sayak Paul authored
      
      
      * sdxl lora changes.
      
      * better name replacement.
      
      * better replacement.
      
      * debugging
      
      * remove print.
      
      * print state dict keys.
      
      * print
      
      * distinguish better
      
      * debuggable.
      
      * fix: tests
      
      * fix: arg from training script.
      
      * access from class.
      
      * run style
      
      * debug
      
      * save intermediate
      
      * some simplifications for SDXL LoRA
      
      * styling
      
      * unet config is not needed in diffusers format.
      
      * fix: dynamic SGM block mapping for SDXL kohya loras (#4322)
      
      * Use lora compatible layers for linear proj_in/proj_out (#4323)
      
      * improve condition for using the sgm_diffusers mapping
      
      * informative comment.
      
      * load compatible keys and embedding layer mapping.
      
      * Get SDXL 1.0 example lora to load
      
      * simplify
      
      * specify ranks and hidden sizes.
      
      * better handling of k rank and hidden
      
      * debug
      
      * fix: alpha keys
      
      * add check for handling LoRAAttnAddedKVProcessor
      
      * sanity comment
      
      * modifications for text encoder SDXL
      
      * debugging
      
      * up
      
      * unneeded comments.
      
      * kwargs for the other attention processors.
      
      * debugging
      
      * improve
      
      * debugging
      
      * more print
      
      * Fix alphas
      
      * debugging
      
      * clean up.
      
      * debugging
      
      * fix: text
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: Batuhan Taskaya <batuhan@python.org>
      4a4cdd6b
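The core of Kohya-style support is key translation: kohya checkpoints flatten module paths with underscores (`lora_unet_down_blocks_0_...`), while diffusers state dicts use dotted paths. A toy sketch of that translation, assuming only the simplest case (the real loader in this PR also remaps SGM block names and reads the alpha keys):

```python
import re

def kohya_to_diffusers_key(key: str) -> str:
    """Toy sketch of kohya -> diffusers key translation: drop the kohya
    prefix and re-introduce dots around the numeric block indices."""
    key = key.removeprefix("lora_unet_")
    # "_0_" separators become ".0." so the path indexes into ModuleLists
    return re.sub(r"_(\d+)_", r".\1.", key)
```

For example, `lora_unet_down_blocks_1_attentions_0_proj_in` maps to `down_blocks.1.attentions.0.proj_in`; names without underscore-wrapped digits are left alone, which is why the real converter needs a fuller mapping table.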
      [SDXL] Make watermarker optional under certain circumstances to improve usability of SDXL 1.0 (#4346) · b7b6d613
      Patrick von Platen authored
      [SDXL] Make watermarker optional under certain circumstances to improve usability of SDXL 1.0 (#4346)
      
      * improve sdxl
      
      * more fixes
      
      * improve sdxl
      
      * finish
      b7b6d613
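The change gates the watermarker behind a flag that defaults to whether the optional dependency is importable, and errors out if the user forces it on without the dependency. A minimal sketch of that pattern, with hypothetical class and module names (diffusers' actual check lives behind its own availability helper):

```python
import importlib.util

class Watermarker:
    """Hypothetical stand-in for the invisible-watermark based implementation."""
    pass

def build_watermarker(add_watermarker=None):
    # default: only watermark when the optional package is importable
    has_dep = importlib.util.find_spec("invisible_watermark") is not None
    if add_watermarker is None:
        add_watermarker = has_dep
    if add_watermarker and not has_dep:
        raise ValueError("add_watermarker=True requires the invisible-watermark package")
    return Watermarker() if add_watermarker else None
```

Passing `add_watermarker=False` always skips the component, which is what makes SDXL usable without the extra dependency installed.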
      Fix repeat of negative prompt (#4335) · faa6cbc9
      kathath authored
      fix repeat of negative prompt
      faa6cbc9
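The bug class behind this fix is easy to state: when a single negative prompt is passed with a batch of prompts, it must be tiled to the batch size, and then every entry duplicated per requested image. A simplified sketch of the corrected behaviour (not the actual pipeline code, which works on embeddings):

```python
def expand_negative_prompt(negative_prompt, batch_size, num_images_per_prompt):
    # a single string is repeated once per prompt in the batch ...
    if isinstance(negative_prompt, str):
        negative_prompt = [negative_prompt] * batch_size
    elif len(negative_prompt) != batch_size:
        raise ValueError("negative_prompt list must match the prompt batch size")
    # ... and every entry is then duplicated once per requested image
    return [p for p in negative_prompt for _ in range(num_images_per_prompt)]
```

Getting the repeat order wrong (tiling the whole list instead of each entry) silently pairs the wrong negative prompt with each image, which is why this kind of bug tends to surface as subtly wrong outputs rather than an error.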
      [ONNX] Don't download ONNX model by default (#4338) · 306a7bd0
      Patrick von Platen authored
      * [Download] Don't download ONNX weights by default
      
      * fix more
      
      * finish
      306a7bd0
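Conceptually, the change filters ONNX weight files out of the default download list unless an ONNX pipeline is actually requested. A toy sketch of that filtering (file names and function invented; the real code builds ignore patterns for the Hub download):

```python
# file extensions used by ONNX weight exports
ONNX_EXTENSIONS = (".onnx", ".pb")

def files_to_download(repo_files, use_onnx=False):
    # skip ONNX weights unless the caller asked for an ONNX pipeline
    if use_onnx:
        return list(repo_files)
    return [f for f in repo_files if not f.endswith(ONNX_EXTENSIONS)]
```

For mixed repos this avoids pulling gigabytes of ONNX weights that a plain PyTorch pipeline never loads.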
      [SDXL Refiner] Fix refiner forward pass for batched input (#4327) · 18b018c8
      Patrick von Platen authored
      * fix_batch_xl
      
      * Fix other pipelines as well
      
      * up
      
      * Update tests/pipelines/stable_diffusion_xl/test_stable_diffusion_xl_inpaint.py
      
      * sort
      
      * up
      
      * Finish it all up
      
      Co-authored-by: Bagheera <bghira@users.github.com>
      18b018c8
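The underlying issue is a common one with SDXL's micro-conditioning: the refiner's extra conditioning row (original size, crop coordinates, aesthetic score) is built once and must be repeated per batch element, or batched prompts hit a shape mismatch against the prompt embeddings. A simplified list-based sketch (the pipeline does the same with a tensor `repeat`):

```python
def get_add_time_ids(original_size, crops_coords_top_left, aesthetic_score,
                     batch_size, num_images_per_prompt):
    # one conditioning row from the refiner's micro-conditioning inputs
    row = list(original_size) + list(crops_coords_top_left) + [aesthetic_score]
    # the fix: tile the row so it lines up with the prompt-embedding batch dim
    return [row] * (batch_size * num_images_per_prompt)
```

With a batch of 2 prompts and 2 images per prompt this yields 4 identical rows, matching the 4 prompt embeddings the UNet sees.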
  13. 27 Jul, 2023 2 commits