1. 10 Jan, 2025 2 commits
  2. 09 Jan, 2025 2 commits
  3. 08 Jan, 2025 2 commits
  4. 07 Jan, 2025 6 commits
  5. 06 Jan, 2025 1 commit
    • Ameer Azam authored
      The RunwayML paths for v1.5 changed to stable-diffusion-v1-5/[stable-diffusion-v1-5 / stable-diffusion-inpainting] (#10476) · 4f5e3e35
      
      * Update pipeline_controlnet.py
      
      * Update pipeline_controlnet_img2img.py
      
      After the runwayml take-down, change all references to
      stable-diffusion-v1-5/stable-diffusion-v1-5
      
      * Update pipeline_controlnet_inpaint.py
      
      * runwayml take-down: change to sd-legacy
      
      * runwayml take-down: change to sd-legacy
      
      * runwayml take-down: change to sd-legacy
      
      * runwayml take-down: change to sd-legacy
      
      * Update convert_blipdiffusion_to_diffusers.py
      
      style change
      4f5e3e35
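As a sketch of the remapping this commit applies (the mapping table is inferred from the commit message above; `remap_repo_id` is an illustrative helper, not a diffusers API):

```python
# Mirror paths for the taken-down runwayml repositories, per the commit
# message above. The helper name is hypothetical.
LEGACY_REPO_MAP = {
    "runwayml/stable-diffusion-v1-5": "stable-diffusion-v1-5/stable-diffusion-v1-5",
    "runwayml/stable-diffusion-inpainting": "stable-diffusion-v1-5/stable-diffusion-inpainting",
}

def remap_repo_id(repo_id: str) -> str:
    """Return the sd-legacy mirror for a removed runwayml repo id."""
    return LEGACY_REPO_MAP.get(repo_id, repo_id)
```

Pipelines then load from the mirror, e.g. `from_pretrained("stable-diffusion-v1-5/stable-diffusion-v1-5", ...)` in place of the old runwayml path.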
  6. 04 Jan, 2025 1 commit
  7. 03 Jan, 2025 1 commit
  8. 02 Jan, 2025 2 commits
  9. 30 Dec, 2024 1 commit
  10. 24 Dec, 2024 1 commit
  11. 23 Dec, 2024 2 commits
  12. 19 Dec, 2024 1 commit
  13. 18 Dec, 2024 3 commits
  14. 17 Dec, 2024 1 commit
  15. 15 Dec, 2024 1 commit
  16. 13 Dec, 2024 1 commit
  17. 12 Dec, 2024 3 commits
    • hlky authored
      Remove `negative_*` from SDXL callback (#10203) · f2d348d9
      * Remove `negative_*` from SDXL callback
      
      * Change example and add XL version
      f2d348d9
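A minimal sketch of the updated example's shape: a step-end callback now only receives and returns the positive tensors, with no `negative_*` entries. The callback body here is illustrative, not the docs example itself:

```python
# Illustrative step-end callback for an SDXL pipeline. callback_kwargs
# holds only the tensors requested via callback_on_step_end_tensor_inputs
# (e.g. "prompt_embeds"); the negative_* entries were removed.
def callback_on_step_end(pipe, step, timestep, callback_kwargs):
    prompt_embeds = callback_kwargs["prompt_embeds"]
    # ... modify prompt_embeds here if desired ...
    callback_kwargs["prompt_embeds"] = prompt_embeds
    return callback_kwargs
```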
    • Sayak Paul authored
      [WIP][Training] Flux Control LoRA training script (#10130) · 8170dc36
      * update
      
      * add
      
      * update
      
      * add control-lora conversion script; make flux loader handle norms; fix rank calculation assumption
      
      * control lora updates
      
      * remove copied-from
      
      * create separate pipelines for flux control
      
      * make fix-copies
      
      * update docs
      
      * add tests
      
      * fix
      
      * Apply suggestions from code review
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      
      * remove control lora changes
      
      * apply suggestions from review
      
      * Revert "remove control lora changes"
      
      This reverts commit 73cfc519c9b99b7dc3251cc6a90a5db3056c4819.
      
      * update
      
      * update
      
      * improve log messages
      
      * updates.
      
      * updates
      
      * support register_config.
      
      * fix
      
      * fix
      
      * fix
      
      * updates
      
      * updates
      
      * updates
      
      * fix-copies
      
      * fix
      
      * apply suggestions from review
      
      * add tests
      
      * remove conversion script; enable on-the-fly conversion
      
      * bias -> lora_bias.
      
      * fix-copies
      
      * peft.py
      
      * fix lora conversion
      
      * changes
      Co-authored-by: a-r-r-o-w <contact.aryanvs@gmail.com>
      
      * fix-copies
      
      * updates for tests
      
      * fix
      
      * alpha_pattern.
      
      * add a test for varied lora ranks and alphas.
      
      * revert changes in num_channels_latents = self.transformer.config.in_channels // 8
      
      * revert moe
      
      * add a sanity check on unexpected keys when loading norm layers.
      
      * control lora.
      
      * fixes
      
      * fixes
      
      * fixes
      
      * tests
      
      * reviewer feedback
      
      * fix
      
      * proper peft version for lora_bias
      
      * fix-copies
      
      * updates
      
      * updates
      
      * updates
      
      * remove debug code
      
      * update docs
      
      * integration tests
      
      * nits
      
      * fuse and unload.
      
      * fix
      
      * add slices.
      
      * more updates.
      
      * button up readme
      
      * train()
      
      * add full fine-tuning version.
      
      * fixes
      
      * Apply suggestions from code review
      Co-authored-by: Aryan <aryan@huggingface.co>
      
      * set_grads_to_none remove.
      
      * readme
      
      ---------
      Co-authored-by: Aryan <aryan@huggingface.co>
      Co-authored-by: yiyixuxu <yixu310@gmail.com>
      Co-authored-by: a-r-r-o-w <contact.aryanvs@gmail.com>
      8170dc36
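Among the changes above are tests for varied LoRA ranks and alphas (`alpha_pattern`). As a sketch of the bookkeeping involved (the pattern names follow peft's `LoraConfig`; the helper itself is hypothetical): each LoRA layer scales its update by `alpha / rank`, with per-module overrides taking precedence over the defaults.

```python
def lora_scaling(module_name, default_rank, default_alpha,
                 rank_pattern=None, alpha_pattern=None):
    """Per-module LoRA scaling factor alpha / rank, honoring
    rank_pattern / alpha_pattern overrides (hypothetical helper)."""
    rank = (rank_pattern or {}).get(module_name, default_rank)
    alpha = (alpha_pattern or {}).get(module_name, default_alpha)
    return alpha / rank
```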
    • Ethan Smith authored
      fix min-snr implementation (#8466) · 26e80e01
      * fix min-snr implementation
      
      https://github.com/kohya-ss/sd-scripts/blob/main/library/custom_train_functions.py#L66
      
      
      
      * Update train_dreambooth.py
      
      fix variable name mse_loss_weights
      
      * fix divisor
      
      * make style
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      26e80e01
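A scalar sketch of the corrected weighting, following the kohya-ss reference linked above (the training scripts operate on tensors, so exact code differs): the per-timestep SNR is clamped at `gamma` and divided by the prediction target's natural weight, which is `snr` for epsilon prediction and `snr + 1` for v-prediction — the divisor fix mentioned in the commits.

```python
def min_snr_weight(snr: float, gamma: float = 5.0, v_prediction: bool = False) -> float:
    """Min-SNR-gamma loss weight for one timestep (scalar sketch)."""
    divisor = snr + 1.0 if v_prediction else snr
    return min(snr, gamma) / divisor
```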
  18. 10 Dec, 2024 3 commits
    • Linoy Tsaban authored
      [community pipeline rf-inversion] - fix example in doc (#10179) · 43534a8d
      * fix example in doc
      
      * remove redundancies
      
      * change param
      43534a8d
    • hlky authored
      Use `torch` in `get_3d_rotary_pos_embed`/`_allegro` (#10161) · 4c4b323c
      Use torch in get_3d_rotary_pos_embed/_allegro
      4c4b323c
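The commit swaps numpy for torch inside these embedding helpers. As a pure-Python stand-in for the frequency grid such rotary embeddings are built from (illustrative only; the real `get_3d_rotary_pos_embed` handles three spatial/temporal axes and returns torch tensors):

```python
import math

def rotary_freqs(positions, dim, theta=10000.0):
    """cos/sin tables for 1D rotary embeddings (illustrative sketch)."""
    inv_freq = [theta ** (-2.0 * i / dim) for i in range(dim // 2)]
    cos = [[math.cos(p * f) for f in inv_freq] for p in positions]
    sin = [[math.sin(p * f) for f in inv_freq] for p in positions]
    return cos, sin
```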
    • Linoy Tsaban authored
      [community pipeline] Add RF-inversion Flux pipeline (#9816) · c9e4fab4
      * initial commit
      
      * update denoising loop
      
      * fix scheduling
      
      * style
      
      * fix import
      
      * fixes
      
      * fixes
      
      * style
      
      * fixes
      
      * change invert
      
      * change denoising & check inputs
      
      * shape & timesteps fixes
      
      * timesteps fixes
      
      * style
      
      * remove redundancies
      
      * small changes
      
      * update documentation a bit
      
      * update documentation a bit
      
      * update documentation a bit
      
      * style
      
      * change strength param, remove redundancies
      
      * style
      
      * forward ode loop change
      
      * add inversion progress bar
      
      * fix image_seq_len
      
      * revert to strength but == 1 by default.
      
      * style
      
      * add "copied from..." comments
      
      * credit authors
      
      * make style
      
      * return inversion outputs without self-assigning
      
      * adjust denoising loop to generate regular images if inverted latents are not provided
      
      * adjust denoising loop to generate regular images if inverted latents are not provided
      
      * fix import
      
      * comment
      
      * remove redundant line
      
      * modify comment on ti
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      
      * Update examples/community/pipeline_flux_rf_inversion.py
      Co-authored-by: hlky <hlky@hlky.ac>
      * fix syntax error
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      Co-authored-by: hlky <hlky@hlky.ac>
      c9e4fab4
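Several of the commits above rework the forward ODE loop. The core of RF inversion is the rectified-flow straight-line path between the clean latent and noise; a scalar sketch of that interpolation (notation mine, not the pipeline's code):

```python
def rf_interpolate(x0, noise, t):
    """Point on the rectified-flow path: (1 - t) * x0 + t * noise,
    with t = 0 giving the clean latent and t = 1 pure noise."""
    return [(1.0 - t) * a + t * b for a, b in zip(x0, noise)]
```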
  19. 06 Dec, 2024 3 commits
  20. 03 Dec, 2024 2 commits
  21. 28 Nov, 2024 1 commit