1. 29 Nov, 2023 1 commit
  2. 27 Nov, 2023 1 commit
  3. 10 Nov, 2023 1 commit
  4. 06 Nov, 2023 1 commit
  5. 05 Oct, 2023 1 commit
  6. 26 Sep, 2023 1 commit
  7. 20 Sep, 2023 1 commit
  8. 14 Sep, 2023 1 commit
  9. 08 Sep, 2023 1 commit
  10. 17 Aug, 2023 1 commit
  11. 12 Aug, 2023 1 commit
  12. 04 Aug, 2023 1 commit
  13. 28 Jul, 2023 2 commits
  14. 27 Jul, 2023 1 commit
  15. 26 Jul, 2023 2 commits
  16. 25 Jul, 2023 1 commit
    • [ControlNet SDXL training] fixes in the training script (#4223) · fed12376
      Sayak Paul authored
      * fix: #4206
      
      * add: sdxl controlnet training smoketest.
      
      * remove unnecessary token inits.
      
      * add: licensing to model card.
      
      * include SDXL licensing in the model card and make public visibility default
      
      * debugging
      
      * debugging
      
      * disable local file download.
      
      * fix: training test.
      
      * fix: ckpt prefix.
  17. 21 Jul, 2023 2 commits
  18. 18 Jul, 2023 1 commit
    • [Core] add: controlnet support for SDXL (#4038) · 3eb498e7
      Sayak Paul authored
      * add: controlnet sdxl.
      
      * modifications to controlnet.
      
      * run styling.
      
      * add: __init__.pys
      
      * incorporate changes from https://github.com/huggingface/diffusers/pull/4019
      
      * run make fix-copies.
      
      * resize the conditioning images.
      
      * remove autocast.
      
      * run styling.
      
      * disable autocast.
      
      * debugging
      
      * device placement.
      
      * back to autocast.
      
      * remove comment.
      
      * save some memory by reusing the vae and unet in the pipeline.
      
      * apply styling.
      
      * Allow low precision sd xl
      
      * finish
      
      * finish
      
      * changes to accommodate the improved VAE.
      
      * modifications to how we handle vae encoding in the training.
      
      * make style
      
      * make existing controlnet fast tests pass.
      
      * change vae checkpoint cli arg.
      
      * fix: vae pretrained paths.
      
      * fix: steps in get_scheduler().
      
      * debugging.
      
      * debugging.
      
      * fix: weight conversion.
      
      * add: docs.
      
      * add: limited tests.
      
      * add: datasets to the requirements.
      
      * update docstrings and incorporate the usage of watermarking.
      
      * incorporate fix from #4083
      
      * fix watermarking dependency handling.
      
      * run make fix-copies.
      
      * Empty-Commit
      
      * Update requirements_sdxl.txt
      
      * remove vae upcasting part.
      
      * Apply suggestions from code review
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * run make style
      
      * run make fix-copies.
      
      * disable support for MultiControlNet.
      
      * Apply suggestions from code review
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * run make fix-copies.
      
      * style.
      
      * fix-copies.
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
  19. 13 Jul, 2023 1 commit
  20. 11 Jul, 2023 1 commit
  21. 15 Jun, 2023 1 commit
  22. 08 Jun, 2023 2 commits
  23. 22 May, 2023 1 commit
  24. 27 Apr, 2023 1 commit
  25. 26 Apr, 2023 3 commits
  26. 19 Apr, 2023 1 commit
    • controlnet training resize inputs to multiple of 8 (#3135) · 7e6886f5
      Will Berman authored
      controlnet training center crop input images to multiple of 8
      
      The pipeline code resizes inputs to multiples of 8.
      Not doing this resizing in the training script is causing
      the encoded image to have different height/width dimensions
      than the encoded conditioning image (which uses a separate
      encoder that's part of the controlnet model).
      
      We resize and center-crop the inputs so that the image and its
      conditioning image end up the same size (along with every other
      image in the batch). We also check that the initial resolution
      is a multiple of 8.
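The sizing constraint described in that commit can be sketched as follows. This is a minimal illustration, not the actual training-script code; `center_crop_box` is a hypothetical helper name:

```python
def center_crop_box(width, height, resolution):
    """Compute a center-crop box (left, top, right, bottom) of size
    resolution x resolution for an input image, enforcing the
    multiple-of-8 check that the pipeline's own resizing assumes."""
    if resolution % 8 != 0:
        raise ValueError("resolution must be a multiple of 8")
    left = (width - resolution) // 2
    top = (height - resolution) // 2
    return (left, top, left + resolution, top + resolution)
```

Applying the same box to both the target image and the conditioning image guarantees their encoded height/width dimensions match.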
  27. 18 Apr, 2023 3 commits
  28. 12 Apr, 2023 4 commits
    • Adds profiling flags, computes train metrics average. (#3053) · d06e0694
      Andreas Steiner authored
      * WIP controlnet training
      
      - bugfix --streaming
      - bugfix running report_to!='wandb'
      - adds memory profile before validation
      
      * Adds final logging statement.
      
      * Sets train epochs to 11.
      
      Looking at a longer ~16ep run, we see only good validation images
      after ~11ep:
      
      https://wandb.ai/andsteing/controlnet_fill50k/runs/3j2hx6n8
      
      
      
      * Removes --logging_dir (it's not used).
      
      * Adds --profile flags.
      
      * Updates --output_dir=runs/fill-circle-{timestamp}.
      
      * Compute mean of `train_metrics`.
      
      Previously `train_metrics[-1]` was logged, resulting in very bumpy train
      metrics.
      
      * Improves logging a bit.
      
      - adds l2_grads gradient norm logging
      - adds steps_per_sec
      - sets walltime as x coordinate of train/step
      - logs controlnet_params config
      
      * Adds --ccache (doesn't really help though).
      
      * minor fix in controlnet flax example (#2986)
      
      * fix the error when push_to_hub but not log validation
      
      * contronet_from_pt & controlnet_revision
      
      * add intermediate checkpointing to the guide
      
      * Bugfix --profile_steps
      
      * Sets `RACKER_PROJECT_NAME='controlnet_fill50k'`.
      
      * Logs fractional epoch.
      
      * Adds relative `walltime` metric.
      
      * Adds `StepTraceAnnotation` and uses `global_step` instead of `step`.
      
      * Applied `black`.
      
      * Streamlines commands in README a bit.
      
      * Removes `--ccache`.
      
      This makes only a very small difference (~1 min) with this model size, so removing
      the option introduced in cdb3cc.
      
      * Re-ran `black`.
      
      * Update examples/controlnet/README.md
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      
      * Converts spaces to tabs.
      
      * Removes repeated args.
      
      * Skips first step (compilation) in profiling
      
      * Updates README with profiling instructions.
      
      * Unifies tabs/spaces in README.
      
      * Re-ran style & quality.
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
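The "Compute mean of `train_metrics`" change above addresses bumpy logs caused by logging only the last step's metrics. A minimal sketch of the averaging (hypothetical helper name, not the actual script code):

```python
def mean_of_metrics(train_metrics):
    """Average a list of per-step metric dicts into a single dict,
    rather than logging only train_metrics[-1]."""
    keys = train_metrics[0].keys()
    n = len(train_metrics)
    return {k: sum(m[k] for m in train_metrics) / n for k in keys}
```

Averaging over the steps accumulated since the last log call smooths the reported train metrics without changing the optimization itself.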
    • [Post release] v0.16.0dev (#3072) · 0a73b4d3
      Patrick von Platen authored
    • Release: v0.15.0 · e7534542
      Patrick von Platen authored
    • [Examples] Fix type-casting issue in the ControlNet training script (#2994) · e607a582
      Sayak Paul authored
      * fix: norm group test for UNet3D.
      
      * fix: type-casting issue in controlnet training.
  29. 11 Apr, 2023 1 commit
    • Fix typo and format BasicTransformerBlock attributes (#2953) · 52c4d32d
      Chanchana Sornsoontorn authored
      * chore(train_controlnet): fix typo in logger message
      
      * chore(models): refactor module order to match calling order
      
      When printing the BasicTransformerBlock to stdout, the attributes should appear in the order they are called. Also, the "3. Feed Forward" comment previously made no sense: it should sit next to self.ff, but it was next to self.norm3 instead.
      
      * correct many tests
      
      * remove bogus file
      
      * make style
      
      * correct more tests
      
      * finish tests
      
      * fix one more
      
      * make style
      
      * make unclip deterministic
      
      * chore(models/attention): reorganize comments in BasicTransformerBlock class
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
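The ordering point in that commit can be illustrated with a plain-Python sketch (not the diffusers code; `torch.nn.Module` prints its children in registration order the same way, since submodules are recorded in an insertion-ordered dict):

```python
# Attributes are stored in definition order, so defining submodules in the
# order forward() calls them makes the printed block read in execution order.
class Block:
    def __init__(self):
        # defined in calling order: norm -> attention, repeated, then FF
        self.norm1 = "LayerNorm"
        self.attn1 = "SelfAttention"
        self.norm2 = "LayerNorm"
        self.attn2 = "CrossAttention"
        self.norm3 = "LayerNorm"
        self.ff = "FeedForward"  # the "3. Feed Forward" comment belongs here

    def __repr__(self):
        # vars() preserves definition order (dicts are insertion-ordered)
        return "\n".join(f"{k}: {v}" for k, v in vars(self).items())
```

Printing an instance now lists `ff` last, next to where the feed-forward comment lives, matching the refactor's intent.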