1. 09 Jul, 2023 2 commits
    • make style · 4a3e5748
      Patrick von Platen authored
    • Refactor LoRA (#3778) · c2a28c34
      Will Berman authored

      * refactor to support patching LoRA into T5
        - instantiate the lora linear layer on the same device as the regular linear layer
        - get lora rank from state dict
        - tests
        - fmt
        - can create lora layer in float32 even when rest of model is float16
        - fix loading model hook
        - remove load_lora_weights_ and T5 dispatching
        - remove Unet#attn_processors_state_dict
        - docstrings
      
      * text encoder monkeypatch class method
      
      * fix test
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
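      The refactor above is easiest to picture with a small sketch. This is not the actual diffusers implementation, just a minimal illustration (LoRALinearLayer and lora_rank_from_state_dict are hypothetical names) of three points from the commit message: the LoRA layer is allocated directly on the device of the linear layer it patches, it can run in float32 while the rest of the model is float16, and the rank can be read back from a state dict rather than stored separately.

      ```python
      import torch
      import torch.nn as nn

      class LoRALinearLayer(nn.Module):
          """Hypothetical minimal LoRA adapter: out = up(down(x))."""

          def __init__(self, in_features: int, out_features: int, rank: int = 4,
                       device=None, dtype=None):
              super().__init__()
              # Allocate the low-rank matrices directly on the target device/dtype,
              # matching the regular linear layer instead of defaulting to CPU.
              self.down = nn.Linear(in_features, rank, bias=False, device=device, dtype=dtype)
              self.up = nn.Linear(rank, out_features, bias=False, device=device, dtype=dtype)
              nn.init.normal_(self.down.weight, std=1 / rank)
              nn.init.zeros_(self.up.weight)

          def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
              # Compute in the adapter's own dtype (e.g. float32) and cast back,
              # so the LoRA math can stay in full precision under a float16 model.
              orig_dtype = hidden_states.dtype
              out = self.up(self.down(hidden_states.to(self.down.weight.dtype)))
              return out.to(orig_dtype)

      def lora_rank_from_state_dict(state_dict: dict) -> int:
          # The rank is the output dimension of the "down" projection, so it can
          # be recovered from a saved checkpoint instead of being passed around.
          return state_dict["down.weight"].shape[0]

      # Example: pair a float16 linear layer with a float32 LoRA adapter.
      base = nn.Linear(768, 768).to(dtype=torch.float16)
      lora = LoRALinearLayer(768, 768, rank=4, device=base.weight.device, dtype=torch.float32)
      print(lora_rank_from_state_dict(lora.state_dict()))  # -> 4
      ```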
  2. 07 Jul, 2023 1 commit
  3. 06 Jul, 2023 1 commit
  4. 04 Jul, 2023 1 commit
  5. 03 Jul, 2023 1 commit
    • Adding better way to define multiple concepts and also validation capabilities. (#3807) · 572d8e20
      Andrés Mauricio Repetto Ferrero authored

      * - Added validation parameters
        - Changed some parameter descriptions to better explain their use.
        - Fixed a few typos.
        - Added the concept_list parameter for better management of multiple subjects
        - Changed the logic for image validation
      
      * Fixed bad logic for class data root directories
      
      * Defaulted validation_steps to None for simpler logic
      
      * Fixed multiple validation prompts
      
      * Fixed bug on validation negative prompt
      
      * Changed validation logic for tracker.
      
      * Added uuid for validation image labeling
      
      * Fixed an error when comparing validation prompts and validation negative prompts
      
      * Improved the error message shown when there are more validation negative prompts than validation prompts
      
      * - Changed the image tracking number from epoch to global_step
        - Added typing for functions
      
      * Added more validations for when the concept_list parameter is used together with the regular ones.
      
      * Fixed error message
      
      * Added more validations for validation parameters
      
      * Improved messaging for errors
      
      * Fixed validation error for parameters with default values
      
      * - Added the train step to the image name for validation
        - Reformatted code
      
      * Updated README.md file.
      
      * Reverted train_dreambooth.py to the original script
      
      * left one blank line at the eof
      
      * Reverted setup.py to the original
      
      * Added the same logic for when prior-preservation parameters are used without enabling the flag while using the concept_list parameter.
      
      * Ran black formatter.
      
      * fixed a few strings
      
      * Fixed import sorting with isort and removed f-strings without placeholders
      
      * Fixed import order with ruff (since the isort ordering wasn't accepted)
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
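      A minimal sketch of what a concept_list file could look like, assuming hypothetical keys modeled on the script's usual per-concept CLI arguments (the actual schema is defined by the training script itself):

      ```python
      import json

      # Hypothetical concept_list entries: one dict per subject, mirroring the
      # per-concept arguments the training script otherwise takes on the CLI.
      concepts = [
          {
              "instance_prompt": "a photo of sks dog",
              "class_prompt": "a photo of a dog",
              "instance_data_dir": "./data/sks_dog",
              "class_data_dir": "./data/dog_class",
          },
          {
              "instance_prompt": "a photo of zwx teapot",
              "class_prompt": "a photo of a teapot",
              "instance_data_dir": "./data/zwx_teapot",
              "class_data_dir": "./data/teapot_class",
          },
      ]

      with open("concept_list.json", "w") as f:
          json.dump(concepts, f, indent=2)
      ```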
  6. 29 Jun, 2023 1 commit
  7. 20 Jun, 2023 1 commit
  8. 16 Jun, 2023 1 commit
  9. 15 Jun, 2023 4 commits
  10. 08 Jun, 2023 2 commits
  11. 07 Jun, 2023 3 commits
  12. 06 Jun, 2023 1 commit
    • [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
      Sayak Paul authored
      * feat: add lora attention processor for pt 2.0.
      
      * explicit context manager for SDPA.
      
      * switch to flash attention
      
      * make shapes compatible to work optimally with SDPA.
      
      * fix: circular import problem.
      
      * explicitly specify the flash attention kernel in sdpa
      
      * fall back to efficient attention context manager.
      
      * remove explicit dispatch.
      
      * fix: removed processor.
      
      * fix: remove optional from type annotation.
      
      * feat: make changes regarding LoRAAttnProcessor2_0.
      
      * remove confusing warning.
      
      * formatting.
      
      * relax tolerance for PT 2.0
      
      * fix: loading message.
      
      * remove unnecessary logging.
      
      * add: entry to the docs.
      
      * add: network_alpha argument.
      
      * relax tolerance.
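      The kernel-dispatch steps in the commit message above ("explicit context manager for SDPA", "explicitly specify the flash attention kernel in sdpa", "remove explicit dispatch") map onto PyTorch 2.0's SDPA API roughly as follows; the tensor shapes and dtypes are illustrative:

      ```python
      import torch
      import torch.nn.functional as F

      # Illustrative attention inputs: (batch, heads, seq_len, head_dim).
      q = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
      k = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
      v = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)

      # Explicitly restrict SDPA to the flash-attention kernel; this errors
      # out if flash attention cannot handle the given shapes/dtypes instead
      # of silently falling back to another implementation.
      with torch.backends.cuda.sdp_kernel(
          enable_flash=True, enable_math=False, enable_mem_efficient=False
      ):
          out = F.scaled_dot_product_attention(q, k, v)

      # Final state after "remove explicit dispatch": let PyTorch pick the
      # best available kernel (flash, memory-efficient, or math) itself.
      out = F.scaled_dot_product_attention(q, k, v)
      ```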
  13. 05 Jun, 2023 3 commits
  14. 02 Jun, 2023 7 commits
  15. 31 May, 2023 3 commits
  16. 30 May, 2023 6 commits
  17. 26 May, 2023 1 commit
  18. 24 May, 2023 1 commit