1. 22 Apr, 2024 1 commit
  2. 29 Mar, 2024 1 commit
    • Implements Blockwise lora (#7352) · 03024468
      UmerHA authored
      
      
      * Initial commit
      
      * Implemented block lora
      
      - implemented block lora
      - updated docs
      - added tests
      
      * Finishing up
      
      * Reverted unrelated changes made by make style
      
      * Fixed typo
      
      * Fixed bug + Made text_encoder_2 scalable
      
      * Integrated some review feedback
      
      * Incorporated review feedback
      
      * Fix tests
      
      * Made every module configurable
      
      * Adapted to new lora test structure
      
      * Final cleanup
      
      * Some more final fixes
      
      - Included examples in `using_peft_for_inference.md`
      - Added hint that only attns are scaled
      - Removed NoneTypes
      - Added test to check that mismatched lengths of adapter names / weights raise an error
      
      * Update using_peft_for_inference.md
      
      * Update using_peft_for_inference.md
      
      * Make style, quality, fix-copies
      
      * Updated tutorial; warn if scale/adapter mismatch
      
      * floats are forwarded as-is; changed tutorial scale
      
      * make style, quality, fix-copies
      
      * Fixed typo in tutorial
      
      * Moved some warnings into `lora_loader_utils.py`
      
      * Moved scale/lora mismatch warnings back
      
      * Integrated final review suggestions
      
      * Empty commit to trigger CI
      
      * Reverted empty commit to trigger CI
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
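The block-wise scaling this commit introduces can be sketched in plain Python. This is only an illustrative helper, not the diffusers implementation: the function name `expand_block_scales` is hypothetical, and the block names `down`/`mid`/`up` are assumptions mirroring the UNet layout (per the commit, only attention modules are actually scaled).

```python
# Hypothetical sketch of block-wise LoRA scale expansion: a nested
# {block: scale} spec is filled out so every block gets a multiplier,
# with unspecified blocks defaulting to 1.0. Not the diffusers API.

def expand_block_scales(scales, blocks=("down", "mid", "up")):
    """Return a per-block scale dict, filling missing blocks with 1.0."""
    if isinstance(scales, (int, float)):
        # A bare float applies uniformly to every block.
        return {block: float(scales) for block in blocks}
    expanded = {block: 1.0 for block in blocks}
    for block, value in scales.items():
        if block not in expanded:
            # Mirrors the commit's "mismatch" warning/error behavior.
            raise ValueError(f"Unknown block: {block!r}")
        expanded[block] = float(value)
    return expanded

print(expand_block_scales({"down": 0.9, "up": 0.6}))
# {'down': 0.9, 'mid': 1.0, 'up': 0.6}
```

The commit messages suggest the real implementation forwards plain floats as-is and validates that adapter-name and weight lists have matching lengths; the helper above only captures the dict-expansion idea.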
  3. 21 Mar, 2024 1 commit
  4. 07 Mar, 2024 1 commit
  5. 04 Mar, 2024 1 commit
  6. 14 Feb, 2024 1 commit
  7. 08 Feb, 2024 1 commit
  8. 31 Jan, 2024 1 commit
  9. 09 Jan, 2024 1 commit
  10. 04 Jan, 2024 1 commit
  11. 31 Dec, 2023 1 commit
  12. 29 Dec, 2023 1 commit
  13. 28 Dec, 2023 1 commit
  14. 26 Dec, 2023 2 commits
  15. 20 Nov, 2023 1 commit
    • Revert "[`Docs`] Update and make improvements" (#5858) · c72a1739
      M. Tolga Cangöz authored
      * Revert "[`Docs`] Update and make improvements (#5819)"
      
      This reverts commit c697f524.
      
      * Update README.md
      
      * Update memory.md
      
      * Update basic_training.md
      
      * Update write_own_pipeline.md
      
      * Update fp16.md
      
      * Update basic_training.md
      
      * Update write_own_pipeline.md
      
      * Update write_own_pipeline.md
  16. 16 Nov, 2023 1 commit
  17. 15 Nov, 2023 1 commit
  18. 08 Nov, 2023 1 commit
  19. 01 Nov, 2023 1 commit
  20. 17 Oct, 2023 1 commit
  21. 16 Oct, 2023 2 commits
  22. 12 Aug, 2023 1 commit
  23. 02 Aug, 2023 1 commit
  24. 26 Jul, 2023 1 commit
  25. 03 Jul, 2023 1 commit
  26. 14 Jun, 2023 1 commit
  27. 26 May, 2023 1 commit
  28. 11 Apr, 2023 1 commit
    • Fix config prints and save, load of pipelines (#2849) · 8b451eb6
      Patrick von Platen authored
      * [Config] Fix config prints and save, load
      
      * Only use potential nn.Modules for dtype and device
      
      * Correct vae image processor
      
      * make sure in_channels is not accessed directly
      
      * make sure in channels is only accessed via config
      
      * Make sure schedulers only access config attributes
      
      * Make sure to access config in SAG
      
      * Fix vae processor and make style
      
      * add tests
      
      * Up
      
      * make style
      
      * Fix more naming issues
      
      * Final fix with vae config
      
      * change more
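The discipline this commit enforces (hyperparameters like `in_channels` read via `config` rather than as bare model attributes) can be sketched roughly as below. This is a simplified illustration under assumed names: `ToyUNet` is hypothetical, and the mixin here only gestures at diffusers' actual `ConfigMixin`/`register_to_config` machinery.

```python
# Illustrative sketch, not diffusers code: config values live on a
# read-only mapping, and direct attribute access on the model warns
# and forwards to the config instead of failing silently.
import warnings
from types import MappingProxyType

class ConfigMixin:
    def register_to_config(self, **kwargs):
        self._internal_dict = MappingProxyType(dict(kwargs))  # read-only view

    @property
    def config(self):
        return self._internal_dict

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails:
        # fall back to the config with a deprecation-style warning.
        cfg = self.__dict__.get("_internal_dict", {})
        if name in cfg:
            warnings.warn(
                f"Access `{name}` via `config.{name}` instead", FutureWarning
            )
            return cfg[name]
        raise AttributeError(name)

class ToyUNet(ConfigMixin):
    def __init__(self, in_channels=4):
        self.register_to_config(in_channels=in_channels)

unet = ToyUNet(in_channels=8)
assert unet.config["in_channels"] == 8  # preferred access path
```

Routing reads through `config` keeps saved/loaded pipelines consistent: what gets serialized is exactly what the model reports, which is the naming/printing consistency the commit's "accessed only via config" messages are after.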
  29. 15 Mar, 2023 1 commit
  30. 10 Mar, 2023 1 commit
  31. 03 Mar, 2023 1 commit
    • Training tutorial (#2473) · fa6d52d5
      Steven Liu authored
      * first draft
      
      * minor edits
      
      * minor fixes
      
      * 🖍 apply feedback
      
      * 🖍 apply feedback and minor edits