"include/vscode:/vscode.git/clone" did not exist on "b39cbc7fdc618c084bbb2adda2e7b068b9f43869"
  1. 21 Apr, 2025 1 commit
  2. 15 Apr, 2025 1 commit
  3. 09 Apr, 2025 1 commit
  4. 08 Apr, 2025 2 commits
  5. 06 Mar, 2025 1 commit
  6. 04 Mar, 2025 1 commit
  7. 24 Feb, 2025 1 commit
  8. 20 Feb, 2025 1 commit
  9. 06 Feb, 2025 1 commit
    • [bugfix] NPU Adaption for Sana (#10724) · cd0a4a82
      Leo Jiang authored
      
      
      * NPU Adaption for Sanna (identical message repeated across 18 squashed commits)

      * [bugfix]NPU Adaption for Sanna
      
      ---------
      Co-authored-by: J石页 <jiangshuo9@h-partners.com>
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
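The commit above adapts Sana to Huawei Ascend NPUs. A minimal sketch of what running SanaPipeline on an NPU looks like, assuming the torch_npu extension is installed; the checkpoint id is illustrative, not taken from the PR:

```python
# Minimal sketch: running Sana on a Huawei Ascend NPU.
# Assumes torch_npu is installed; the checkpoint id below is illustrative.
import torch
import torch_npu  # noqa: F401 -- importing registers the "npu" device with PyTorch

from diffusers import SanaPipeline

pipe = SanaPipeline.from_pretrained(
    "Efficient-Large-Model/Sana_1600M_1024px_diffusers",  # illustrative checkpoint
    torch_dtype=torch.float16,
)
pipe.to("npu")  # same call shape as .to("cuda")

image = pipe(prompt="an astronaut riding a horse on the moon").images[0]
image.save("sana_npu.png")
```

The NPU-specific step is importing torch_npu, which registers the "npu" device type with PyTorch; the rest of the diffusers API is unchanged.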
  10. 27 Jan, 2025 1 commit
  11. 24 Jan, 2025 1 commit
  12. 21 Jan, 2025 3 commits
  13. 15 Jan, 2025 1 commit
  14. 30 Dec, 2024 1 commit
  15. 23 Dec, 2024 2 commits
  16. 19 Dec, 2024 1 commit
  17. 18 Dec, 2024 2 commits
    • [chore] fix: reamde -> readme (#10276) · 63cdf9c0
      Sayak Paul authored
      fix: reamde -> readme
    • [LoRA] feat: lora support for SANA. (#10234) · 9408aa2d
      Sayak Paul authored
      
      
      * feat: lora support for SANA.
      
      * make fix-copies
      
      * rename test class.
      
      * attention_kwargs -> cross_attention_kwargs.
      
      * Revert "attention_kwargs -> cross_attention_kwargs."
      
      This reverts commit 23433bf9bccc12e0f2f55df26bae58a894e8b43b.
      
      * exhaust 119 max line limit
      
      * sana lora fine-tuning script.
      
      * readme
      
      * add a note about the supported models.
      
      * Apply suggestions from code review
      Co-authored-by: Aryan <aryan@huggingface.co>
      
      * style
      
      * docs for attention_kwargs.
      
      * remove lora_scale from pag pipeline.
      
      * copy fix
      
      ---------
      Co-authored-by: Aryan <aryan@huggingface.co>
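The commit above adds LoRA loading to SanaPipeline and keeps `attention_kwargs` as the pipeline argument name (the rename to `cross_attention_kwargs` was reverted). A minimal sketch of LoRA inference with Sana, assuming a Sana LoRA checkpoint such as one produced by the fine-tuning script this PR adds; the base checkpoint id is illustrative and the LoRA repo id is hypothetical:

```python
# Minimal sketch: LoRA inference with SanaPipeline.
# The base checkpoint id is illustrative; the LoRA repo id is hypothetical.
import torch

from diffusers import SanaPipeline

pipe = SanaPipeline.from_pretrained(
    "Efficient-Large-Model/Sana_1600M_1024px_diffusers",  # illustrative base model
    torch_dtype=torch.float16,
).to("cuda")

# Load adapter weights, e.g. ones produced by the Sana LoRA fine-tuning script.
pipe.load_lora_weights("your-username/sana-lora")  # hypothetical repo id

# Per the PR, the LoRA scale travels through `attention_kwargs`.
image = pipe(
    prompt="a watercolor painting of a lighthouse",
    attention_kwargs={"scale": 0.8},
).images[0]
image.save("sana_lora.png")
```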
  18. 12 Dec, 2024 1 commit
  19. 25 Nov, 2024 1 commit
  20. 24 Nov, 2024 1 commit
  21. 19 Nov, 2024 1 commit
  22. 08 Nov, 2024 1 commit
  23. 06 Nov, 2024 1 commit
  24. 01 Nov, 2024 3 commits
  25. 31 Oct, 2024 1 commit
  26. 28 Oct, 2024 3 commits
  27. 25 Oct, 2024 1 commit
  28. 23 Oct, 2024 1 commit
  29. 22 Oct, 2024 1 commit
  30. 16 Oct, 2024 1 commit
  31. 15 Oct, 2024 1 commit