06 Jun, 2023 · 1 commit
      [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
      Sayak Paul authored
      * feat: add lora attention processor for pt 2.0.
      * explicit context manager for SDPA.
      * switch to flash attention
      * make shapes compatible to work optimally with SDPA.
      * fix: circular import problem.
      * explicitly specify the flash attention kernel in sdpa
      * fall back to efficient attention context manager.
      * remove explicit dispatch.
      * fix: removed processor.
      * fix: remove optional from type annotation.
      * feat: make changes regarding LoRAAttnProcessor2_0.
      * remove confusing warning.
      * formatting.
      * relax tolerance for PT 2.0
      * fix: loading message.
      * remove unnecessary logging.
      * add: entry to the docs.
      * add: network_alpha argument.
      * relax tolerance.
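
Several entries in this squash ("explicit context manager for SDPA." through "remove explicit dispatch.") trace an experiment with PyTorch 2.0's attention kernel selection. Below is a minimal sketch of that progression, assuming a CUDA device (the flash kernel is CUDA-only in PyTorch 2.0) and fp16 tensors; the shapes are illustrative, not taken from the PR:

```python
import torch
import torch.nn.functional as F

# Illustrative shapes only: scaled_dot_product_attention expects
# (batch, num_heads, seq_len, head_dim), which is what the
# "make shapes compatible" commit refers to.
q = torch.randn(2, 8, 77, 40, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 77, 40, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 77, 40, device="cuda", dtype=torch.float16)

# "explicitly specify the flash attention kernel in sdpa":
# restrict dispatch to the flash kernel alone.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=False
):
    out = F.scaled_dot_product_attention(q, k, v)

# "fall back to efficient attention context manager.": also permit
# the memory-efficient kernel for shapes/dtypes that flash rejects.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=True
):
    out = F.scaled_dot_product_attention(q, k, v)

# "remove explicit dispatch.": the final revision drops the context
# manager and lets PyTorch pick the fastest available kernel, which
# avoids hard failures on inputs a forced kernel cannot handle.
out = F.scaled_dot_product_attention(q, k, v)
```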
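The "add: network_alpha argument." entry carries over the alpha scaling convention used by kohya-ss style LoRA checkpoints, where the low-rank update is multiplied by alpha / rank. A minimal sketch of such a layer follows; the class name, initialization choices, and shapes are illustrative assumptions, not the diffusers implementation:

```python
import torch
import torch.nn as nn

class LoRALinearSketch(nn.Module):
    # Hypothetical class, not the diffusers code: a low-rank residual
    # update with the kohya-style network_alpha scaling.
    def __init__(self, in_features, out_features, rank=4, network_alpha=None):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)
        self.up = nn.Linear(rank, out_features, bias=False)
        # Scaling by network_alpha / rank keeps the update's magnitude
        # comparable across ranks; with network_alpha unset, the
        # update is applied unscaled.
        self.scale = network_alpha / rank if network_alpha is not None else 1.0
        nn.init.normal_(self.down.weight, std=1 / rank)
        nn.init.zeros_(self.up.weight)  # the update starts as a no-op

    def forward(self, hidden_states):
        return self.up(self.down(hidden_states)) * self.scale

# A host attention layer would add this residual on top of its frozen
# projection, e.g. out = to_q(x) + lora_scale * lora_q(x).
layer = LoRALinearSketch(320, 320, rank=4, network_alpha=4)
x = torch.randn(2, 77, 320)
print(layer(x).shape)  # torch.Size([2, 77, 320])
```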