1. 06 Jun, 2023 3 commits
    • [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
      Sayak Paul authored
      * feat: add lora attention processor for pt 2.0.
      
      * explicit context manager for SDPA.
      
      * switch to flash attention
      
      * make shapes compatible to work optimally with SDPA.
      
      * fix: circular import problem.
      
      * explicitly specify the flash attention kernel in sdpa
      
      * fall back to efficient attention context manager.
      
      * remove explicit dispatch.
      
      * fix: removed processor.
      
      * fix: remove optional from type annotation.
      
      * feat: make changes regarding LoRAAttnProcessor2_0.
      
      * remove confusing warning.
      
      * formatting.
      
      * relax tolerance for PT 2.0
      
      * fix: loading message.
      
      * remove unnecessary logging.
      
      * add: entry to the docs.
      
      * add: network_alpha argument.
      
      * relax tolerance.
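The commit above routes LoRA attention through PyTorch 2.0's `F.scaled_dot_product_attention` (SDPA), with kernel selection (flash vs. memory-efficient) handled via a context manager such as `torch.backends.cuda.sdp_kernel`. As a reference for what that fused call computes, here is a minimal pure-Python sketch of scaled dot-product attention, softmax(QKᵀ/√d)·V; it is illustrative only and not the diffusers processor, which operates on batched tensors:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(q, k, v):
    # q, k, v: lists of row vectors (seq_len x head_dim).
    d = len(q[0])
    out = []
    for qi in q:
        # Attention scores: (q_i . k_j) / sqrt(d) for every key row.
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d)
                  for kj in k]
        weights = softmax(scores)
        # Output row: attention-weighted sum of the value rows.
        out.append([sum(w * vj[c] for w, vj in zip(weights, v))
                    for c in range(len(v[0]))])
    return out
```

With `q == k`, each query attends most strongly to its own position, which is the behavior the fused SDPA kernels reproduce at scale.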
    • Add function to remove monkey-patch for text encoder LoRA (#3649) · b45204ea
      Takuma Mori authored
      * merge undoable-monkeypatch
      
      * remove TEXT_ENCODER_TARGET_MODULES, refactoring
      
      * move create_lora_weight_file
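The commit above makes the text-encoder LoRA monkey-patch reversible. The general pattern is to save a handle to the original method before patching so it can be restored later; a minimal sketch using hypothetical class and function names (not the diffusers implementation) looks like this:

```python
class TextEncoderLayer:
    # Stand-in for a text-encoder submodule (hypothetical).
    def forward(self, x):
        return x * 2

def apply_lora_patch(module, scale=0.5):
    # Monkey-patch forward, keeping the original so the patch
    # can be undone later.
    original = module.forward
    module._original_forward = original

    def patched_forward(x):
        # Original output plus a toy LoRA-style delta.
        return original(x) + scale * x

    module.forward = patched_forward

def remove_lora_patch(module):
    # Restore the saved forward, undoing the monkey-patch.
    if hasattr(module, "_original_forward"):
        module.forward = module._original_forward
        del module._original_forward
```

Storing the original callable on the module itself is what makes the undo function a no-op when no patch was applied.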
    • [docs] Fix link to loader method (#3680) · a8b0f42c
      Steven Liu authored
      fix link to load_lora_weights
  2. 05 Jun, 2023 12 commits
  3. 02 Jun, 2023 11 commits
  4. 31 May, 2023 3 commits
  5. 30 May, 2023 11 commits