    [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
    Authored by Sayak Paul
    * feat: add lora attention processor for pt 2.0.
    
    * explicit context manager for SDPA.
    
    * switch to flash attention
    
    * make shapes compatible so they work optimally with SDPA.
    
    * fix: circular import problem.
    
    * explicitly specify the flash attention kernel in sdpa
    
    * fall back to efficient attention context manager.
    
    * remove explicit dispatch.
    
    * fix: removed processor.
    
    * fix: remove optional from type annotation.
    
    * feat: make changes regarding LoRAAttnProcessor2_0.
    
    * remove confusing warning.
    
    * formatting.
    
    * relax tolerance for PT 2.0
    
    * fix: loading message.
    
    * remove unnecessary logging.
    
    * add: entry to the docs.
    
    * add: network_alpha argument.
    
    * relax tolerance.
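
The commits above add a LoRA attention processor for PyTorch 2.0 (`LoRAAttnProcessor2_0`, which routes attention through `torch.nn.functional.scaled_dot_product_attention`) and introduce a `network_alpha` argument. A minimal, dependency-free sketch of the low-rank update with that scaling is shown below; the function and variable names here are illustrative assumptions, not the diffusers implementation:

```python
# Sketch of a LoRA-augmented linear projection (illustrative, not diffusers code).
# LoRA adds a low-rank delta to a frozen base projection:
#   y = W_base @ x + (network_alpha / rank) * W_up @ (W_down @ x)
# where W_down has shape (rank, d_in) and W_up has shape (d_out, rank).

def matvec(m, v):
    # Plain matrix-vector product over nested lists.
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def lora_forward(x, w_base, w_down, w_up, network_alpha, rank):
    # Frozen base projection.
    base = matvec(w_base, x)
    # Low-rank update: up(down(x)).
    delta = matvec(w_up, matvec(w_down, x))
    # network_alpha rescales the update relative to the rank.
    scale = network_alpha / rank
    return [b + scale * d for b, d in zip(base, delta)]

# Example: 2-dim input, rank-1 LoRA, identity base weight.
x = [1.0, 2.0]
w_base = [[1.0, 0.0], [0.0, 1.0]]
w_down = [[1.0, 1.0]]            # (rank=1, d_in=2)
w_up = [[1.0], [0.0]]            # (d_out=2, rank=1)
y = lora_forward(x, w_base, w_down, w_up, network_alpha=2.0, rank=1)
# → [7.0, 2.0]
```

In the actual processor, updates like this are applied to the query/key/value/output projections, and the attention itself is computed by PyTorch 2.0's fused `scaled_dot_product_attention` kernel rather than an explicit softmax.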
Changed file: attention_processor.py (68.4 KB)