    [LoRA] feat: add lora attention processor for pt 2.0. (#3594) · 8669e831
    Sayak Paul authored
    * feat: add lora attention processor for pt 2.0.
    
    * explicit context manager for SDPA.
    
    * switch to flash attention
    
    * make shapes compatible so they work optimally with SDPA.
    
    * fix: circular import problem.
    
    * explicitly specify the flash attention kernel in sdpa
    
    * fall back to efficient attention context manager.
    
    * remove explicit dispatch.
    
    * fix: removed processor.
    
    * fix: remove optional from type annotation.
    
    * feat: make changes regarding LoRAAttnProcessor2_0.
    
    * remove confusing warning.
    
    * formatting.
    
    * relax tolerance for PT 2.0
    
    * fix: loading message.
    
    * remove unnecessary logging.
    
    * add: entry to the docs.
    
    * add: network_alpha argument.
    
    * relax tolerance.
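
    The commit history above centers on wiring the LoRA attention processor to PyTorch 2.0's `scaled_dot_product_attention` (SDPA). As a minimal sketch of the underlying call (tensor shapes here are illustrative, not taken from the diffusers code):

    ```python
    import torch
    import torch.nn.functional as F

    # Illustrative shapes: (batch, heads, sequence length, head dim).
    batch, heads, seq_len, head_dim = 2, 8, 16, 64
    query = torch.randn(batch, heads, seq_len, head_dim)
    key = torch.randn(batch, heads, seq_len, head_dim)
    value = torch.randn(batch, heads, seq_len, head_dim)

    # F.scaled_dot_product_attention picks a fused kernel
    # (flash / memory-efficient / math) automatically; the commit
    # messages show an explicit kernel dispatch was tried via a
    # context manager and then removed in favor of auto-dispatch.
    out = F.scaled_dot_product_attention(query, key, value)
    print(out.shape)  # torch.Size([2, 8, 16, 64])
    ```

    The output has the same shape as the query, which is why the commit reshapes projections into the 4-D `(batch, heads, seq, head_dim)` layout SDPA expects.
    
    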
Changed file: attnprocessor.mdx (1.24 KB)