    T5Attention support for cross-attention (#2654) · cf4227cd
    Kashif Rasul authored
    
    
    * fix AttnProcessor2_0
    
    Fix use of AttnProcessor2_0 for cross attention with mask
    
    * added scale_qk and out_bias flags
    
    * fixed for xformers
    
    * check if it has scale argument
    
    * Update cross_attention.py
    
    * check torch version
    
    * fix sliced attn
    
    * style
    
    * set scale
    
    * fix test
    
    * fixed addedKV processor
    
    * revert back AttnProcessor2_0
    
    * if missing if
    
    * fix inner_dim
    
    ---------
    Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
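
    The "check torch version" and "fix AttnProcessor2_0 ... cross attention with mask" commits revolve around one pattern: `torch.nn.functional.scaled_dot_product_attention` only exists on PyTorch >= 2.0, so the processor must feature-detect it and still pass the attention mask through for cross-attention. A minimal sketch of that guard (function name and fallback path are illustrative, not the library's actual processor code):

    ```python
    import torch
    import torch.nn.functional as F

    def sdpa_cross_attention(query, key, value, attention_mask=None):
        # query: (batch, heads, q_len, head_dim); key/value may have a
        # different sequence length (kv_len) in the cross-attention case.
        if hasattr(F, "scaled_dot_product_attention"):
            # PyTorch >= 2.0: fused kernel; the mask is additive and is
            # broadcast against the (q_len, kv_len) score matrix.
            return F.scaled_dot_product_attention(
                query, key, value, attn_mask=attention_mask
            )
        # Fallback for older PyTorch: explicit scaled dot-product attention.
        scale = query.shape[-1] ** -0.5
        scores = torch.matmul(query, key.transpose(-1, -2)) * scale
        if attention_mask is not None:
            scores = scores + attention_mask
        probs = scores.softmax(dim=-1)
        return torch.matmul(probs, value)
    ```

    The output keeps the query's sequence length while attending over the (possibly longer or shorter) key/value sequence, which is what the masked cross-attention fix exercises.
    
    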
cross_attention.py 28.3 KB