[LoRA] fix `cross_attention_kwargs` problems and tighten tests (#7388)
* debugging
* let's see the numbers
* restrict tolerance
* increase inference steps
* shallow copy of `cross_attention_kwargs`
* remove print
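The core of the fix is the shallow copy: popping the LoRA `scale` entry out of a user-supplied `cross_attention_kwargs` dict mutates the caller's object, so the scale silently disappears on any later call that reuses the same dict. A minimal sketch of the idea, assuming a small helper that separates the LoRA scale from the remaining attention kwargs (the helper name and signature below are illustrative, not the exact diffusers internals):

```python
from typing import Optional, Tuple


def split_lora_scale(
    cross_attention_kwargs: Optional[dict],
) -> Tuple[float, Optional[dict]]:
    """Pop the LoRA "scale" without mutating the caller's dict.

    Hypothetical helper; it only sketches the shallow-copy fix.
    """
    if cross_attention_kwargs is None:
        return 1.0, None
    # Shallow copy first: without it, pop("scale") would remove the key
    # from the caller's dict, and a second pipeline call reusing the same
    # kwargs would silently fall back to the default scale.
    kwargs = cross_attention_kwargs.copy()
    lora_scale = kwargs.pop("scale", 1.0)
    return lora_scale, kwargs


# The caller's dict stays intact across repeated calls:
user_kwargs = {"scale": 0.5}
split_lora_scale(user_kwargs)
assert user_kwargs == {"scale": 0.5}
```

A shallow copy is enough here because only the top-level `"scale"` key is removed; the remaining values are passed through unchanged.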