[⚠️ removed a default argument] Make `AttentionMaskConverter` compatible with `torch.compile(..., fullgraph=True)` (#27868)

* remove bugged torch.float32 default
* add test
* fix tests
* fix test
* fix doc
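As a minimal sketch of what this change enables: with the `torch.float32` default removed, callers pass the mask dtype explicitly, and mask construction can be captured in a single graph by `torch.compile(..., fullgraph=True)`. The `to_causal_4d` signature, the explicit `dtype=` argument, and the expected output shape below are assumptions based on the commit title, not taken from the diff itself.

```python
import torch
from transformers.modeling_attn_mask_utils import AttentionMaskConverter

# Causal mask converter; after this change the dtype is supplied by the caller
# rather than silently defaulting to torch.float32 (assumed behavior).
converter = AttentionMaskConverter(is_causal=True)

def build_mask(batch_size: int, seq_len: int) -> torch.Tensor:
    # Build a 4D causal mask of shape (batch, 1, seq_len, seq_len).
    return converter.to_causal_4d(
        batch_size, seq_len, seq_len, dtype=torch.float16, device="cpu"
    )

# The commit's stated goal: this should trace without graph breaks.
compiled_build_mask = torch.compile(build_mask, fullgraph=True)
mask = compiled_build_mask(2, 8)
print(mask.shape)  # expected: torch.Size([2, 1, 8, 8])
```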