"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "3eddda1111f70f3a59485e08540e8262b927e867"
[⚠️ removed a default argument] Make `AttentionMaskConverter` compatible with `torch.compile(..., fullgraph=True)` (#27868)

* remove bugged torch.float32 default
* add test
* fix tests
* fix test
* fix doc
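Below is a minimal sketch of the usage this commit targets: building a causal mask through `AttentionMaskConverter` inside a function compiled with `fullgraph=True`, passing `dtype` explicitly now that the `torch.float32` default is gone. The `to_causal_4d(batch_size, query_length, key_value_length, dtype, device)` call reflects the public API in `transformers.modeling_attn_mask_utils` around this commit and may differ in other versions; treat it as illustrative, not a reproduction of the commit's own test.

```python
import torch
from transformers.modeling_attn_mask_utils import AttentionMaskConverter

converter = AttentionMaskConverter(is_causal=True)

def build_causal_mask(batch_size: int, seq_len: int) -> torch.Tensor:
    # dtype must now be passed explicitly: the commit removed the
    # torch.float32 default, so callers state the dtype they actually run in.
    return converter.to_causal_4d(
        batch_size, seq_len, seq_len, dtype=torch.float16, device="cpu"
    )

# fullgraph=True requires the whole function to trace without graph breaks,
# which is what the removed default previously interfered with.
compiled = torch.compile(build_causal_mask, fullgraph=True)
mask = compiled(2, 8)
print(mask.shape)  # expected: a (2, 1, 8, 8) additive attention mask
```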