"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "3ed3e3190c0d6503a89971fd9744429694522484"
Add warning for missing attention mask when pad tokens are detected (#25345)
* Add attention mask and pad token warning to many of the models
* Remove changes under examples/research_projects, since these files are not maintained by HF
* Skip the warning check during torch.fx or JIT tracing
* Switch the ordering of the warning and the input shape assignment, which is a little cleaner for some of the cases
* Add missing line break in one of the files
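For context, a minimal sketch of the kind of check this commit describes: warn when pad tokens are detected in `input_ids` but no `attention_mask` was supplied, and skip the check under torch.fx or TorchScript (JIT) tracing. The helper name `warn_if_pad_token_and_no_attention_mask` and its signature are illustrative, not the exact code from the commit.

```python
# Sketch only: the helper name and signature are illustrative,
# not the exact implementation added by this commit.
import logging

import torch

logger = logging.getLogger(__name__)


def warn_if_pad_token_and_no_attention_mask(input_ids, attention_mask, pad_token_id):
    """Warn when `input_ids` contains pad tokens but no `attention_mask` was given."""
    # Skip the check while tracing: branching on tensor values would break
    # torch.fx symbolic tracing and TorchScript (JIT) tracing.
    if torch.jit.is_tracing() or isinstance(input_ids, torch.fx.Proxy):
        return
    if attention_mask is not None or pad_token_id is None:
        return
    # Only warn when a pad token actually appears in the batch.
    if pad_token_id in input_ids:
        logger.warning(
            "We strongly recommend passing an `attention_mask` since your input "
            "contains padding tokens; otherwise the model may attend to them and "
            "produce unexpected results."
        )
```

In a model's `forward`, a check like this would typically run on the raw inputs before any shape bookkeeping, in line with the reordering mentioned above.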