    Add warning for missing attention mask when pad tokens are detected (#25345) · 5ea2595e
    JB (Don) authored
    * Add attention mask and pad token warning to many of the models
    
    * Remove changes under examples/research_projects
    
    These files are not maintained by Hugging Face.
    
    * Skip the warning check during torch.fx or JIT tracing
    
    * Switch ordering for the warning and input shape assignment
    
    This ordering is a little cleaner for some of the cases.
    
    * Add missing line break in one of the files
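
    The check described above can be sketched roughly as follows. This is an illustrative approximation, not the exact helper added in the commit: the function name `warn_if_padding_and_no_attention_mask`, the use of Python's `warnings` module, and the exact warning text are assumptions for this sketch.

    ```python
    # Hypothetical sketch of the pad-token / attention-mask warning.
    # Names and warning text are illustrative, not the commit's exact code.
    import warnings

    import torch


    def warn_if_padding_and_no_attention_mask(input_ids, attention_mask, pad_token_id):
        # Nothing to warn about if a mask was passed or no pad token is configured.
        if attention_mask is not None or pad_token_id is None:
            return

        # Skip the check during torch.fx symbolic tracing or TorchScript/JIT
        # tracing, since data-dependent branching breaks tracing.
        if torch.jit.is_tracing() or isinstance(input_ids, torch.fx.Proxy):
            return

        # Warn only when pad tokens actually appear in the input.
        if (input_ids == pad_token_id).any():
            warnings.warn(
                "Pad tokens were detected in input_ids but no attention_mask "
                "was passed; model outputs may be unexpected."
            )
    ```

    Per the reordering noted in the commit, a model's `forward` would run this check before (or alongside) deriving the input shape, so the warning fires once per call regardless of which input format was used.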