1. 21 Nov, 2022 1 commit
    • Add LayerScale to NAT/DiNAT (#20325) · 11f3ec72
      Ali Hassani authored
      
      
      * Add LayerScale to NAT/DiNAT.
      
      Completely dropped the ball on LayerScale in the original PR (#20219).
      LayerScale is an optional argument in both models, enabled only for the larger variants to provide training stability (see the sketch below).
      
      * Add LayerScale to NAT/DiNAT.
      
      Minor error fixed.
      Co-authored-by: Ali Hassani <ahassanijr@gmail.com>
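
      A minimal sketch of the LayerScale idea: a learnable per-channel scale applied to a residual branch's output, initialized near zero so early residual updates stay close to identity. Assumes PyTorch; the class name, init value, and wiring are illustrative, not necessarily the exact ones in the PR.

      ```python
      import torch
      from torch import nn


      class LayerScale(nn.Module):
          """Learnable per-channel scaling of a residual branch's output."""

          def __init__(self, dim: int, init_value: float = 1e-5):
              super().__init__()
              # A small init keeps each residual update near-identity early in
              # training, which is what stabilizes the larger variants.
              self.gamma = nn.Parameter(init_value * torch.ones(dim))

          def forward(self, x: torch.Tensor) -> torch.Tensor:
              return self.gamma * x
      ```

      In a transformer block this would wrap each branch before the residual add, e.g. `x = x + layer_scale(attn(x))` (hypothetical wiring).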
2. 18 Nov, 2022 1 commit
    • Add Neighborhood Attention Transformer (NAT) and Dilated NAT (DiNAT) models (#20219) · fc4a993e
      Ali Hassani authored
      * Add DiNAT
      
      * Adds DiNAT + tests
      
      * Minor fixes
      
      * Added HF model
      
      * Add natten to dependencies.
      
      * Cleanup
      
      * Minor fixup
      
      * Reformat
      
      * Optional NATTEN import (see the sketch after this list).
      
      * Reformat & add doc to _toctree
      
      * Reformat (finally)
      
      * Dummy objects for DiNAT
      
      * Add NAT + minor changes
      
      Adds NAT as its own independent model, plus docs and tests.
      Adds NATTEN to ext deps to ensure CI picks it up.
      
      * Remove natten from `all` and `dev-torch` deps; add a manual pip install to CI tests
      
      * Minor fixes.
      
      * Fix READMEs.
      
      * Requested changes to docs + minor fixes.
      
      * Requested changes.
      
      * Add NAT/DiNAT tests to layoutlm_job
      
      * Correction to Dinat doc.
      
      * Requested changes.
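
      The "Optional NATTEN import" and "Dummy objects for DiNAT" steps above follow a common optional-backend pattern: import the library when it is present, otherwise expose placeholders that fail only when actually used. A minimal sketch in plain Python; `is_natten_available` and the kernel names are assumptions for illustration, not necessarily the exact transformers utilities.

      ```python
      import importlib.util


      def is_natten_available() -> bool:
          # True only when the optional `natten` package is installed.
          return importlib.util.find_spec("natten") is not None


      if is_natten_available():
          # Assumed NATTEN kernel entry points (illustrative names).
          from natten.functional import natten2dav, natten2dqkrpb
      else:
          # Dummy fallbacks: the module still imports cleanly, but calling the
          # attention ops raises a clear error naming the missing dependency.
          def natten2dqkrpb(*args, **kwargs):
              raise ImportError("This model requires natten: `pip install natten`.")

          def natten2dav(*args, **kwargs):
              raise ImportError("This model requires natten: `pip install natten`.")
      ```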