1. 18 Feb, 2022 4 commits
    • Add PLBart (#13269) · ae1f8350
      Gunjan Chhablani authored
      * Init PLBART
      
      * Add missing configuration file
      
      * Add conversion script and configuration file
      
      * Fix style
      
      * Update modeling and conversion scripts
      
      * Fix scale embedding in config
      
      * Add comment
      
      * Fix conversion script
      
      * Add classification option to conversion script
      
      * Fix vocab size in config doc
      
      * Add tokenizer files from MBart50
      
      * Allow no lang code in regular tokenizer
      
      * Add PLBart Tokenizer Converters
      
      * Remove mask from multi tokenizer
      
      * Remove mask from multi tokenizer
      
      * Change from MBart-50 to MBart tokenizer
      
      * Fix names and modify src/tgt behavior
      
      * Fix imports for tokenizer
      
      * Remove <mask> from multi tokenizer
      
      * Fix style
      
      * Change tokenizer_class to processor_class
      
      * Add attribute map to config class
      
      * Update modeling file to modified MBart code
      
      * Update configuration file to MBart style configuration
      
      * Fix tokenizer
      
      * Separate tokenizers
      
      * Fix error in tokenization auto
      
      * Copy MBart tests
      
      * Replace with MBart tokenization tests
      
      * Fix style
      
      * Fix language code in multi tokenizer
      
      * Fix configuration docs
      
      * Add entry for plbart_multi in transformers init
      
      * Add dummy objects and fix imports
      
      * Fix modeling tests
      
      * Add TODO in config
      
      * Fix copyright year
      
      * Fix modeling docs and test
      
      * Fix some tokenization tests and style
      
      * Add changes from review
      
      * Fix copies
      
      * Fix docs
      
      * Fix docs
      
      * Fix style
      
      * Fix year
      
      * Add changes from review
      
      * Remove extra changes
      
      * Fix base tokenizer and doc
      
      * Fix style
      
      * Fix modeling and slow tokenizer tests
      
      * Remove Multi-tokenizer Converter and Tests
      
      * Delete QA model and Multi Tokenizer dummy objects
      
      * Fix repo consistency and code quality issues
      
      * Fix example documentation
      
      * Fix style
      
      * Remove PLBartTokenizer from type checking in init
      
      * Fix consistency issue
      
      * Add changes from review
      
      * Fix style
      
      * Remove PLBartTokenizerFast
      
      * Remove FastTokenizer converter
      
      * Fix AutoTokenizer mapping
      
      * Add plbart to toctree and fix consistency issues
      
      * Add language codes tokenizer test
      
      * Fix styling and doc issues
      
      * Add fixes for failing tests
      
      * Fix copies
      
      * Fix failing modeling test
      
      * Change assert to assertTrue in modeling tests
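      Below is a minimal, hedged usage sketch of the PLBart classes added in the commit above; the uclanlp/plbart-base checkpoint name, the src_lang/tgt_lang values, and the Java snippet are assumptions for illustration, not part of the commit.

      ```python
      # Hypothetical sketch: run the PLBart seq2seq model added in this PR.
      # The checkpoint name below is an assumption, not taken from the commit.
      import torch
      from transformers import PLBartForConditionalGeneration, PLBartTokenizer

      tokenizer = PLBartTokenizer.from_pretrained("uclanlp/plbart-base", src_lang="java", tgt_lang="java")
      model = PLBartForConditionalGeneration.from_pretrained("uclanlp/plbart-base")

      code_snippet = "public int add ( int a , int b ) { return a + b ; }"
      inputs = tokenizer(code_snippet, return_tensors="pt")

      with torch.no_grad():
          # Teacher-forced forward pass; logits cover the shared code/text vocabulary.
          outputs = model(**inputs, labels=inputs["input_ids"])

      print(outputs.loss, outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
      ```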
    • Fix LongformerModel hidden states (#15537) · 2f2fefd6
      Yih-Dar authored
      
      
      * add undo padding
      
      * fix
      
      * fix tuple issue
      
      * make style and quality
      
      * move unpad logic to LongformerEncoder + unpad attentions + update tests
      
      * move unpad logic to TFLongformerEncoder
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
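      The fix above unpads the hidden states (and attentions) returned by LongformerModel, so their sequence dimension matches the original input length rather than the length padded to a multiple of the attention window. A minimal check, assuming the allenai/longformer-base-4096 checkpoint:

      ```python
      import torch
      from transformers import LongformerModel, LongformerTokenizer

      tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
      model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

      # A short sequence gets padded internally to a multiple of the attention
      # window; after this fix the returned tensors are unpadded again.
      inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
      with torch.no_grad():
          outputs = model(**inputs, output_hidden_states=True, output_attentions=True)

      seq_len = inputs["input_ids"].shape[1]
      assert outputs.last_hidden_state.shape[1] == seq_len
      assert all(h.shape[1] == seq_len for h in outputs.hidden_states)
      ```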
    • fix CLIP fast tokenizer and change some properties of the slow version (#15067) · e93763d4
      SaulLu authored
      
      
      Major changes to the CLIP fast tokenizer, which previously did not match the behavior of the CLIP slow tokenizer
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
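      Since the point of the fix is that the fast tokenizer now mirrors the slow one, a simple parity check can be sketched as follows; the openai/clip-vit-base-patch32 checkpoint name is an assumption:

      ```python
      from transformers import CLIPTokenizer, CLIPTokenizerFast

      slow = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
      fast = CLIPTokenizerFast.from_pretrained("openai/clip-vit-base-patch32")

      # After the fix, both tokenizers should produce the same ids for the same text.
      text = "a photo of a cat"
      assert slow(text)["input_ids"] == fast(text)["input_ids"]
      ```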
  2. 17 Feb, 2022 2 commits
    • Add SimMIM (#15586) · 57882177
      NielsRogge authored
      
      
      * Add first draft
      
      * Make model importable
      
      * Make SwinForMaskedImageModeling importable
      
      * Fix imports
      
      * Add missing inits
      
      * Add support for Swin
      
      * Fix bug
      
      * Fix bug
      
      * Fix another bug
      
      * Fix Swin MIM implementation
      
      * Fix default encoder stride
      
      * Fix Swin
      
      * Add print statements for debugging
      
      * Add image_size data argument
      
      * Fix Swin
      
      * Fix image_size
      
      * Add print statements for debugging
      
      * Fix print statement
      
      * Remove print statements
      
      * Improve reshaping of bool_masked_pos
      
      * Add support for DeiT, fix tests
      
      * Improve docstrings
      
      * Apply new black version
      
      * Improve script
      
      * Fix bug
      
      * Improve README
      
      * Apply suggestions from code review
      
      * Remove DS_Store and add to gitignore
      
      * Apply suggestions from code review + fix BEiT Flax
      
      * Revert BEiT changes
      
      * Improve README
      
      * Fix code quality
      
      * Improve README
      Co-authored-by: Niels Rogge <nielsrogge@Nielss-MBP.localdomain>
      Co-authored-by: Niels Rogge <nielsrogge@Nielss-MacBook-Pro.local>
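      A rough sketch of the SwinForMaskedImageModeling head added above, using a randomly initialized config and a random patch mask so no checkpoint name has to be assumed; the output attribute names follow the code as it stood at the time and may differ in later releases.

      ```python
      import torch
      from transformers import SwinConfig, SwinForMaskedImageModeling

      # Defaults: image_size=224, patch_size=4, encoder_stride=32 (the stride fixed in this PR).
      config = SwinConfig()
      model = SwinForMaskedImageModeling(config)

      pixel_values = torch.randn(1, 3, config.image_size, config.image_size)
      num_patches = (config.image_size // config.patch_size) ** 2
      # Boolean mask over patches: True marks a patch the model must reconstruct.
      bool_masked_pos = torch.randint(0, 2, (1, num_patches)).bool()

      with torch.no_grad():
          outputs = model(pixel_values=pixel_values, bool_masked_pos=bool_masked_pos)

      # Reconstruction loss over masked patches plus the reconstructed image.
      print(outputs.loss, outputs.logits.shape)  # logits: (1, 3, 224, 224)
      ```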
    • Add PoolFormer (#15531) · f84e0dbd
      Tanay Mehta authored
      
      
      * Added all files, PoolFormerFeatureExtractor still failing tests
      
      * Fixed PoolFormerFeatureExtractor not being able to import
      
      * Completed Poolformer doc
      
      * Applied Suggested fixes
      
      * Fixed errors in modeling_auto.py
      
      * Fix feature extractor, convert docs to Markdown, styling of code
      
      * Remove PoolFormer from check_repo and fix integration test
      
      * Remove Poolformer from check_repo
      
      * Fixed configuration_poolformer.py docs and removed inference.py from poolformer
      
      * Ran with black v22
      
      * Added PoolFormer to _toctree.yml
      
      * Updated poolformer doc
      
      * Applied suggested fixes and added PoolFormer to README.md
      
      * Did make fixup and make fix-copies, tests should pass now
      
      * Changed PoolFormer weights conversion script name and fixed README
      
      * Applied fixes in test_modeling_poolformer.py and modeling_poolformer.py
      
      * Added PoolFormerFeatureExtractor to AutoFeatureExtractor API
      Co-authored-by: Niels Rogge <nielsrogge@Nielss-MBP.localdomain>
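      A minimal classification sketch with the new PoolFormer classes; the sail/poolformer_s12 checkpoint name and the local image path are assumptions:

      ```python
      import torch
      from PIL import Image
      from transformers import PoolFormerFeatureExtractor, PoolFormerForImageClassification

      # Checkpoint name assumed for illustration.
      feature_extractor = PoolFormerFeatureExtractor.from_pretrained("sail/poolformer_s12")
      model = PoolFormerForImageClassification.from_pretrained("sail/poolformer_s12")

      image = Image.open("cat.png").convert("RGB")  # any RGB image
      inputs = feature_extractor(images=image, return_tensors="pt")

      with torch.no_grad():
          logits = model(**inputs).logits

      print(model.config.id2label[int(logits.argmax(-1))])
      ```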
  3. 16 Feb, 2022 4 commits
  4. 15 Feb, 2022 8 commits
  5. 14 Feb, 2022 1 commit
    • Register feature extractor (#15634) · 2e11a043
      Sylvain Gugger authored
      * Rework AutoFeatureExtractor.from_pretrained internal
      
      * Custom feature extractor
      
      * Add more tests
      
      * Add support for custom feature extractor code
      
      * Clean up
      
      * Add register API to AutoFeatureExtractor
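      A minimal sketch of the register API this commit adds to AutoFeatureExtractor; CustomConfig and CustomFeatureExtractor are placeholder classes, not names from the PR.

      ```python
      from transformers import AutoConfig, AutoFeatureExtractor, FeatureExtractionMixin, PretrainedConfig

      class CustomConfig(PretrainedConfig):
          model_type = "custom-model"

      class CustomFeatureExtractor(FeatureExtractionMixin):
          pass

      # Register the pair so the Auto classes can resolve the custom model type.
      AutoConfig.register("custom-model", CustomConfig)
      AutoFeatureExtractor.register(CustomConfig, CustomFeatureExtractor)

      # After saving a CustomFeatureExtractor with save_pretrained, loading the same
      # directory via AutoFeatureExtractor.from_pretrained returns a CustomFeatureExtractor.
      ```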
  6. 11 Feb, 2022 4 commits
  7. 10 Feb, 2022 4 commits
  8. 09 Feb, 2022 8 commits
  9. 08 Feb, 2022 2 commits
  10. 07 Feb, 2022 3 commits
    • FX tracing improvement (#14321) · 0fe17f37
      Michael Benayoun authored
      * Change the way tracing happens, enabling dynamic axes out of the box
      
      * Update the tests and modeling xlnet
      
      * Do not record leaf modules, to avoid recording more values for the recorded methods than will be seen at tracing time (which would otherwise desynchronize the recorded values from the values that need to be given to the proxies during tracing, causing errors)
      
      * Comments and making tracing work for gpt-j and xlnet
      
      * Refactor things related to num_choices (and batch_size, sequence_length)
      
      * Update fx to work on PyTorch 1.10
      
      * Postpone autowrap_function feature usage for later
      
      * Add copyrights
      
      * Remove unnecessary file
      
      * Fix issue with add_new_model_like
      
      * Apply suggestions
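      A sketch of symbolic tracing after the change above; the model name and exact keyword arguments are assumptions, since the tracing API shifted across versions around this PR.

      ```python
      from transformers import BertForSequenceClassification
      from transformers.utils.fx import symbolic_trace

      model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

      # After this PR the traced GraphModule handles dynamic batch size and sequence
      # length out of the box, instead of being specialized to fixed shapes.
      traced = symbolic_trace(model, input_names=["input_ids", "attention_mask", "token_type_ids"])

      print(traced.graph)
      ```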
    • Fix TF T5/LED missing cross attn in return values (#15511) · 131e2584
      Yih-Dar authored
      
      
      * add cross attn to outputs
      
      * add cross attn to outputs for TFLED
      
      * add undo padding
      
      * remove unused import
      
      * fix style
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
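      A quick check that cross-attentions now appear in the TF return values; the t5-small checkpoint is assumed:

      ```python
      from transformers import T5Tokenizer, TFT5ForConditionalGeneration

      tokenizer = T5Tokenizer.from_pretrained("t5-small")
      model = TFT5ForConditionalGeneration.from_pretrained("t5-small")

      inputs = tokenizer("translate English to German: Hello", return_tensors="tf")
      labels = tokenizer("Hallo", return_tensors="tf").input_ids

      outputs = model(**inputs, labels=labels, output_attentions=True)
      # Before this fix, cross_attentions was missing from the returned outputs.
      print(len(outputs.cross_attentions), outputs.cross_attentions[0].shape)
      ```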
    • lewtun authored
      6775b211