  1. 09 Feb, 2022 4 commits
  2. 08 Feb, 2022 6 commits
  3. 07 Feb, 2022 12 commits
  4. 04 Feb, 2022 6 commits
  5. 03 Feb, 2022 8 commits
  6. 02 Feb, 2022 4 commits
    • Correct eos_token_id settings in generate (#15403) · 5ec368d7
      CHI LIU authored
      * Correct eos_token_id set in generate
      
      * Set eos_token_id in test
      
      * Correct eos_token_id set in generate
      
      * Set eos_token_id in test
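      For reference, a minimal sketch of the `generate()` call this commit touches; the GPT-2 checkpoint and prompt below are chosen purely for illustration:
      ```
      # Sketch: passing eos_token_id explicitly to generate() (checkpoint and prompt are illustrative)
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      inputs = tokenizer("The quick brown fox", return_tensors="pt")
      # generation stops for a sequence once this token id is produced
      outputs = model.generate(**inputs, max_new_tokens=20, eos_token_id=tokenizer.eos_token_id)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
      ```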
    • fix set truncation attribute in `__init__` of `PreTrainedTokenizerBase` (#15456) · 39b5d1a6
      SaulLu authored
      * change truncation_side in init of `PreTrainedTokenizerBase`
      Co-authored-by: LSinev <LSinev@users.noreply.github.com>
      
      * add test
      
      * Revert "replace assert with exception for `padding_side` arg in `PreTrainedTokenizerBase` `__init__`"
      
      This reverts commit 7a98b87962d2635c7e4d4f00db3948b694624843.
      
      * fix kwargs
      
      * Revert "fix kwargs"
      
      This reverts commit 67b0a5270e8cf1dbf70e6b0232e94c0452b6946f.
      
      * Update tests/test_tokenization_common.py
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
      
      * delete truncation_side variable
      
      * reorganize test
      
      * format
      
      * complete doc
      
      * Revert "Revert "replace assert with exception for `padding_side` arg in `PreTrainedTokenizerBase` `__init__`""
      
      This reverts commit d5a10a7e2680539e5d9e98ae5d896c893d224b80.
      
      * fix typo
      
      * fix typos to render documentation
      
      * Revert "Revert "Revert "replace assert with exception for `padding_side` arg in `PreTrainedTokenizerBase` `__init__`"""
      
      This reverts commit 16cf58811943a08f43409a7c83eaa330686591d0.
      
      * format
      Co-authored-by: LSinev <LSinev@users.noreply.github.com>
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
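      For context, `truncation_side` can be set directly when the tokenizer is instantiated; a minimal sketch, assuming a BERT checkpoint purely for illustration:
      ```
      # Sketch: setting truncation_side at tokenizer init (checkpoint is illustrative)
      from transformers import AutoTokenizer

      # keep the end of over-long inputs instead of the beginning
      tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased", truncation_side="left")

      encoded = tokenizer("a very long input " * 100, truncation=True, max_length=16)
      print(len(encoded["input_ids"]))  # 16
      ```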
    • Fix labels stored in model config for token classification examples (#15482) · 45cac3fa
      Sylvain Gugger authored
      * Playing
      
      * Properly set labels in model config for token classification example
      
      * Port to run_ner_no_trainer
      
      * Quality
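      For context, "labels stored in the model config" refers to the `id2label`/`label2id` mapping; a minimal sketch with a made-up label list and an illustrative checkpoint:
      ```
      # Sketch: storing the label mapping in the model config (label list is hypothetical)
      from transformers import AutoConfig, AutoModelForTokenClassification

      label_list = ["O", "B-PER", "I-PER"]  # hypothetical NER labels
      config = AutoConfig.from_pretrained(
          "bert-base-cased",
          num_labels=len(label_list),
          id2label={i: l for i, l in enumerate(label_list)},
          label2id={l: i for i, l in enumerate(label_list)},
      )
      model = AutoModelForTokenClassification.from_pretrained("bert-base-cased", config=config)
      print(model.config.id2label)
      ```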
    • Add W&B backend for hyperparameter sweep (#14582) · c74f3d4c
      Ayush Chaurasia authored
      # Add support for W&B hyperparameter sweep
      This PR:
      * allows using wandb for running hyperparameter search.
      * visualizes the runs on the W&B sweeps dashboard.
      * supports running sweeps on parallel devices, all reporting to the same central dashboard.
      
      ### Usage
      **To run a new hyperparameter search:**
      ```
      trainer.hyperparameter_search(
          backend="wandb", 
          project="transformers_sweep", # name of the project
          n_trials=5,
          metric="eval/loss", # metric to be optimized, default 'eval/loss'. A warning is raised if the passed metric is not found
      )
      ```
      This outputs a sweep id, e.g. `my_project/sweep_id`.
      
      **To run sweeps on parallel devices:**
      Just pass the sweep id you want to run in parallel:
      ```
      trainer.hyperparameter_search(
          backend="wandb", 
          sweep_id="my_project/sweep_id"
      )
      ```
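      As with the other hyperparameter-search backends, the `Trainer` must be built with a `model_init` callable so each trial starts from a fresh model; a minimal sketch, where the checkpoint and datasets are placeholders:
      ```
      # Sketch: Trainer prepared for hyperparameter_search (checkpoint/datasets are placeholders)
      from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

      def model_init():
          # re-instantiated for every trial of the sweep
          return AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

      train_dataset = eval_dataset = None  # placeholders: replace with tokenized datasets
      training_args = TrainingArguments(output_dir="out", report_to="wandb")

      trainer = Trainer(
          args=training_args,
          model_init=model_init,
          train_dataset=train_dataset,
          eval_dataset=eval_dataset,
      )
      best_run = trainer.hyperparameter_search(backend="wandb", project="transformers_sweep", n_trials=5)
      ```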