1. 18 Jan, 2023 1 commit
  2. 12 Jan, 2023 1 commit
  3. 20 Dec, 2022 1 commit
  4. 30 Nov, 2022 1 commit
  5. 25 Nov, 2022 1 commit
  6. 18 Nov, 2022 1 commit
    • Add AnyPrecisionAdamW optimizer (#18961) · 84c9cc6d
      atturaioe authored
      * Add AnyPrecisionAdamW optimizer
      
      * Add optim_args argument to TrainingArgs
      
      * Add tests for AnyPrecisionOptimizer
      
      * Change AnyPrecisionAdam default params to float32
      
      * Move default_anyprecision_kwargs in trainer test
      
      * Rename AnyPrecisionAdamW
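      The AnyPrecisionAdamW optimizer added above keeps optimizer state in reduced precision (e.g. bf16) and relies on Kahan (compensated) summation so that small parameter updates are not silently rounded away. A minimal sketch of that compensation trick in plain Python — illustrative only, not the optimizer's actual code:

      ```python
      def kahan_sum(values):
          """Compensated summation: track the rounding error and feed it back."""
          total = 0.0
          comp = 0.0  # running compensation for lost low-order bits
          for v in values:
              y = v - comp             # re-inject the bits lost last step
              t = total + y            # big + small: low-order bits of y may be lost
              comp = (t - total) - y   # recover exactly what was lost this step
              total = t
          return total

      # Naive float64 summation of ten 0.1s drifts off 1.0; the compensated
      # version stays on it. In bf16 optimizer state the effect is far larger.
      drift = sum([0.1] * 10)
      stable = kahan_sum([0.1] * 10)
      ```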
  7. 15 Sep, 2022 1 commit
  8. 12 Aug, 2022 1 commit
  9. 13 Jul, 2022 1 commit
    • Enable torchdynamo with torch_tensorrt(fx path) (#17765) · 7ea6ccc2
      Wei authored
      
      
      * enable fx2trt
      
      * Update perf_train_gpu_one.mdx
      
      * Update perf_train_gpu_one.mdx
      
      * add lib check
      
      * update
      
      * format
      
      * update
      
      * fix import check
      
      * fix isort
      
      * improve doc
      
      * refactor ctx manager
      
      * fix isort
      
      * black format
      
      * isort fix
      
      * fix format
      
      * update args
      
      * update black
      
      * cleanups
      
      * Update perf_train_gpu_one.mdx
      
      * code refactor
      
      * code refactor to init
      
      * remove redundancy
      
      * isort
      
      * replace self.args with args
      Co-authored-by: Stas Bekman <stas@stason.org>
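      The "refactor ctx manager" steps above wrap backend selection for torchdynamo in a context manager that validates the requested backend (with a library check) before the training step runs. A minimal sketch of that pattern in plain Python — the backend registry and names here are hypothetical stand-ins, not the actual transformers code:

      ```python
      from contextlib import contextmanager

      # Hypothetical registry standing in for the available dynamo backends;
      # the real fx2trt path would compile the step via torch_tensorrt.
      _BACKENDS = {
          "eager": lambda step: step,
          "fx2trt": lambda step: step,
      }

      @contextmanager
      def dynamo_ctx(backend_name):
          """Validate the backend up front, then yield a wrapper for the step."""
          if backend_name not in _BACKENDS:
              raise ValueError(f"Unknown torchdynamo backend: {backend_name!r}")
          yield _BACKENDS[backend_name]

      def training_step(batch):
          return sum(batch)  # stand-in for the loss computation

      with dynamo_ctx("fx2trt") as wrap:
          loss = wrap(training_step)([1.0, 2.0, 3.0])
      ```

      Centralizing the check in the context manager means an unsupported or missing backend fails fast, before any training work starts.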
  10. 12 Jul, 2022 1 commit
  11. 01 Jul, 2022 1 commit
  12. 30 Jun, 2022 1 commit
  13. 28 Jun, 2022 1 commit
  14. 21 Jun, 2022 1 commit
  15. 20 Jun, 2022 1 commit
  16. 14 Jun, 2022 1 commit
  17. 08 Jun, 2022 1 commit
  18. 25 May, 2022 1 commit
  19. 18 May, 2022 1 commit
  20. 16 May, 2022 1 commit
  21. 11 May, 2022 1 commit
  22. 09 May, 2022 1 commit
  23. 03 May, 2022 2 commits
  24. 19 Apr, 2022 2 commits
  25. 29 Mar, 2022 1 commit
  26. 23 Mar, 2022 1 commit
    • Reorganize file utils (#16264) · 4975002d
      Sylvain Gugger authored
      * Split file_utils in several submodules
      
      * Fixes
      
      * Add back more objects
      
      * More fixes
      
      * Who exactly decided to import that from there?
      
      * Second round of suggestions from code review
      
      * Revert wrong move
      
      * Fix imports
      
      * Adapt all imports
      
      * Adapt all imports everywhere
      
      * Revert this import, will fix in a separate commit
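      The split above moves implementations into submodules while the old `file_utils` module re-exports them, so existing imports keep working unchanged. The pattern, sketched with throwaway in-memory modules (the submodule and function names are illustrative, not the actual transformers layout):

      ```python
      import types

      # Two new submodules that now own the actual implementations.
      generic = types.ModuleType("utils.generic")
      generic.flatten = lambda nested: [x for sub in nested for x in sub]

      hub = types.ModuleType("utils.hub")
      hub.cached_path = lambda name: f"cache/{name}"

      # The legacy module becomes a thin facade: it re-exports the moved
      # objects so `from file_utils import flatten` keeps working unchanged.
      file_utils = types.ModuleType("file_utils")
      file_utils.flatten = generic.flatten
      file_utils.cached_path = hub.cached_path
      ```

      On disk the facade is simply a `file_utils.py` containing `from .utils.x import ...` lines, which is why the PR's bulk of work is "Adapt all imports everywhere".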
  27. 08 Mar, 2022 1 commit
  28. 23 Feb, 2022 1 commit
  29. 09 Feb, 2022 1 commit
  30. 03 Feb, 2022 1 commit
  31. 02 Feb, 2022 1 commit
    • Add W&B backend for hyperparameter sweep (#14582) · c74f3d4c
      Ayush Chaurasia authored
      # Add support for W&B hyperparameter sweep
      This PR:
      * allows using wandb to run hyperparameter search.
      * visualizes the runs on the W&B sweeps dashboard.
      * supports running sweeps on parallel devices, all reporting to the same central dashboard.
      
      ### Usage
      **To run a new hyperparameter search:**
      ```
      trainer.hyperparameter_search(
          backend="wandb", 
          project="transformers_sweep", # name of the project
          n_trials=5,
          metric="eval/loss", # metric to be optimized, default 'eval/loss'. A warning is raised if the passed metric is not found
      )
      ```
      This outputs a sweep id, e.g. `my_project/sweep_id`.
      
      **To run sweeps on parallel devices:**
      Just pass the sweep id you want to run in parallel:
      ```
      trainer.hyperparameter_search(
          backend="wandb", 
          sweep_id="my_project/sweep_id"
      )
      ```
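      Under the hood, a sweep backend repeatedly samples a configuration, runs a trial, and keeps the best metric. A minimal self-contained random-search sketch of that loop — the function and parameter names are illustrative, not the wandb or `hyperparameter_search` API:

      ```python
      import random

      def run_sweep(objective, space, n_trials=5, seed=0):
          """Sample configs from `space`, keep the one minimizing `objective`."""
          rng = random.Random(seed)
          best_score, best_params = float("inf"), None
          for _ in range(n_trials):
              params = {name: rng.choice(choices) for name, choices in space.items()}
              score = objective(params)  # in a real sweep: eval/loss of one training run
              if score < best_score:
                  best_score, best_params = score, params
          return best_score, best_params

      # Toy search space and objective standing in for a training run.
      space = {"learning_rate": [1e-5, 3e-5, 5e-5], "batch_size": [16, 32]}
      score, params = run_sweep(lambda p: p["learning_rate"] * p["batch_size"], space)
      ```

      Running several such agents against one shared sweep id is what lets parallel devices report to the same central dashboard.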
  32. 13 Jan, 2022 1 commit
  33. 11 Jan, 2022 1 commit
  34. 23 Dec, 2021 1 commit
  35. 16 Dec, 2021 1 commit
  36. 03 Dec, 2021 1 commit
  37. 01 Dec, 2021 1 commit
  38. 18 Nov, 2021 1 commit