1. 06 Feb, 2023 1 commit
    • Update quality tooling for formatting (#21480) · 6f79d264
      Sylvain Gugger authored
      * Result of black 23.1
      
      * Update target to Python 3.7
      
      * Switch flake8 to ruff
      
      * Configure isort
      
      * Configure isort
      
      * Apply isort with line limit
      
      * Put the right black version
      
      * adapt black in check copies
      
      * Fix copies
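      The formatting change itself is mechanical; a minimal sketch of reproducing it with black's Python API (the line length and snippet are illustrative assumptions, not the repository's exact settings):

      ```python
      import black

      # Format a snippet the way the updated tooling would: black 23.x style,
      # targeting Python 3.7 syntax.
      mode = black.Mode(target_versions={black.TargetVersion.PY37}, line_length=119)
      print(black.format_str("x = {  'a':37,'b':42 }", mode=mode))
      ```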
  2. 18 Jan, 2023 2 commits
    • Add AWS Neuron torchrun support (#20806) · c59d71b2
      jeffhataws authored
      * Add XLA torchrun support
      
      * Clarify that DDP doesn't work with the torch.distributed XLA backend yet
      
      * Enable DDP with torchrun and XLA (now available in PT-XLA 1.13)
      
      * Add check for AWS Neuron availability and AWS Neuron specific compiler flag
      
      * Change the new test's name to TestTrainerDistributedNeuronCore
      
      * Replace "assert" with a raised exception
      
      * Remove compiler flag as it is optional. If needed, will be another PR.
      
      * Use TORCHELASTIC_RUN_ID to determine whether torchrun is used
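      The torchrun detection in the last bullet reduces to an environment check; a minimal sketch, with a helper name of our own rather than the Trainer's:

      ```python
      import os

      def launched_with_torchrun() -> bool:
          # torchrun (TorchElastic) exports TORCHELASTIC_RUN_ID to every
          # worker it spawns, so its presence signals a torchrun launch.
          return "TORCHELASTIC_RUN_ID" in os.environ
      ```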
    • Adapt repository creation to latest hf_hub (#21158) · 05e72aa0
      Sylvain Gugger authored
      * Adapt repository creation to latest hf_hub
      
      * Update all examples
      
      * Fix other tests, add Flax examples
      
      * Address review comments
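      A sketch of the newer huggingface_hub creation flow this commit adapts to, where a single repo_id replaces separate name/organization arguments (the model name is illustrative):

      ```python
      from huggingface_hub import create_repo, get_full_repo_name

      repo_name = get_full_repo_name("my-finetuned-model")  # -> "username/my-finetuned-model"
      repo_url = create_repo(repo_name, exist_ok=True)  # no error if the repo already exists
      ```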
  3. 18 Nov, 2022 1 commit
    • Add AnyPrecisionAdamW optimizer (#18961) · 84c9cc6d
      atturaioe authored
      * Add AnyPrecisionAdamW optimizer
      
      * Add optim_args argument to TrainingArgs
      
      * Add tests for AnyPrecisionOptimizer
      
      * Change AnyPrecisionAdam default params to float32
      
      * Move default_anyprecision_kwargs in trainer test
      
      * Rename AnyPrecisionAdamW
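      A sketch of selecting the optimizer through the Trainer, assuming the "adamw_anyprecision" optim value and the comma-separated optim_args string this PR introduces; AnyPrecisionAdamW itself is provided by torchdistx:

      ```python
      from transformers import TrainingArguments

      args = TrainingArguments(
          output_dir="out",
          optim="adamw_anyprecision",
          # key=value pairs forwarded to the optimizer; per the commit above,
          # the dtype parameters default to float32.
          optim_args="use_kahan_summation=True,momentum_dtype=bfloat16",
      )
      ```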
  4. 13 Jul, 2022 1 commit
    • Enable torchdynamo with torch_tensorrt (fx path) (#17765) · 7ea6ccc2
      Wei authored
      * enable fx2trt
      
      * Update perf_train_gpu_one.mdx
      
      * Update perf_train_gpu_one.mdx
      
      * add lib check
      
      * update
      
      * format
      
      * update
      
      * fix import check
      
      * fix isort
      
      * improve doc
      
      * refactor ctx manager
      
      * fix isort
      
      * black format
      
      * isort fix
      
      * fix format
      
      * update args
      
      * update black
      
      * cleanups
      
      * Update perf_train_gpu_one.mdx
      
      * code refactor
      
      * code refactor to init
      
      * remove redundancy
      
      * isort
      
      * replace self.args with args
      Co-authored-by: Stas Bekman <stas@stason.org>
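      A sketch of opting into the TensorRT fx backend from the Trainer, assuming "fx2trt" is among the values the torchdynamo training argument accepted at the time (torch_tensorrt must be installed):

      ```python
      from transformers import TrainingArguments

      args = TrainingArguments(
          output_dir="out",
          torchdynamo="fx2trt",  # assumed value; a "fx2trt-fp16" variant enabled fp16
      )
      ```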
  5. 23 Mar, 2022 1 commit
    • Reorganize file utils (#16264) · 4975002d
      Sylvain Gugger authored
      * Split file_utils in several submodules
      
      * Fixes
      
      * Add back more objects
      
      * More fixes
      
      * Who exactly decided to import that from there?
      
      * Second suggestion from code review
      
      * Revert wrong move
      
      * Fix imports
      
      * Adapt all imports
      
      * Adapt all imports everywhere
      
      * Revert this import, will fix in a separate commit
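      In practice the split mostly changes import paths; a sketch, assuming file_utils keeps re-exporting the old names for backward compatibility:

      ```python
      # Old location, before the split:
      from transformers.file_utils import is_torch_available

      # New home after this commit, under transformers.utils:
      from transformers.utils import is_torch_available
      ```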