1. 25 Aug, 2023 1 commit
  2. 31 May, 2023 1 commit
• accelerate deepspeed and gradient accumulation integrate (#23236) · a73b1d59
      Sourab Mangrulkar authored
      * mixed precision support via accelerate
      
      * fix issues
      
      * fix for the sharded ddp case
      
      * fix flax and tf failing tests
      
* refactor the place to create the `Accelerator` object
      
      * move ddp prep to accelerate
      
      * fix 😅
      
      * resolving comments
      
      * move fsdp handling to accelerate
      
* fixes
      
      * fix saving
      
      * shift torch dynamo handling to accelerate
      
      * shift deepspeed integration and save & load utils to accelerate
      
      * fix accelerate launcher support
      
      * oops
      
      * fix 🐛
      
      * save ckpt fix
      
      * Trigger CI
      
      * nasty 🐛 😅
      
      * as deepspeed needs grad_acc fixes, transfer grad_acc to accelerate
      
      * make tests happy
      
      * quality 
      
      * loss tracked needs to account for grad_acc
      
      * fixing the deepspeed tests
      
      * quality 
      
      * 😅😅😅
      
      * tests 😡
      
* quality

* Trigger CI
      
* resolve comments and fix the issue with the previous branch merge
      
      * Trigger CI
      
      * accelerate took over deepspeed integration
      
      ---------
Co-authored-by: Stas Bekman <stas@stason.org>
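The upshot of #23236 is that the Trainer now delegates mixed precision, DDP/FSDP preparation, and gradient accumulation to Accelerate. Below is a minimal sketch of the Accelerate-side API this builds on; the toy model, optimizer, and dataloader are stand-ins for what the Trainer constructs internally, not Trainer code.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

# Toy stand-ins; the real Trainer builds these from TrainingArguments.
model = torch.nn.Linear(8, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataloader = DataLoader(TensorDataset(torch.randn(32, 8), torch.randn(32, 1)), batch_size=4)

# Gradient accumulation (and mixed precision, via mixed_precision=...) now lives here.
accelerator = Accelerator(gradient_accumulation_steps=4)
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

model.train()
for x, y in dataloader:
    # accumulate() defers the gradient sync and the actual optimizer step
    # until an accumulation boundary is reached.
    with accelerator.accumulate(model):
        loss = torch.nn.functional.mse_loss(model(x), y)
        accelerator.backward(loss)
        optimizer.step()
        optimizer.zero_grad()
```

As the "loss tracked needs to account for grad_acc" message notes, any reported running loss has to be averaged over the accumulation steps, not per micro-batch.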
  3. 11 Apr, 2023 1 commit
  4. 09 Mar, 2023 1 commit
  5. 23 Feb, 2023 1 commit
  6. 22 Feb, 2023 1 commit
  7. 08 Feb, 2023 1 commit
  8. 06 Feb, 2023 1 commit
• Update quality tooling for formatting (#21480) · 6f79d264
      Sylvain Gugger authored
      * Result of black 23.1
      
      * Update target to Python 3.7
      
      * Switch flake8 to ruff
      
      * Configure isort
      
      * Configure isort
      
      * Apply isort with line limit
      
      * Put the right black version
      
      * adapt black in check copies
      
      * Fix copies
  9. 16 Jun, 2022 1 commit
  10. 06 Jun, 2022 1 commit
  11. 03 Jun, 2022 1 commit
  12. 02 Jun, 2022 1 commit
  13. 10 May, 2022 1 commit
• [Deepspeed] add many more models to the model zoo test (#12695) · f8615044
      Stas Bekman authored
      * model zoo take 2
      
      * add deberta
      
      * new param for zero2
      
      * doc update
      
      * doc update
      
      * add layoutlm
      
      * bump deepspeed
      
      * add deberta-v2, funnel, longformer
      
      * new models
      
      * style
      
      * add t5_v1
      
      * update TAPAS status
      
      * reorg problematic models
      
      * move doc to another PR
      
      * style
      
      * fix checkpoint check test
      
      * making progress on more models running
      
      * cleanup
      
      * new version
      
      * cleanup
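As a rough illustration of what this entry's "model zoo" growth amounts to, here is a hypothetical pytest-style sketch of a test matrix over architectures and ZeRO stages; the names and the test body are illustrative, not the suite's real identifiers.

```python
import pytest

# Illustrative subset drawn from the commit messages above.
MODELS = ["deberta", "deberta-v2", "funnel", "layoutlm", "longformer", "t5_v1"]
ZERO_STAGES = ["zero2", "zero3"]

@pytest.mark.parametrize("stage", ZERO_STAGES)
@pytest.mark.parametrize("model", MODELS)
def test_deepspeed_model_zoo(model, stage):
    # Hypothetical body: launch a tiny fine-tuning run for `model` under
    # the given ZeRO stage and assert it trains without error.
    ...
```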
  14. 15 Apr, 2022 1 commit
  15. 23 Mar, 2022 1 commit
• Reorganize file utils (#16264) · 4975002d
      Sylvain Gugger authored
      * Split file_utils in several submodules
      
      * Fixes
      
      * Add back more objects
      
      * More fixes
      
      * Who exactly decided to import that from there?
      
* Second suggestion from code review
      
* Revert wrong move
      
      * Fix imports
      
      * Adapt all imports
      
      * Adapt all imports everywhere
      
      * Revert this import, will fix in a separate commit
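The practical consequence of #16264 for downstream code: helpers move out of the monolithic file_utils into submodules re-exported from transformers.utils, while the old path keeps working. A small sketch, using is_torch_available as a representative helper:

```python
# Old location; kept importable for backward compatibility after the split.
from transformers.file_utils import is_torch_available

# New canonical location after the reorganization.
from transformers.utils import is_torch_available

print(is_torch_available())
```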
  16. 12 Mar, 2022 1 commit
• [Deepspeed] add support for bf16 mode (#14569) · 580dd87c
Stas Bekman authored
* [WIP] add support for bf16 mode
      
      * prep for bf16
      
      * prep for bf16
      
      * fix; zero2/bf16 is ok
      
      * check bf16 is available
      
      * test fixes
      
      * enable zero3_bf16
      
      * config files
      
      * docs
      
      * split stage_dtype; merge back to non-dtype-specific config file
      
      * fix doc
      
      * cleanup
      
      * cleanup
      
      * bfloat16 => bf16 to match the PR changes
      
      * s/zero_gather_fp16_weights_on_model_save/zero_gather_16bit_weights_on_model_save/; s/save_fp16_model/save_16bit_model/
      
      * test fixes/skipping
      
      * move
      
      * fix
      
      * Update docs/source/main_classes/deepspeed.mdx
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * backticks
      
      * cleanup
      
      * cleanup
      
      * cleanup
      
      * new version
      
      * add note about grad accum in bf16
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
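A minimal sketch of what the new bf16 mode looks like from the user side, assuming the standard HF pattern of handing a DeepSpeed config dict to TrainingArguments; "auto" values are resolved from the arguments by the integration, and bf16 itself assumes Ampere-or-newer hardware.

```python
from transformers import TrainingArguments

# Minimal ZeRO-2 DeepSpeed config running in bf16 rather than fp16.
ds_config = {
    "bf16": {"enabled": "auto"},  # resolved from TrainingArguments(bf16=True)
    "zero_optimization": {"stage": 2},
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
}

# Per this PR's closing note: gradient accumulation under bf16 accumulates
# in low precision, which can be lossy.
args = TrainingArguments(output_dir="out", bf16=True, deepspeed=ds_config)
```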
  17. 02 Mar, 2022 1 commit
  18. 23 Feb, 2022 1 commit
  19. 03 Feb, 2022 1 commit
  20. 07 Dec, 2021 1 commit
  21. 23 Nov, 2021 1 commit
  22. 11 Nov, 2021 1 commit
  23. 08 Nov, 2021 1 commit
  24. 30 Aug, 2021 1 commit
  25. 23 Jul, 2021 1 commit
  26. 14 Jul, 2021 1 commit
  27. 13 Jul, 2021 1 commit
  28. 22 Jun, 2021 1 commit
  29. 08 Jun, 2021 2 commits
  30. 04 Jun, 2021 1 commit
  31. 02 Jun, 2021 2 commits
  32. 01 Jun, 2021 1 commit
  33. 21 May, 2021 1 commit
  34. 06 May, 2021 1 commit
  35. 30 Apr, 2021 1 commit
• [DeepSpeed] fp32 support (#11499) · 4e7bf94e
      Stas Bekman authored
      * prep for deepspeed==0.3.16
      
      * new version
      
      * too soon
      
      * support and test fp32 mode
      
      * troubleshooting doc start
      
      * workaround no longer needed
      
      * add fp32 doc
      
      * style
      
      * cleanup, add tf32 note
      
      * clarify
      
      * release was made
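A sketch of the fp32 mode this PR adds: full float32 under DeepSpeed simply means leaving the 16-bit modes disabled in the config, which is useful e.g. for models pretrained in bf16 that overflow under fp16. The tf32 remark maps to PyTorch's Ampere TF32 switch. Passing the config as a dict assumes the current integration's API.

```python
import torch
from transformers import TrainingArguments

# DeepSpeed in full fp32: leave fp16 disabled (and no "bf16" block either).
ds_config = {
    "fp16": {"enabled": False},
    "zero_optimization": {"stage": 2},
    "train_micro_batch_size_per_gpu": "auto",
}
args = TrainingArguments(output_dir="out", deepspeed=ds_config)

# tf32 note: on Ampere GPUs, fp32 matmuls can still take the faster TF32 path.
torch.backends.cuda.matmul.allow_tf32 = True
```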
  36. 26 Apr, 2021 3 commits