"tests/models/bert/test_modeling_bert.py" did not exist on "9c83b96e627427aeeada7a84bae026cafc17ccd2"
  1. 17 Jun, 2022 1 commit
    • Migrate HFDeepSpeedConfig from trfrs to accelerate (#17623) · 21a77242
      Sourab Mangrulkar authored

      * Migrate HFDeepSpeedConfig from trfrs to accelerate
      
      * add `accelerate` to testing dep
      
      * addressing comments
      
      * addressing comments
      
      Using `_shared_state` and avoiding object creation. This is necessary because `notebook_launcher` in `launchers.py` checks `len(AcceleratorState._shared_state) > 0` to throw an error.
      
      * resolving comments
      
      1. Use simple API from accelerate to manage the deepspeed config integration
      2. Update the related documentation
      
      * reverting changes and addressing comments
      
      * docstring correction
      
      * addressing nits
      
      * addressing nits
      
      * addressing nits 3
      
      * bumping up the accelerate version to 0.10.0
      
      * resolving import
      
      * update setup.py to include deepspeed dependencies
      
      * Update dependency_versions_table.py
      
      * fixing imports
      
      * reverting changes to CI dependencies for "run_tests_pipelines_tf*" tests
      
      These changes didn't help with resolving the failures and I believe this needs to be addressed in another PR.
      
      * removing `accelerate` as hard dependency
      
      Resolves issues related to CI Tests
      
      * adding `accelerate` as dependency for building docs
      
      resolves failure in Build PR Documentation test
      
      * adding `accelerate` as dependency in "dev" to resolve doc build issue
      
      * resolving comments
      
      1. adding `accelerate` to extras["all"]
      2. Including a check for accelerate too before importing HFDeepSpeedConfig from there
      Co-Authored-By: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * resolving comments
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
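The `_shared_state` check mentioned in the commit above relies on accelerate's borg-style state: every `AcceleratorState` instance aliases one class-level dict, so a non-empty dict means something already initialized it. A minimal sketch of the pattern (the `SharedState` class and `already_initialized` helper are hypothetical illustrations, not accelerate's actual code):

```python
class SharedState:
    # Borg pattern: every instance aliases one class-level dict, so any
    # attribute set on one instance is visible through all of them.
    _shared_state = {}

    def __init__(self, **kwargs):
        self.__dict__ = self._shared_state
        self.__dict__.update(kwargs)


def already_initialized() -> bool:
    # A non-empty shared dict means some instance has been created and
    # populated it; checking the length avoids constructing a new instance,
    # which would itself mutate the shared state.
    return len(SharedState._shared_state) > 0


assert not already_initialized()
SharedState(device="cpu")
assert already_initialized()
```

This is the same shape as the `len(AcceleratorState._shared_state) > 0` check: it detects prior initialization without triggering `AcceleratorState.__init__`.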
  2. 16 Jun, 2022 1 commit
  3. 02 Jun, 2022 2 commits
  4. 26 May, 2022 1 commit
  5. 23 May, 2022 1 commit
  6. 20 May, 2022 1 commit
  7. 12 May, 2022 2 commits
  8. 10 May, 2022 1 commit
    • [Deepspeed] add many more models to the model zoo test (#12695) · f8615044
      Stas Bekman authored
      * model zoo take 2
      
      * add deberta
      
      * new param for zero2
      
      * doc update
      
      * doc update
      
      * add layoutlm
      
      * bump deepspeed
      
      * add deberta-v2, funnel, longformer
      
      * new models
      
      * style
      
      * add t5_v1
      
      * update TAPAS status
      
      * reorg problematic models
      
      * move doc to another PR
      
      * style
      
      * fix checkpoint check test
      
      * making progress on more models running
      
      * cleanup
      
      * new version
      
      * cleanup
  9. 09 May, 2022 1 commit
  10. 04 May, 2022 1 commit
  11. 02 May, 2022 2 commits
  12. 29 Apr, 2022 1 commit
  13. 28 Apr, 2022 1 commit
  14. 17 Apr, 2022 1 commit
  15. 15 Apr, 2022 1 commit
  16. 06 Apr, 2022 1 commit
  17. 01 Apr, 2022 1 commit
  18. 28 Mar, 2022 1 commit
  19. 24 Mar, 2022 1 commit
  20. 23 Mar, 2022 1 commit
  21. 18 Mar, 2022 1 commit
  22. 12 Mar, 2022 1 commit
    • [Deepspeed] add support for bf16 mode (#14569) · 580dd87c
      Stas Bekman authored

      * [WIP] add support for bf16 mode
      
      * prep for bf16
      
      * prep for bf16
      
      * fix; zero2/bf16 is ok
      
      * check bf16 is available
      
      * test fixes
      
      * enable zero3_bf16
      
      * config files
      
      * docs
      
      * split stage_dtype; merge back to non-dtype-specific config file
      
      * fix doc
      
      * cleanup
      
      * cleanup
      
      * bfloat16 => bf16 to match the PR changes
      
      * s/zero_gather_fp16_weights_on_model_save/zero_gather_16bit_weights_on_model_save/; s/save_fp16_model/save_16bit_model/
      
      * test fixes/skipping
      
      * move
      
      * fix
      
      * Update docs/source/main_classes/deepspeed.mdx
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * backticks
      
      * cleanup
      
      * cleanup
      
      * cleanup
      
      * new version
      
      * add note about grad accum in bf16
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
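The bf16 mode added in the commit above is driven by the DeepSpeed config file. An illustrative fragment, written as a Python dict (the values are examples and the key names reflect DeepSpeed's JSON schema as best I know it, so treat them as an assumption rather than the exact config from this PR):

```python
# Illustrative DeepSpeed config enabling bf16 with ZeRO stage 3.
# "bf16" is the key name after the bfloat16 => bf16 rename in this PR;
# the stage3 gather key mirrors the fp16 -> 16bit renames noted above.
ds_config = {
    "bf16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,
        "stage3_gather_16bit_weights_on_model_save": True,
    },
    "gradient_accumulation_steps": "auto",
}
```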
  23. 03 Mar, 2022 1 commit
  24. 01 Mar, 2022 1 commit
  25. 18 Feb, 2022 1 commit
  26. 15 Feb, 2022 1 commit
  27. 09 Feb, 2022 1 commit
  28. 28 Jan, 2022 1 commit
  29. 27 Jan, 2022 2 commits
  30. 18 Jan, 2022 1 commit
  31. 17 Jan, 2022 1 commit
  32. 14 Jan, 2022 1 commit
  33. 30 Dec, 2021 1 commit
    • Enabling `tokenizers` upgrade. (#14941) · 08cb5718
      Nicolas Patry authored
      * Enabling `tokenizers` upgrade.
      
      * Moved ugly comment.
      
      * Tokenizers==0.11.1 needs an update to keep borrow checker happy in highly contiguous calls.
      
      * Support both 0.11.1 and 0.11.0
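Supporting both 0.11.0 and 0.11.1, as the last commit message says, usually comes down to a version gate on the installed package. A minimal stdlib-only sketch (the `parse_version` and `tokenizers_supported` helpers are hypothetical, not the code from this PR):

```python
def parse_version(ver: str) -> tuple:
    # "0.11.1" -> (0, 11, 1); tuples compare element-wise, which is
    # sufficient for plain numeric versions like these.
    return tuple(int(part) for part in ver.split("."))


def tokenizers_supported(installed: str) -> bool:
    # The commit supports exactly tokenizers 0.11.0 and 0.11.1.
    return parse_version("0.11.0") <= parse_version(installed) <= parse_version("0.11.1")


assert tokenizers_supported("0.11.1")
assert not tokenizers_supported("0.12.0")
```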
  34. 22 Dec, 2021 2 commits
  35. 17 Dec, 2021 1 commit