1. 09 Sep, 2022 1 commit
  2. 02 Sep, 2022 1 commit
  3. 01 Sep, 2022 2 commits
  4. 31 Aug, 2022 2 commits
  5. 08 Aug, 2022 2 commits
  6. 05 Aug, 2022 1 commit
  7. 03 Aug, 2022 1 commit
  8. 01 Aug, 2022 2 commits
  9. 27 Jul, 2022 1 commit
  10. 08 Jul, 2022 1 commit
  11. 05 Jul, 2022 1 commit
  12. 28 Jun, 2022 2 commits
  13. 27 Jun, 2022 1 commit
    • Add a TF in-graph tokenizer for BERT (#17701) · ee0d001d
      Matt authored
      * Add a TF in-graph tokenizer for BERT
      
      * Add from_pretrained
      
      * Add proper truncation, option handling to match other tokenizers
      
      * Add proper imports and guards
      
      * Add test, fix all the bugs exposed by said test
      
      * Fix truncation of paired texts in graph mode, more test updates
      
      * Small fixes, add a (very careful) test for savedmodel
      
      * Add tensorflow-text dependency, make fixup
      
      * Update documentation
      
      * Update documentation
      
      * make fixup
      
      * Slight changes to tests
      
      * Add some docstring examples
      
      * Update tests
      
      * Update tests and add proper lowercasing/normalization
      
      * make fixup
      
      * Add docstring for padding!
      
      * Mark slow tests
      
      * make fixup
      
      * Fall back to BertTokenizerFast if BertTokenizer is unavailable
      
      * Fall back to BertTokenizerFast if BertTokenizer is unavailable
      
      * make fixup
      
      * Properly handle tensorflow-text dummies
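      A minimal usage sketch for the in-graph tokenizer added in this commit, assuming the class is exposed as `TFBertTokenizer` and that `tensorflow` and `tensorflow-text` are installed; the checkpoint name is illustrative:

      import tensorflow as tf
      from transformers import TFBertTokenizer

      # Build the tokenizer from a checkpoint (illustrative name); it runs as TF
      # graph ops, so it can be traced and exported inside a SavedModel.
      tokenizer = TFBertTokenizer.from_pretrained("bert-base-uncased")

      @tf.function
      def tokenize(texts):
          # Tokenization happens entirely in-graph, including truncation/padding.
          return tokenizer(texts)

      outputs = tokenize(tf.constant(["Hello world!", "In-graph tokenization for BERT"]))
      print(outputs["input_ids"])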
  14. 17 Jun, 2022 1 commit
    • Migrate HFDeepSpeedConfig from trfrs to accelerate (#17623) · 21a77242
      Sourab Mangrulkar authored
      * Migrate HFDeepSpeedConfig from trfrs to accelerate
      
      * add `accelerate` to testing dep
      
      * addressing comments
      
      * addressing comments
      
      Using `_shared_state` and avoiding object creation. This is necessary because `notebook_launcher` in `launchers.py` throws an error when `len(AcceleratorState._shared_state) > 0`.
      
      * resolving comments
      
      1. Use the simple API from accelerate to manage the deepspeed config integration
      2. Update the related documentation
      
      * reverting changes and addressing comments
      
      * docstring correction
      
      * addressing nits
      
      * addressing nits
      
      * addressing nits 3
      
      * bumping up the accelerate version to 0.10.0
      
      * resolving import
      
      * update setup.py to include deepspeed dependencies
      
      * Update dependency_versions_table.py
      
      * fixing imports
      
      * reverting changes to CI dependencies for "run_tests_pipelines_tf*" tests
      
      These changes didn't help with resolving the failures and I believe this needs to be addressed in another PR.
      
      * removing `accelerate` as hard dependency
      
      Resolves issues related to CI Tests
      
      * adding `accelerate` as dependency for building docs
      
      resolves failure in Build PR Documentation test
      
      * adding `accelerate` as dependency in "dev" to resolve doc build issue
      
      * resolving comments
      
      1. adding `accelerate` to extras["all"]
      2. Including a check for accelerate too before importing HFDeepSpeedConfig from there
      Co-Authored-By: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * resolving comments
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
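      A rough sketch of the guarded-import pattern this commit describes (accelerate is no longer a hard dependency, so it is checked before HFDeepSpeedConfig is imported from there); the helper name and exact import path below are assumptions, not necessarily what the PR uses:

      import importlib.util

      def _is_available(pkg: str) -> bool:
          # Assumed helper: true if the package can be imported in this environment.
          return importlib.util.find_spec(pkg) is not None

      if _is_available("accelerate") and _is_available("deepspeed"):
          # After the migration, the DeepSpeed config wrapper lives in accelerate
          # (assumed path: accelerate.utils.deepspeed).
          from accelerate.utils.deepspeed import HfDeepSpeedConfig
      else:
          # Without accelerate/deepspeed, leave a sentinel so DeepSpeed-specific
          # code paths can raise a clear error instead of failing at import time.
          HfDeepSpeedConfig = None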
  15. 16 Jun, 2022 1 commit
  16. 02 Jun, 2022 2 commits
  17. 26 May, 2022 1 commit
  18. 23 May, 2022 1 commit
  19. 20 May, 2022 1 commit
  20. 12 May, 2022 2 commits
  21. 10 May, 2022 1 commit
    • [Deepspeed] add many more models to the model zoo test (#12695) · f8615044
      Stas Bekman authored
      * model zoo take 2
      
      * add deberta
      
      * new param for zero2
      
      * doc update
      
      * doc update
      
      * add layoutlm
      
      * bump deepspeed
      
      * add deberta-v2, funnel, longformer
      
      * new models
      
      * style
      
      * add t5_v1
      
      * update TAPAS status
      
      * reorg problematic models
      
      * move doc to another PR
      
      * style
      
      * fix checkpoint check test
      
      * making progress on more models running
      
      * cleanup
      
      * new version
      
      * cleanup
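      A compressed sketch of the kind of coverage this commit expands: one smoke test per (model, ZeRO stage) pair. The model list, config names, and `run_short_training` helper are illustrative placeholders, not the actual test code:

      import itertools
      import pytest

      MODELS = ["bert", "deberta", "deberta-v2", "funnel", "layoutlm", "longformer", "t5_v1"]
      STAGES = ["zero2", "zero3"]

      def run_short_training(model_name: str, stage: str) -> dict:
          # Hypothetical stand-in: the real test launches a few Trainer steps under
          # DeepSpeed with the matching ds_config_<stage>.json and collects metrics.
          return {"train_loss": 1.0}

      @pytest.mark.parametrize("model_name,stage", list(itertools.product(MODELS, STAGES)))
      def test_deepspeed_model_zoo(model_name, stage):
          # Each (model, stage) pair must finish a short run and report a loss.
          metrics = run_short_training(model_name, stage)
          assert metrics["train_loss"] > 0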
  22. 09 May, 2022 1 commit
  23. 04 May, 2022 1 commit
  24. 02 May, 2022 2 commits
  25. 29 Apr, 2022 1 commit
  26. 28 Apr, 2022 1 commit
  27. 17 Apr, 2022 1 commit
  28. 15 Apr, 2022 1 commit
  29. 06 Apr, 2022 1 commit
  30. 01 Apr, 2022 1 commit
  31. 28 Mar, 2022 1 commit
  32. 24 Mar, 2022 1 commit