1. 23 Jun, 2021 1 commit
      Return default strategy from get_distribution_strategy when given "off". · 9f9d07e9
      Reed Wanderman-Milne authored
      Previously, it returned None, but almost every caller of get_distribution_strategy() assumes an actual strategy is returned and crashes when given None. Returning the default strategy fixes these crashes and is equivalent to using no strategy, since the default strategy is always in effect when no other strategy is in use.
      
      PiperOrigin-RevId: 380951055
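      The behavior change described above can be sketched in stand-alone Python. This is a minimal illustration of the control flow, not the real helper from the TF models repo: the `_DefaultStrategy` and `_MirroredStrategy` classes are hypothetical stand-ins for the actual `tf.distribute` strategy objects.

      ```python
      class _DefaultStrategy:
          """Stand-in for TensorFlow's always-available default strategy."""
          name = "default"

      class _MirroredStrategy:
          """Stand-in for a real distribution strategy."""
          name = "mirrored"

      def get_distribution_strategy(distribution_strategy="mirrored"):
          """Sketch of the fixed helper: 'off' now yields the default strategy."""
          if distribution_strategy == "off":
              # Before the fix this returned None, crashing callers that
              # immediately used the result (e.g. strategy.scope()).
              return _DefaultStrategy()
          return _MirroredStrategy()

      print(get_distribution_strategy("off").name)       # default
      print(get_distribution_strategy("mirrored").name)  # mirrored
      ```

      With this change, callers no longer need an `is None` guard before entering the strategy's scope.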
  2. 08 Mar, 2021 1 commit
  3. 25 Feb, 2021 1 commit
  4. 05 Jan, 2021 1 commit
  5. 29 Dec, 2020 1 commit
  6. 17 Nov, 2020 1 commit
  7. 01 Oct, 2020 1 commit
      PSv2: Replace existing `tf.distribute.experimental.ParameterServerStrategy`... · 4680f2fa
      Rick Chao authored
      PSv2: Replace existing `tf.distribute.experimental.ParameterServerStrategy` usage with `tf.compat.v1.distribute.experimental.ParameterServerStrategy` to prepare for the upcoming TF2 ParameterServerStrategy API release.
      
      In practice, the only difference from this endpoint switch is in usage monitoring, which moves from V2 to V1 for those who were using `tf.distribute.experimental.ParameterServerStrategy`. The strategy is not supported in V2 and should be tracked as V1 anyway.
      
      PiperOrigin-RevId: 334847114
  8. 17 Sep, 2020 1 commit
  9. 13 Sep, 2020 1 commit
  10. 12 Aug, 2020 2 commits
  11. 10 Apr, 2020 1 commit
  12. 30 Mar, 2020 1 commit
  13. 17 Mar, 2020 1 commit
  14. 13 Feb, 2020 1 commit
  15. 27 Jan, 2020 1 commit
  16. 15 Dec, 2019 1 commit
  17. 14 Dec, 2019 2 commits
  18. 27 Nov, 2019 1 commit
  19. 24 Sep, 2019 1 commit
  20. 19 Aug, 2019 1 commit
  21. 16 Aug, 2019 2 commits
  22. 12 Aug, 2019 1 commit
      Merged commit includes the following changes: (#7430) · 03b4a0af
      Hongjun Choi authored
      262988559  by A. Unique TensorFlower<gardener@tensorflow.org>:
      
          Enable NCF TF 2.0 model to run on TPUStrategy.
      
      --
      262971756  by A. Unique TensorFlower<gardener@tensorflow.org>:
      
          Internal change
      
      262967691  by hongkuny<hongkuny@google.com>:
      
          Internal
      
      --
      
      PiperOrigin-RevId: 262988559
  23. 02 Jul, 2019 1 commit
  24. 29 Apr, 2019 1 commit
  25. 26 Apr, 2019 1 commit
  26. 25 Apr, 2019 1 commit
  27. 24 Apr, 2019 1 commit
  28. 08 Apr, 2019 1 commit
      Add DS support for NCF keras (#6447) · 1255d5b9
      Shining Sun authored
      * add ds support for ncf
      
      * remove comments for in_top_k
      
      * avoid expanding the input layers
      
      * resolve comments and fix lint
      
      * Added some comments in code and fix lint
      
      * fix lint
      
      * add some documentation
      
      * add tensorflow imports
  29. 01 Apr, 2019 1 commit
  30. 19 Mar, 2019 1 commit
  31. 07 Mar, 2019 1 commit
  32. 02 Mar, 2019 1 commit
  33. 01 Mar, 2019 1 commit
      Keras-fy NCF Model (#6092) · 048e5bff
      Shining Sun authored
      * tmp commit
      
      * tmp commit
      
      * first attempt (without eval)
      
      * Bug fixes
      
      * bug fixes
      
      * training done
      
      * Loss NaN, no eval
      
      * Loss weight problem solved
      
      * resolve the NaN loss problem
      
      * Problem solved. Clean up needed
      
      * Added a todo
      
      * Remove debug prints
      
      * Extract get_optimizer to ncf_common
      
      * Move metrics computation back to neumf; use DS.scope api
      
      * Extract DS.scope code to utils
      
      * lint fixes
      
      * Move obtaining DS above producer.start to avoid race condition
      
      * move pt 1
      
      * move pt 2
      
      * Update the run script
      
      * Wrap keras_model related code into functions
      
      * Update the doc for softmax_logitfy and change the method name
      
      * Resolve PR comments
      
      * working version with: eager, DS, batch and no masks
      
      * Remove git conflict indicator
      
      * move reshape to neumf_model
      
      * working version, not converge
      
      * converged
      
      * fix a test
      
      * more lint fix
      
      * more lint fix
      
      * more lint fixes
      
      * more lint fix
      
      * Removed unused imports
      
      * fix test
      
      * dummy commit for kicking off checks
      
      * fix lint issue
      
      * dummy input to kick off checks
      
      * dummy input to kick off checks
      
      * add collective to dist strat
      
      * addressed review comments
      
      * add a doc string
  34. 28 Feb, 2019 1 commit
  35. 21 Feb, 2019 1 commit
      Multi-worker support for Resnet. (#6206) · f2e90945
      Ayush Dubey authored
      * Update official resnet for multi worker training with distribution strategies.
      
      * Fixes for multi worker training.
      
      * Fix call to `get_distribution_strategy`.
      
      * Undo test change.
      
      * Fix spacing.
      
      * Move cluster configuration to distribution_utils.
      
      * Move train_and_evaluate out of loop.  Also, update docstrings for multi-worker flags and add use_train_and_evaluate flag.
      
      * Update distribution_strategy flag to match exported name for collective strategy.
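      The multi-worker cluster configuration this PR moves into distribution_utils ultimately comes down to populating the `TF_CONFIG` environment variable, which TensorFlow's multi-worker strategies read as a JSON document. A minimal sketch, assuming hypothetical worker addresses and written without an actual TensorFlow import:

      ```python
      import json
      import os

      # Hypothetical worker hosts; a real job would supply its own addresses,
      # typically from flags like the worker_hosts/task_index flags this PR adds.
      worker_hosts = ["10.0.0.1:5000", "10.0.0.2:5000"]
      task_index = 0

      # TF_CONFIG describes the full cluster plus this process's own role in it.
      tf_config = {
          "cluster": {"worker": worker_hosts},
          "task": {"type": "worker", "index": task_index},
      }
      os.environ["TF_CONFIG"] = json.dumps(tf_config)

      print(json.loads(os.environ["TF_CONFIG"])["task"])  # {'type': 'worker', 'index': 0}
      ```

      Each worker sets the same `cluster` dict but its own `task.index`; the collective (multi-worker mirrored) strategy then discovers its peers from this variable when constructed.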
  36. 14 Feb, 2019 1 commit
  37. 13 Feb, 2019 1 commit