"docs/zh_cn/vscode:/vscode.git/clone" did not exist on "0bcbeadb532b32f268d6dfb7d3d4af86b2bdc4ac"
  1. 24 Dec, 2018 1 commit
  2. 21 Dec, 2018 1 commit
  3. 20 Dec, 2018 2 commits
  4. 21 Nov, 2018 1 commit
    • cross_tower_ops -> cross_device_ops (#5776) · 9a4848a2
      josh11b authored
      We've deprecated the "tower" terminology in DistributionStrategy, so the "cross_tower_ops" argument is now "cross_device_ops", matching the current name of "AllReduceCrossDeviceOps".
      9a4848a2
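      A minimal sketch of how the renamed argument is passed, assuming the
      tf.distribute API surface (the exact call sites and helper signatures in
      this repo may differ):

          import tensorflow as tf

          # The keyword was previously `cross_tower_ops`; it is now
          # `cross_device_ops`, matching AllReduceCrossDeviceOps.
          strategy = tf.distribute.MirroredStrategy(
              cross_device_ops=tf.distribute.HierarchicalCopyAllReduce())

          # The strategy is then handed to the Estimator as before.
          config = tf.estimator.RunConfig(train_distribute=strategy)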
  5. 25 Oct, 2018 1 commit
  6. 24 Oct, 2018 1 commit
  7. 12 Oct, 2018 1 commit
  8. 12 Jun, 2018 1 commit
    • Transformer multi gpu, remove multi_gpu flag, distribution helper functions (#4457) · 29c9f985
      Katherine Wu authored
      * Add DistributionStrategy to transformer model
      
      * add num_gpu flag
      
      * Calculate per device batch size for transformer
      
      * remove reference to flags_core
      
      * Add synthetic data option to transformer
      
      * fix typo
      
      * add import back in
      
      * Use hierarchical copy
      
      * address PR comments
      
      * lint
      
      * fix spaces
      
      * group train op together to fix single GPU error
      
      * Fix translate bug (sorted_keys is a dict, not a list)
      
      * Change params to a default dict (translate.py was throwing errors because params didn't have the TPU parameters.)
      
      * Address PR comments. Removed multi gpu flag + more
      
      * fix lint
      
      * fix more lints
      
      * add todo for Synthetic dataset
      
      * Update docs
      29c9f985
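      The bullets above about the per-device batch size and the defaultdict-backed
      params suggest two small mechanics. A hypothetical sketch of both; the helper
      name, signature, and default value are assumptions, not taken from the repo:

          from collections import defaultdict

          def per_device_batch_size(batch_size, num_gpus):
              """Split the global batch size evenly across the available GPUs."""
              if num_gpus <= 1:
                  return batch_size
              if batch_size % num_gpus != 0:
                  raise ValueError(
                      "Global batch size %d is not divisible by %d GPUs."
                      % (batch_size, num_gpus))
              return batch_size // num_gpus

          # A defaultdict means lookups for absent (e.g. TPU-only) parameters
          # return a harmless default instead of raising KeyError in translate.py.
          params = defaultdict(lambda: None, {"batch_size": 4096, "hidden_size": 512})
          assert params["use_tpu"] is None  # missing key, but no KeyError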