1. 27 Aug, 2021 1 commit
  2. 23 Aug, 2021 1 commit
  3. 12 Aug, 2021 1 commit
  4. 11 Aug, 2021 1 commit
  5. 06 Aug, 2021 1 commit
  6. 05 Aug, 2021 1 commit
  7. 03 Aug, 2021 1 commit
  8. 02 Aug, 2021 1 commit
  9. 01 Aug, 2021 1 commit
  10. 24 Jul, 2021 1 commit
  11. 09 Jul, 2021 1 commit
    • With float16, always use LossScaleOptimizer. · be3575f5
      Reed Wanderman-Milne authored
      Previously, it was too easy to forget to set runtime.loss_scale, which always had to be set when mixed precision was used; otherwise the model would converge to worse accuracy. Now, enabling mixed precision only requires setting runtime.mixed_precision_dtype=float16.
      
      PiperOrigin-RevId: 383767033
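      The behavior change can be sketched in plain Python. This is a hypothetical illustration, not the actual Model Garden code: the field names mirror the commit message, and `RuntimeConfig` / `should_use_loss_scale_optimizer` are assumed names.

      ```python
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class RuntimeConfig:
          # Field names follow the commit message; defaults are assumptions.
          mixed_precision_dtype: Optional[str] = None
          loss_scale: Optional[str] = None

      def should_use_loss_scale_optimizer(runtime: RuntimeConfig) -> bool:
          # New behavior per the commit: float16 alone is enough to enable
          # loss scaling, so float16 gradients cannot silently underflow
          # just because loss_scale was left unset.
          return runtime.mixed_precision_dtype == "float16"

      # Only the dtype needs to be set now:
      assert should_use_loss_scale_optimizer(
          RuntimeConfig(mixed_precision_dtype="float16"))
      # Full precision still skips loss scaling:
      assert not should_use_loss_scale_optimizer(RuntimeConfig())
      ```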
  12. 05 Jul, 2021 1 commit
  13. 01 Jul, 2021 1 commit
  14. 30 Jun, 2021 1 commit
  15. 24 Jun, 2021 2 commits
  16. 07 Jun, 2021 2 commits
  17. 04 Jun, 2021 2 commits
  18. 03 Jun, 2021 2 commits
  19. 28 May, 2021 2 commits
  20. 26 May, 2021 2 commits
  21. 24 May, 2021 2 commits
  22. 17 May, 2021 2 commits
  23. 14 May, 2021 2 commits
  24. 13 May, 2021 2 commits
  25. 08 May, 2021 4 commits
  26. 30 Apr, 2021 2 commits