1. 10 Sep, 2021 1 commit
  2. 27 Aug, 2021 1 commit
  3. 23 Aug, 2021 1 commit
  4. 12 Aug, 2021 1 commit
  5. 11 Aug, 2021 1 commit
  6. 06 Aug, 2021 1 commit
  7. 05 Aug, 2021 1 commit
  8. 03 Aug, 2021 1 commit
  9. 02 Aug, 2021 1 commit
  10. 01 Aug, 2021 1 commit
  11. 24 Jul, 2021 1 commit
  12. 09 Jul, 2021 1 commit
    • With float16, always use LossScaleOptimizer. · 21286f77
      Reed Wanderman-Milne authored
      Before, it was too easy to forget to set runtime.loss_scale, which always had to be set when mixed precision was used; otherwise the model would converge to worse accuracy. Now, all that is needed to use mixed precision is to set runtime.mixed_precision_dtype=float16.
      
      PiperOrigin-RevId: 383767033
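      As a rough illustration (not the Model Garden source), the behavior described above corresponds to the standard Keras mixed precision pattern: once a float16 policy is in effect, the optimizer is always wrapped in a LossScaleOptimizer with dynamic loss scaling, so no separate loss_scale setting is needed. Only the field name runtime.mixed_precision_dtype comes from the commit message; the surrounding wiring below is an assumption.

      import tensorflow as tf

      # Hypothetical sketch of the pattern; stands in for the Model Garden's own helper.
      mixed_precision_dtype = "float16"  # corresponds to runtime.mixed_precision_dtype

      if mixed_precision_dtype == "float16":
          # Compute in float16 while keeping variables in float32.
          tf.keras.mixed_precision.set_global_policy("mixed_float16")

      optimizer = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)

      if mixed_precision_dtype == "float16":
          # Always wrap the optimizer: dynamic loss scaling keeps small float16
          # gradients from underflowing, with no explicit loss_scale required.
          optimizer = tf.keras.mixed_precision.LossScaleOptimizer(optimizer)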
  13. 05 Jul, 2021 1 commit
  14. 01 Jul, 2021 1 commit
  15. 30 Jun, 2021 1 commit
  16. 24 Jun, 2021 2 commits
  17. 07 Jun, 2021 1 commit
  18. 04 Jun, 2021 1 commit
  19. 03 Jun, 2021 1 commit
  20. 28 May, 2021 1 commit
  21. 26 May, 2021 1 commit
  22. 24 May, 2021 1 commit
  23. 17 May, 2021 1 commit
  24. 14 May, 2021 1 commit
  25. 13 May, 2021 1 commit
  26. 08 May, 2021 2 commits
  27. 30 Apr, 2021 2 commits
  28. 26 Apr, 2021 1 commit
  29. 21 Apr, 2021 1 commit
  30. 20 Apr, 2021 1 commit
  31. 14 Apr, 2021 2 commits
  32. 12 Apr, 2021 1 commit
    • Use nonexperimental mixed precision API for official models. · 0d8f9807
      Reed Wanderman-Milne authored
      For all modified calls to set_mixed_precision_policy(), the loss_scale argument was removed, as it cannot be passed if the nonexperimental API is used. For all such callers, the loss_scale is later used to explicitly create a LossScaleOptimizer, so removing the argument has no impact.
      
      Switching to the non-experimental LossScaleOptimizer has no effect, as it has near-identical behavior, and all isinstance checks within the official models check for the non-experimental version.
      
      PiperOrigin-RevId: 368101975
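      A hedged before/after sketch of the API change described above, written against the public Keras symbols (the Model Garden's own set_mixed_precision_policy() wrapper is not reproduced here): the non-experimental policy no longer accepts a loss_scale argument, so loss scaling is configured by wrapping the optimizer explicitly.

      import tensorflow as tf

      # Old, experimental API (loss_scale was part of the policy):
      #   policy = tf.keras.mixed_precision.experimental.Policy("mixed_float16", loss_scale="dynamic")
      #   tf.keras.mixed_precision.experimental.set_policy(policy)

      # New, non-experimental API: the policy carries no loss_scale, so the
      # LossScaleOptimizer is created explicitly (dynamic loss scaling by default).
      tf.keras.mixed_precision.set_global_policy("mixed_float16")
      optimizer = tf.keras.mixed_precision.LossScaleOptimizer(
          tf.keras.optimizers.Adam(learning_rate=1e-3))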
  33. 06 Apr, 2021 3 commits
  34. 30 Mar, 2021 1 commit