1. 09 Jul, 2021 2 commits
    • Internal change · 1437baae
      Dan Kondratyuk authored
      PiperOrigin-RevId: 383856955
    • With float16, always use LossScaleOptimizer. · be3575f5
      Reed Wanderman-Milne authored
      Before, it was too easy to forget to set runtime.loss_scale, which always had to be set when mixed precision was used; otherwise the model converged to worse accuracy. Now, all that is needed to use mixed precision is to set runtime.mixed_precision_dtype=float16.

      PiperOrigin-RevId: 383767033
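      The commit above means a float16 loss-scaling optimizer is now applied automatically whenever the mixed-precision dtype is set. A minimal sketch of the resulting config, assuming the experiment-config YAML layout used by the TF Model Garden (the surrounding keys are illustrative):

      ```yaml
      # After this change, setting the dtype alone is enough; a
      # LossScaleOptimizer is wrapped in automatically for float16.
      runtime:
        mixed_precision_dtype: float16
        # Before this change, forgetting the companion loss_scale field
        # silently degraded accuracy; it no longer needs to be set.
        # loss_scale: dynamic
      ```
      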
  2. 08 Jul, 2021 3 commits
  3. 07 Jul, 2021 2 commits
  4. 05 Jul, 2021 1 commit
  5. 04 Jul, 2021 1 commit
  6. 03 Jul, 2021 1 commit
  7. 02 Jul, 2021 2 commits
  8. 01 Jul, 2021 7 commits
  9. 30 Jun, 2021 4 commits
  10. 29 Jun, 2021 2 commits
  11. 28 Jun, 2021 1 commit
  12. 25 Jun, 2021 5 commits
  13. 24 Jun, 2021 8 commits
  14. 23 Jun, 2021 1 commit