1. 09 Jul, 2021 1 commit
    • With float16, always use LossScaleOptimizer. · be3575f5
      Reed Wanderman-Milne authored
      Previously, it was too easy to forget to set runtime.loss_scale, which always had to be done when mixed precision was used; otherwise the model would converge to worse accuracy. Now, all that needs to be done to use mixed precision is to set runtime.mixed_precision_dtype=float16 (a minimal sketch of the resulting behavior follows this entry).
      
      PiperOrigin-RevId: 383767033
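      A minimal sketch of the behavior this commit describes, assuming the public tf.keras.mixed_precision API; build_optimizer is a hypothetical helper for illustration, not the Model Garden's actual code:

          import tensorflow as tf

          def build_optimizer(mixed_precision_dtype=None):
              # Plain optimizer; wrapped below whenever float16 is requested.
              optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
              if mixed_precision_dtype == 'float16':
                  # Dynamic loss scaling guards float16 gradients against underflow,
                  # so loss scaling can no longer be forgotten when mixed precision is on.
                  optimizer = tf.keras.mixed_precision.LossScaleOptimizer(optimizer)
              return optimizer

          tf.keras.mixed_precision.set_global_policy('mixed_float16')
          opt = build_optimizer(mixed_precision_dtype='float16')

      Wrapping the optimizer unconditionally whenever the dtype is float16 removes the silent failure mode the commit message describes, where training runs but converges to worse accuracy.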
  2. 08 Jul, 2021 3 commits
  3. 07 Jul, 2021 2 commits
  4. 05 Jul, 2021 1 commit
  5. 04 Jul, 2021 1 commit
  6. 03 Jul, 2021 1 commit
  7. 02 Jul, 2021 2 commits
  8. 01 Jul, 2021 7 commits
  9. 30 Jun, 2021 4 commits
  10. 29 Jun, 2021 2 commits
  11. 28 Jun, 2021 1 commit
  12. 25 Jun, 2021 5 commits
  13. 24 Jun, 2021 8 commits
  14. 23 Jun, 2021 2 commits
    • Internal change · 4c99ab71
      Vincent Dumoulin authored
      PiperOrigin-RevId: 381089283
    • Improve error message when certain flags are not specified. · 8b47c484
      Reed Wanderman-Milne authored
      In nlp/train.py and vision/beta/train.py, certain flags are now marked as required. Additionally, certain functions now raise clearer error messages when a necessary flag is not specified, as a fallback in case a file calling define_flags() does not mark the necessary flags as required. Previously, if any of these flags was not specified, training would crash with a cryptic error message, making it hard to tell what went wrong (a minimal sketch of the pattern follows this entry).
      
      In a subsequent change, I will mark flags as required in more files which call define_flags().
      
      PiperOrigin-RevId: 381066985
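      A minimal sketch of the required-flag pattern, assuming the standard absl.flags API; the flag names below are illustrative, not necessarily the ones these trainers define:

          from absl import app, flags

          FLAGS = flags.FLAGS
          flags.DEFINE_string('experiment', None, 'Experiment configuration to run.')
          flags.DEFINE_string('model_dir', None, 'Directory for checkpoints and logs.')

          # Marking flags as required makes absl fail fast with a clear message
          # naming the missing flag, instead of crashing later with a cryptic error.
          flags.mark_flags_as_required(['experiment', 'model_dir'])

          def main(argv):
              # Fallback check for entry points that skip mark_flags_as_required.
              if FLAGS.experiment is None:
                  raise ValueError('The flag --experiment must be specified.')
              print('Running experiment:', FLAGS.experiment)

          if __name__ == '__main__':
              app.run(main)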