1. 13 Apr, 2021 2 commits
    • Internal change · b1aa44d9
      A. Unique TensorFlower authored
      PiperOrigin-RevId: 368260712
    • Use nonexperimental mixed precision API. · 4334a892
      Reed Wanderman-Milne authored
      This replaces symbols in tf.keras.mixed_precision.experimental with the corresponding nonexperimental symbols. In some cases, passing a Policy is replaced with passing a policy name for conciseness.
      
      Additionally, for the Shakespeare model, the loss_scale flag is removed, since supporting it with the nonexperimental API is slightly more verbose and it is recommended that users use the default loss scale.
      
      PiperOrigin-RevId: 368123944
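The migration described in the commit above can be sketched as follows (a minimal sketch, assuming TensorFlow 2.4 or later, where the nonexperimental `tf.keras.mixed_precision` API is available):

```python
import tensorflow as tf

# Old experimental API (since removed):
#   policy = tf.keras.mixed_precision.experimental.Policy("mixed_float16")
#   tf.keras.mixed_precision.experimental.set_policy(policy)

# Nonexperimental API: a policy name can be passed directly for conciseness.
tf.keras.mixed_precision.set_global_policy("mixed_float16")
assert tf.keras.mixed_precision.global_policy().name == "mixed_float16"

tf.keras.mixed_precision.set_global_policy("float32")  # restore the default
```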
  2. 12 Apr, 2021 1 commit
    • Use nonexperimental mixed precision API for official models. · 0d8f9807
      Reed Wanderman-Milne authored
      For all modified calls to set_mixed_precision_policy(), the loss_scale argument was removed, as it cannot be passed if the nonexperimental API is used. For all such callers, the loss_scale is later used to explicitly create a LossScaleOptimizer, so removing the argument has no impact.
      
      Switching to the nonexperimental LossScaleOptimizer has no effect, as it has near-identical behavior, and all isinstance checks within the official models check for the nonexperimental version.
      
      PiperOrigin-RevId: 368101975
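A minimal sketch of the pattern this commit describes (`set_mixed_precision_policy` is an official-models helper not shown here; the plain Keras calls below assume TensorFlow 2.4+):

```python
import tensorflow as tf

# The loss_scale argument is no longer passed when setting the policy...
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# ...instead, loss scaling is configured by explicitly wrapping the optimizer.
optimizer = tf.keras.mixed_precision.LossScaleOptimizer(tf.keras.optimizers.SGD())

# isinstance checks against the nonexperimental class still pass.
assert isinstance(optimizer, tf.keras.mixed_precision.LossScaleOptimizer)

tf.keras.mixed_precision.set_global_policy("float32")  # restore the default
```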
  3. 09 Apr, 2021 1 commit
    • Remove dynamic_loss_scale argument to define_performance. · e353e4e5
      Reed Wanderman-Milne authored
      All models which support loss scaling support dynamic loss scaling, so the argument has no purpose. It used to be that some models scaled the loss manually instead of using a LossScaleOptimizer, and so did not support dynamic loss scaling.
      
      PiperOrigin-RevId: 367719521
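To illustrate why manual loss scaling precluded dynamic loss scaling, here is a hypothetical sketch of the old manual approach (the names are illustrative, not taken from the models themselves):

```python
# Manual loss scaling uses one fixed multiplier chosen up front; nothing can
# adjust it at run time, which is why models that scaled the loss manually
# could not support dynamic loss scaling. A LossScaleOptimizer instead
# grows and shrinks the scale automatically during training.
LOSS_SCALE = 128.0  # fixed power of two

def scale_loss(loss):
    """Multiply the loss so small fp16 gradients do not underflow."""
    return loss * LOSS_SCALE

def unscale_gradients(grads):
    """Undo the scaling before the gradients are applied."""
    return [g / LOSS_SCALE for g in grads]

# Scaling then unscaling round-trips exactly for powers of two.
assert unscale_gradients([scale_loss(3.0)]) == [3.0]
```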
  4. 08 Apr, 2021 1 commit
  5. 07 Apr, 2021 1 commit
  6. 06 Apr, 2021 2 commits
    • Clarify the deprecation warning for bert/ readme. · 8ccc242c
      Hongkun Yu authored
      PiperOrigin-RevId: 367101911
    • Disable temperature scaling during training. · fab47e9e
      Jeremiah Liu authored
      For the `GaussianProcessClassificationHead`, the temperature scaling needs to be disabled during training to avoid unexpected modification to the learning rate, which harms model quality. (Unfortunately, this seems to require adding `training` to the `call` method).
      
      Also set the default of `gp_cov_ridge_penalty` in `RandomFeatureGaussianProcess` to 1 to be consistent with that in the `GaussianProcessClassificationHead`.
      
      PiperOrigin-RevId: 366917075
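The training-time behavior described above can be sketched with a generic Keras layer; `TemperatureScaledHead` and its `temperature` argument are illustrative stand-ins, not the actual `GaussianProcessClassificationHead` API:

```python
import tensorflow as tf

class TemperatureScaledHead(tf.keras.layers.Layer):
    """Illustrative stand-in for the pattern in the commit: divide logits by
    a temperature at inference time only."""

    def __init__(self, temperature=1.5, **kwargs):
        super().__init__(**kwargs)
        self.temperature = temperature

    def call(self, logits, training=False):
        # During training, return the raw logits so the temperature does not
        # implicitly rescale gradients (and hence the effective learning rate).
        if training:
            return logits
        return logits / self.temperature
```

Threading the `training` flag through `call` is what lets Keras toggle this behavior automatically in `fit` versus `predict`.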
  7. 05 Apr, 2021 2 commits
  8. 01 Apr, 2021 5 commits
  9. 29 Mar, 2021 1 commit
  10. 26 Mar, 2021 3 commits
  11. 25 Mar, 2021 1 commit
  12. 24 Mar, 2021 1 commit
  13. 23 Mar, 2021 1 commit
  14. 22 Mar, 2021 4 commits
  15. 20 Mar, 2021 2 commits
  16. 18 Mar, 2021 1 commit
  17. 17 Mar, 2021 1 commit
  18. 16 Mar, 2021 3 commits
  19. 14 Mar, 2021 1 commit
  20. 12 Mar, 2021 3 commits
  21. 10 Mar, 2021 1 commit
  22. 09 Mar, 2021 1 commit
  23. 08 Mar, 2021 1 commit