09 Oct, 2019 2 commits
    • Simplify LayerNorm mixed precision logic. · 0257b276
      Reed Wanderman-Milne authored
      Instead of manually ensuring variables are float32, casting inputs to float32, and so on, dtype="float32" is now passed to the layer constructor, which handles all of that logic automatically.
      
      The only difference is that the output of LayerNorm is now float32 instead of float16, so an extra cast is needed elsewhere (see the sketch after this entry).
      
      PiperOrigin-RevId: 273833286
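      A minimal sketch of the pattern this commit describes, using the public tf.keras.layers.LayerNormalization API; the tensor shapes and variable names here are illustrative, not taken from the actual BERT code:

      ```python
      import tensorflow as tf

      # A float16 activation, as produced by surrounding mixed precision layers.
      x = tf.random.normal([8, 128, 768], dtype=tf.float16)

      # Passing dtype="float32" to the constructor makes Keras keep the layer's
      # variables in float32 and cast its float16 inputs to float32
      # automatically, replacing the manual casting logic this commit removes.
      layer_norm = tf.keras.layers.LayerNormalization(dtype="float32")
      y = layer_norm(x)
      print(y.dtype)  # float32: the layer now outputs float32, not float16

      # Hence the extra cast needed elsewhere: downstream float16
      # computation expects a float16 tensor again.
      y = tf.cast(y, tf.float16)
      ```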
    • Fix BERT benchmark test. · f5e2211f
      A. Unique TensorFlower authored
      PiperOrigin-RevId: 273795511