"examples/pytorch/text-generation/run_generation.py" did not exist on "e57d00ee108595375504eb21c230ce35428aae5e"
  1. 25 Jan, 2020 1 commit
  2. 07 Jan, 2020 1 commit
  3. 19 Dec, 2019 1 commit
  4. 18 Dec, 2019 1 commit
  5. 17 Dec, 2019 2 commits
  6. 16 Dec, 2019 1 commit
  7. 14 Dec, 2019 2 commits
  8. 12 Dec, 2019 2 commits
  9. 10 Dec, 2019 1 commit
  10. 03 Dec, 2019 1 commit
  11. 02 Dec, 2019 1 commit
  12. 29 Nov, 2019 1 commit
  13. 27 Nov, 2019 2 commits
  14. 25 Nov, 2019 2 commits
  15. 22 Nov, 2019 1 commit
  16. 21 Nov, 2019 1 commit
  17. 18 Nov, 2019 1 commit
  18. 14 Nov, 2019 1 commit
  19. 11 Nov, 2019 1 commit
  20. 29 Oct, 2019 1 commit
  21. 24 Oct, 2019 2 commits
  22. 11 Oct, 2019 3 commits
  23. 10 Oct, 2019 2 commits
  24. 09 Oct, 2019 1 commit
    • Simplify LayerNorm mixed precision logic. · 0257b276
      Reed Wanderman-Milne authored
      Instead of needing to ensure variables are float32, cast inputs to float32, and so on, dtype="float32" is now passed to the layer constructor, which handles all of that logic automatically.
      
      The only difference is that the output of LayerNorm is now float32 instead of float16, so an extra cast is needed elsewhere.
      
      PiperOrigin-RevId: 273833286
      (A minimal sketch of this LayerNorm dtype pattern follows the list below.)
  25. 07 Oct, 2019 1 commit
  26. 18 Sep, 2019 1 commit
  27. 17 Sep, 2019 1 commit
  28. 16 Sep, 2019 1 commit
  29. 09 Sep, 2019 2 commits
  30. 05 Sep, 2019 1 commit
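The commit of 09 Oct, 2019 above describes keeping LayerNorm in float32 while the rest of the model runs in float16. Below is a minimal, hypothetical sketch of that pattern using the public tf.keras API; it is not the code touched by commit 0257b276. The SubLayer class, the layer sizes, and the use of tf.keras.mixed_precision.set_global_policy (the TF 2.4+ name for the mixed-precision policy API) are illustrative assumptions.

import tensorflow as tf

# Run the model in mixed precision (float16 compute, float32 variables).
# Modern TF 2.4+ API name; older releases used tf.keras.mixed_precision.experimental.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

class SubLayer(tf.keras.layers.Layer):
    """Toy sublayer illustrating the pattern described in the commit message."""

    def __init__(self, hidden_size, **kwargs):
        super().__init__(**kwargs)
        self.dense = tf.keras.layers.Dense(hidden_size)
        # dtype="float32" makes Keras keep the LayerNorm variables in float32
        # and cast its inputs to float32, replacing hand-written casts.
        self.layer_norm = tf.keras.layers.LayerNormalization(dtype="float32")

    def call(self, x):
        y = self.dense(x)            # runs in float16 under mixed_float16
        y = self.layer_norm(x + y)   # LayerNorm output is now float32
        # The "extra cast elsewhere" from the commit: hand the activations
        # back to the float16 parts of the model.
        return tf.cast(y, tf.float16)

layer = SubLayer(hidden_size=8)
out = layer(tf.random.uniform([2, 4, 8]))
print(out.dtype)  # float16

Passing dtype="float32" to the constructor lets the Keras base Layer create float32 variables and up-cast the float16 inputs automatically, which is the simplification the commit describes; the final tf.cast corresponds to the extra cast mentioned in the commit message.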