1. 01 Jun, 2019 2 commits
  2. 16 Apr, 2019 1 commit
    • Fix gradient overflow issue during attention mask · 9e666aaa
      Abhi Sharma authored
      This fix addresses issue #382. GPT-2 can now be trained in mixed precision, which I've confirmed with testing. I also tested unconditional generation on multiple seeds before and after changing 1e10 to 1e4, and there was no difference (a minimal sketch of the change appears after this commit list). Please let me know if there is anything else I can do to improve this pull request. Thanks for all your work!
  3. 15 Apr, 2019 3 commits
  4. 27 Mar, 2019 1 commit
  5. 24 Mar, 2019 5 commits
  6. 14 Mar, 2019 1 commit
  7. 06 Mar, 2019 1 commit
  8. 23 Feb, 2019 1 commit
  9. 22 Feb, 2019 1 commit
  10. 18 Feb, 2019 2 commits
  11. 17 Feb, 2019 3 commits
  12. 09 Feb, 2019 1 commit
  13. 08 Feb, 2019 3 commits
  14. 07 Feb, 2019 1 commit
  15. 05 Feb, 2019 1 commit
  16. 29 Jan, 2019 3 commits
  17. 28 Jan, 2019 3 commits
  18. 15 Jan, 2019 1 commit
  19. 08 Jan, 2019 4 commits
  20. 07 Jan, 2019 1 commit
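
As context for commit 9e666aaa above: the attention mask fills future positions with a large negative number before the softmax. In fp16 the largest finite value is about 65504, so a fill magnitude of 1e10 overflows to inf and corrupts gradients under mixed precision, while 1e4 is still effectively minus infinity after softmax yet stays representable in half precision. The snippet below is a minimal illustrative sketch of that masking pattern under those assumptions, not the repository's exact code; masked_attention_scores and its arguments are hypothetical names.

import torch

def masked_attention_scores(scores, causal_mask):
    # Push masked (future) positions toward -inf before softmax.
    # 1e10 overflows fp16 (max finite value ~65504) and produces inf/nan
    # gradients under mixed precision; 1e4 is representable in fp16 and is
    # still large enough to drive masked positions to ~zero after softmax.
    mask_value = 1e4  # was 1e10 before this commit
    return scores * causal_mask - mask_value * (1 - causal_mask)

# Example in half precision with a 3x3 causal (lower-triangular) mask.
scores = torch.randn(3, 3).half()
causal_mask = torch.tril(torch.ones(3, 3)).half()
probs = torch.softmax(masked_attention_scores(scores, causal_mask), dim=-1)
# Future positions receive ~zero probability and nothing overflows in fp16.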