- 16 Apr, 2019 1 commit
Abhi Sharma authored
This fix is in reference to issue #382. GPT2 can now be trained in mixed precision, which I've confirmed with testing. I also tested unconditional generation on multiple seeds before and after changing 1e10 to 1e4 and there was no difference. Please let me know if there is anything else I can do to make this pull request better. Thanks for all your work!
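For context, the 1e10 → 1e4 change concerns the large constant used to push masked-out positions of the attention logits toward negative infinity before the softmax. float16 tops out around 65504, so 1e10 is not representable in half precision and corrupts the attention scores during mixed-precision training, while 1e4 stays finite and is still large enough to drive masked probabilities to zero. A minimal sketch, assuming the causal mask is applied roughly as `w * b - c * (1 - b)`; the helper name below is illustrative, not the repository's actual code:

```python
import torch

def apply_causal_mask(w, b, mask_value=1e4):
    # w: raw attention logits; b: lower-triangular mask (1 = keep, 0 = block).
    # Subtracting a large constant from the blocked positions makes them
    # vanish after softmax; 1e4 is small enough to stay inside float16 range.
    return w * b - mask_value * (1 - b)

# float16 cannot represent the old constant, which is why fp16 training broke:
print(torch.tensor(1e10, dtype=torch.float16))  # tensor(inf, dtype=torch.float16)
print(torch.tensor(1e4, dtype=torch.float16))   # tensor(10000., dtype=torch.float16)

# The masked softmax still behaves as expected with the smaller constant:
n = 4
w = torch.randn(1, 1, n, n)
b = torch.tril(torch.ones(n, n)).view(1, 1, n, n)
probs = torch.softmax(apply_causal_mask(w, b), dim=-1)
print(probs)  # probabilities for future (masked) positions are ~0
```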
- 15 Apr, 2019 3 commits
- 27 Mar, 2019 1 commit
Catalin Voss authored
- 24 Mar, 2019 5 commits
Catalin Voss authored
Catalin Voss authored
Catalin Voss authored
Catalin Voss authored
Catalin Voss authored
- 14 Mar, 2019 1 commit
thomwolf authored
- 06 Mar, 2019 1 commit
thomwolf authored
- 23 Feb, 2019 1 commit
Joel Grus authored
- 22 Feb, 2019 1 commit
Joel Grus authored
- 18 Feb, 2019 2 commits
- 17 Feb, 2019 3 commits
- 09 Feb, 2019 1 commit
thomwolf authored
- 08 Feb, 2019 3 commits
- 07 Feb, 2019 1 commit
thomwolf authored
- 05 Feb, 2019 1 commit
thomwolf authored
- 29 Jan, 2019 3 commits
- 28 Jan, 2019 3 commits
- 15 Jan, 2019 1 commit
thomwolf authored
- 08 Jan, 2019 4 commits
- 07 Jan, 2019 1 commit
thomwolf authored