22 Apr, 2019 (2 commits)

- Matthew Carrigan authored
- Matt authored: Pulling commits from main repo

17 Apr, 2019 (14 commits)

- Thomas Wolf authored: Updating network handling
- thomwolf authored (7 commits)
- Thomas Wolf authored: Fix indentation for unconditional generation
- Thomas Wolf authored: Fix gradient overflow issue in the attention mask
- Thomas Wolf authored: [run_gpt2.py] temperature should be a float, not int (see the first sketch after this list)
- Thomas Wolf authored: GPT-2 tokenization (see the second sketch after this list)
- thomwolf authored (2 commits)
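
On the temperature fix: temperature is a continuous scaling factor applied to the logits before sampling, so it has to be parsed as a float; an integer-typed argument cannot represent values such as 0.7. A minimal sketch of the usual pattern, assuming PyTorch (the helper name sample_next_token is hypothetical, not code from run_gpt2.py):

```python
import torch
import torch.nn.functional as F

def sample_next_token(logits, temperature=0.7):
    # Rescale logits by a float temperature: values below 1.0 sharpen
    # the distribution, values above 1.0 flatten it.
    probs = F.softmax(logits / temperature, dim=-1)
    return torch.multinomial(probs, num_samples=1)
```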
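
On GPT-2 tokenization: GPT-2 uses a byte-level BPE vocabulary, so any input string can be encoded without out-of-vocabulary tokens and decoded back losslessly. A usage sketch, assuming the GPT2Tokenizer class shipped in this repository:

```python
from pytorch_pretrained_bert import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')

# Byte-level BPE round-trips arbitrary text through token ids.
ids = tokenizer.encode("Hello world!")
assert tokenizer.decode(ids) == "Hello world!"
```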

16 Apr, 2019 (6 commits)

- Ben Mann authored
- Abhi Sharma authored: This fix is in reference to issue #382. GPT-2 can now be trained in mixed precision, which I've confirmed with testing. I also tested unconditional generation on multiple seeds before and after changing 1e10 to 1e4, and there was no difference (see the first sketch after this list).
- Abhi Sharma authored
- thomwolf authored (2 commits)
- Thomas Wolf authored: Better serialization for Tokenizers and Configuration classes; also fixes #466 (see the second sketch after this list)
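
On the mixed-precision fix (issue #382): fp16 tops out near 65504, so an additive attention mask of -1e10 overflows and can produce inf/nan gradients, while -1e4 is representable and still drives masked positions to roughly zero probability after the softmax. A sketch of the masking pattern (the standalone helper is illustrative, not the repository code):

```python
import torch

def masked_scores(w, b):
    # w: raw attention scores; b: lower-triangular 0/1 causal mask.
    # -1e4 stays finite in fp16 (max ~65504) where -1e10 would not,
    # yet softmax still assigns masked positions ~0 probability.
    return w * b - 1e4 * (1 - b)
```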
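
On the serialization change: the usual round-trip this enables, assuming GPT2Config exposes the same JSON helpers as BertConfig (to_json_string, from_dict, to_dict):

```python
import json

from pytorch_pretrained_bert import GPT2Config

config = GPT2Config()                  # default GPT-2 hyperparameters
payload = config.to_json_string()      # serialize to a JSON string
restored = GPT2Config.from_dict(json.loads(payload))
assert restored.to_dict() == config.to_dict()
```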

15 Apr, 2019 (18 commits)

- thomwolf authored
- Thomas Wolf authored: Clean up GPT and GPT-2 loss computation (see the first sketch after this list)
- thomwolf authored (15 commits)
- Thomas Wolf authored: Making unconditional generation work (see the second sketch after this list)
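
On the loss cleanup: in GPT-style language modeling, the logits at position t predict the token at position t+1, so logits and labels are shifted by one before the cross-entropy. A sketch of the pattern, assuming PyTorch (the standalone lm_loss helper is illustrative):

```python
import torch
from torch.nn import CrossEntropyLoss

def lm_loss(lm_logits, labels):
    # Position t predicts token t+1: drop the last logit and the first
    # label so the tensors align, then flatten for cross-entropy.
    shift_logits = lm_logits[..., :-1, :].contiguous()
    shift_labels = labels[..., 1:].contiguous()
    loss_fct = CrossEntropyLoss(ignore_index=-1)  # -1 marks padding
    return loss_fct(shift_logits.view(-1, shift_logits.size(-1)),
                    shift_labels.view(-1))
```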
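
On unconditional generation: the model is seeded with the <|endoftext|> token and sampled autoregressively. A sketch of the loop, assuming the GPT2LMHeadModel/GPT2Tokenizer API of this repository (which returns logits plus a past cache when no labels are given):

```python
import torch
import torch.nn.functional as F
from pytorch_pretrained_bert import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
model.eval()

# Seed with <|endoftext|> and sample token by token, reusing the cache.
context = torch.tensor([[tokenizer.encoder['<|endoftext|>']]])
past, generated = None, []
with torch.no_grad():
    for _ in range(40):
        logits, past = model(context, past=past)
        probs = F.softmax(logits[:, -1, :], dim=-1)
        context = torch.multinomial(probs, num_samples=1)
        generated.append(context.item())
print(tokenizer.decode(generated))
```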