"examples/run_bert_classifier.py" did not exist on "a3a3180c86f63ee7af9d5a2a6ce1c05a8a3385b4"
- 24 Mar, 2019 5 commits
  Catalin Voss authored
  Catalin Voss authored
  Catalin Voss authored
  Catalin Voss authored
  Catalin Voss authored
- 14 Mar, 2019 1 commit
  thomwolf authored
- 09 Mar, 2019 1 commit
  Bharat Raghunathan authored
- 07 Mar, 2019 2 commits
  Haozhe Ji authored
  Philipp Glock authored
- 06 Mar, 2019 1 commit
  thomwolf authored
- 05 Mar, 2019 1 commit
  Catalin Voss authored
- 04 Mar, 2019 1 commit
  Aaron Mangum authored
- 03 Mar, 2019 1 commit
  Catalin Voss authored
    For many applications requiring randomized data access, it's easier to cache the tokenized representations than the words. So why not turn this into a warning?
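The commit message above argues for caching tokenized data; below is a minimal sketch of that idea, assuming the pytorch-pretrained-bert BertTokenizer. The dataset class, sample texts, and max_len value are hypothetical and not the repository's actual code.

```python
# Illustrative sketch only: tokenize each example once, keep only the integer
# ids, and let random access be a plain list lookup with no re-tokenization.
import torch
from torch.utils.data import Dataset
from pytorch_pretrained_bert import BertTokenizer

class CachedTokenDataset(Dataset):  # hypothetical class, not from the repo
    def __init__(self, texts, tokenizer, max_len=128):
        # Tokenize every example exactly once and cache the token ids.
        self.cached_ids = []
        for text in texts:
            tokens = ["[CLS]"] + tokenizer.tokenize(text)[: max_len - 2] + ["[SEP]"]
            self.cached_ids.append(tokenizer.convert_tokens_to_ids(tokens))

    def __len__(self):
        return len(self.cached_ids)

    def __getitem__(self, idx):
        # Randomized access during training no longer re-tokenizes anything.
        return torch.tensor(self.cached_ids[idx], dtype=torch.long)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
dataset = CachedTokenDataset(["first example sentence", "second example sentence"], tokenizer)
print(dataset[1])
```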
- 27 Feb, 2019 9 commits
  John Hewitt authored
  lukovnikov authored
  lukovnikov authored
  lukovnikov authored
  lukovnikov authored
  lukovnikov authored
  lukovnikov authored
  lukovnikov authored
  John Hewitt authored
- 26 Feb, 2019 2 commits
  lukovnikov authored
    fix for negative learning rate with warmup_linear in BertAdam (happens when t_total is specified incorrectly) + copied BERT optimization warmup functions to OpenAI optimization file + added comments
  lukovnikov authored
    fix for negative learning rate with warmup_linear in BertAdam (happens when t_total is specified incorrectly) + copied BERT optimization warmup functions to OpenAI optimization file + added comments
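The two entries above refer to the warmup_linear schedule used by BertAdam in pytorch-pretrained-bert; the sketch below shows how a mis-specified t_total drives the learning rate negative, and a clamped variant along the lines of the fix. The clamped function is illustrative, not the library's exact code.

```python
# In pytorch-pretrained-bert, warmup_linear takes the training progress
# x = global_step / t_total and returns x / warmup during warmup, then 1.0 - x
# afterwards. If t_total is set too small, x grows past 1.0 and the multiplier,
# and hence the learning rate applied by BertAdam, becomes negative.
def warmup_linear_old(x, warmup=0.002):
    if x < warmup:
        return x / warmup
    return 1.0 - x                 # goes negative once x > 1.0

def warmup_linear_clamped(x, warmup=0.002):  # assumed fix, for illustration
    if x < warmup:
        return x / warmup
    return max(1.0 - x, 0.0)       # never scales the learning rate below zero

lr, t_total, global_step = 5e-5, 1000, 1200   # t_total specified too small
x = global_step / t_total                     # progress is already 1.2
print(lr * warmup_linear_old(x))              # negative learning rate
print(lr * warmup_linear_clamped(x))          # 0.0 instead
```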
- 23 Feb, 2019 1 commit
  Joel Grus authored
- 22 Feb, 2019 1 commit
  Joel Grus authored
- 20 Feb, 2019 2 commits
  Yongbo Wang authored
  Yongbo Wang authored
- 18 Feb, 2019 6 commits
- 17 Feb, 2019 4 commits
- 16 Feb, 2019 1 commit
  Dan Hendrycks authored
- 13 Feb, 2019 1 commit
  thomwolf authored