- 06 Dec, 2019 1 commit
-
-
patrickvonplaten authored
-
- 05 Dec, 2019 17 commits
-
-
VictorSanh authored
-
VictorSanh authored
-
Rosanne Liu authored
* license
* changes
* ok
* Update paper link and commands to run
* pointer to uber repo
-
Thomas Wolf authored
Remove dead code in tests.
-
thomwolf authored
-
Thomas Wolf authored
Fixing CamemBERT tokenization
-
thomwolf authored
-
thomwolf authored
-
Thomas Wolf authored
Typo fix in the docs as per PyTorch v1.1+
-
Thomas Wolf authored
Fix XLNet attention output for both attention streams whenever target_mapping is provided
-
thomwolf authored
-
Thomas Wolf authored
Do not use GPU when importing transformers
-
thomwolf authored
-
thomwolf authored
-
Thomas Wolf authored
XLNet for token classification
-
Thomas Wolf authored
CLI for authenticated file sharing
-
VictorSanh authored
-
- 04 Dec, 2019 6 commits
-
-
Julien Chaumond authored
-
Julien Chaumond authored
-
thomwolf authored
-
Thomas Wolf authored
Fix summary_type value of SequenceSummary
-
Aymeric Augustin authored
-
Julien Chaumond authored
-
- 03 Dec, 2019 16 commits
-
-
Julien Chaumond authored
-
LysandreJik authored
-
VictorSanh authored
-
Julien Chaumond authored
Co-Authored-By: Piero Molino <w4nderlust@gmail.com>
-
Ethan Perez authored
When evaluating, shouldn't we always use the SequentialSampler instead of DistributedSampler? Evaluation only runs on 1 GPU no matter what, so if you use the DistributedSampler with N GPUs, I think you'll only evaluate on 1/N of the evaluation set. That's at least what I'm finding when I run an older/modified version of this repo.
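The sharding behavior described above can be sketched with PyTorch's stock samplers. This is an illustrative toy setup, not code from the repo: the 100-element dataset and the 4-replica configuration are assumptions chosen to make the split visible.

```python
import torch
from torch.utils.data import SequentialSampler, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Toy dataset of 100 examples (illustrative only).
dataset = TensorDataset(torch.arange(100))

# SequentialSampler yields every index, in order.
full = list(SequentialSampler(dataset))
assert len(full) == 100

# DistributedSampler with num_replicas=4 gives each rank only its
# ~1/4 shard of the indices; rank 0 alone sees 25 of 100 examples.
shard = list(DistributedSampler(dataset, num_replicas=4, rank=0))
assert len(shard) == 25
```

Because `DistributedSampler` shards indices across ranks, an evaluation loop that only runs on one process would score just 1/N of the evaluation set in an N-replica setup, which is exactly the concern raised here; `SequentialSampler` avoids that.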
-
Julien Chaumond authored
-
Julien Chaumond authored
Co-Authored-By: Rosanne Liu <mimosavvy@gmail.com>
-
Julien Chaumond authored
-
Piero Molino authored
-
w4nderlust authored
-
w4nderlust authored
-
w4nderlust authored
Improvements: model_path renamed to pretrained_model; tokenizer loaded from pretrained_model; pretrained_model set to the discriminator's when discrim is specified; sample = False by default, with a CLI parameter introduced. To obtain identical samples, call the CLI with --sample.
-
w4nderlust authored
-
piero authored
-
piero authored
-
piero authored
-