- 21 Apr, 2019 1 commit
  - Sangwhan Moon authored
- 16 Apr, 2019 2 commits
  - Ben Mann authored
  - Abhi Sharma authored
- 15 Apr, 2019 7 commits
- 12 Apr, 2019 1 commit
  - Matthew Carrigan authored
    error for users whose corpus is just one giant document.
- 11 Apr, 2019 2 commits
- 09 Apr, 2019 2 commits
  - Yaroslav Bulatov authored
    Fix for:

    ```
    04/09/2019 21:39:38 - INFO - __main__ - device: cuda n_gpu: 1, distributed training: False, 16-bits training: False
    Traceback (most recent call last):
      File "/home/ubuntu/pytorch-pretrained-BERT/examples/lm_finetuning/simple_lm_finetuning.py", line 642, in <module>
        main()
      File "/home/ubuntu/pytorch-pretrained-BERT/examples/lm_finetuning/simple_lm_finetuning.py", line 502, in main
        raise ValueError("Training is currently the only implemented execution option. Please set `do_train`.")
    ValueError: Training is currently the only implemented execution option. Please set `do_train`.
    ```
  - Benjamin Mann authored
- 07 Apr, 2019 2 commits
  - Dhanajit Brahma authored
  - dhanajitb authored
    This line has been updated:

    ```diff
    - while not args.unconditional:
    + if not args.unconditional:
    ```
- 03 Apr, 2019 1 commit
  - thomwolf authored
- 01 Apr, 2019 1 commit
  - Mike Arpaia authored
- 30 Mar, 2019 2 commits
  - Weixin Wang authored
    Modify 'unambigiously' to 'unambiguously'
  - jeonsworld authored
    If randint returns the value of rand_end itself, searchsorted returns a sampled_doc_index equal to current_idx. Example:

    ```
    cumsum_max  = 30
    doc_cumsum  = [ 5  7 11 19 30]
    doc_lengths = [5, 2, 4, 8, 11]
    ```

    With current_idx = 1: rand_start = 7, rand_end = 35, and sentence_index = randint(7, 35) % cumsum_max. If randint returns 35, sentence_index becomes 5, and np.searchsorted then returns 1, which equals current_idx.
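A runnable sketch of the off-by-one this commit message describes. The values come from the message itself; the `side="right"` argument to `searchsorted` is an assumption inferred from the described behavior, not quoted from the repo:

```python
import numpy as np

# Values taken from the commit message above.
cumsum_max = 30
doc_cumsum = np.array([5, 7, 11, 19, 30])
current_idx = 1
rand_start, rand_end = 7, 35

# If randint's upper bound is inclusive and it returns rand_end itself,
# the modulo wraps the index back into the current document's range.
sentence_index = rand_end % cumsum_max  # 35 % 30 == 5

# Assumed side="right", which makes searchsorted map index 5 to document 1.
sampled_doc_index = np.searchsorted(doc_cumsum, sentence_index, side="right")
print(sampled_doc_index == current_idx)  # True: the "random other" doc is the current one
```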
- 28 Mar, 2019 1 commit
  - dhanajitb authored
    Unconditional generation works now, but with a fixed seed the sample is the same every time; n_samples > 1 still gives different samples within a run. The start token for unconditional generation is '<|endoftext|>'.
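A minimal sketch of the fixed-seed behavior noted above, using a toy flat distribution rather than the GPT-2 model itself:

```python
import torch

def draw_samples(seed: int, n_samples: int):
    # Re-seeding puts the RNG in an identical state, so the whole
    # sequence of draws is reproduced exactly on every run.
    torch.manual_seed(seed)
    probs = torch.ones(100)  # illustrative flat distribution over a toy vocab
    return [torch.multinomial(probs, 5).tolist() for _ in range(n_samples)]

first = draw_samples(seed=42, n_samples=2)
second = draw_samples(seed=42, n_samples=2)
print(first == second)  # True: a fixed seed reproduces the same samples
```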
- 27 Mar, 2019 2 commits
- 25 Mar, 2019 2 commits
  - Matthew Carrigan authored
  - Matthew Carrigan authored
- 21 Mar, 2019 10 commits
  - Matthew Carrigan authored
    order.
  - Matthew Carrigan authored
    data on disc as a memmap rather than in memory
  - Matthew Carrigan authored
    data on disc as a memmap rather than in memory
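A hedged sketch of the memmap approach these commits describe; the file name, dtype, and shapes are illustrative, not taken from the repo:

```python
import numpy as np

# Illustrative sizes, not from the repo.
num_samples, seq_len = 1_000, 128

# Back the token array with a file on disk instead of holding it in RAM.
tokens = np.memmap("training_shard.memmap", dtype=np.int32,
                   mode="w+", shape=(num_samples, seq_len))
tokens[0] = np.arange(seq_len)  # writes go through to the file
tokens.flush()

# Reopen read-only; the OS pages rows in lazily as they are accessed.
readonly = np.memmap("training_shard.memmap", dtype=np.int32,
                     mode="r", shape=(num_samples, seq_len))
print(readonly[0][:5])  # [0 1 2 3 4]
```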
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
    out on the fly without shuffling - the Sampler in the finetuning script will shuffle for us.
  - Matthew Carrigan authored
    out on the fly without shuffling - the Sampler in the finetuning script will shuffle for us.
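These commits write examples out in order and leave shuffling to the DataLoader's sampler; a minimal sketch of that division of labor, with illustrative dataset contents:

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Data stored in its on-disk order; nothing is shuffled at write time.
data = torch.arange(10).unsqueeze(1)
dataset = TensorDataset(data)

# RandomSampler permutes indices at iteration time instead.
loader = DataLoader(dataset, batch_size=2, sampler=RandomSampler(dataset))
seen = [int(x) for (batch,) in loader for x in batch.squeeze(1)]
print(sorted(seen))  # [0, 1, ..., 9]: every example is still visited exactly once
```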
  - Matthew Carrigan authored
  - Yuqiang Xie authored
    Maybe better.
- 20 Mar, 2019 4 commits
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored