- 16 Aug, 2019 1 commit
  - wangfei authored
- 18 Jul, 2019 2 commits
- 17 Jul, 2019 3 commits
- 14 Jul, 2019 1 commit
  - thomwolf authored
- 05 Jul, 2019 1 commit
  - thomwolf authored
- 03 Jul, 2019 1 commit
  - thomwolf authored
- 26 Jun, 2019 1 commit
  - Mayhul Arora authored
- 22 Jun, 2019 1 commit
  - Rocketknight1 authored
- 11 Jun, 2019 1 commit
  - Oliver Guhr authored
- 10 Jun, 2019 1 commit
  - jeonsworld authored
    Apply the Whole Word Masking technique; based on [create_pretraining_data.py](https://github.com/google-research/bert/blob/master/create_pretraining_data.py).
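Whole Word Masking selects masking candidates at the level of whole words, so every WordPiece piece of a chosen word is masked together rather than independently. A minimal sketch of the grouping step, assuming BERT-style `##` continuation markers (an illustration, not the repository's actual implementation):

```python
import random

def whole_word_mask_indices(tokens, mask_prob=0.15):
    """Group WordPiece tokens into whole words, then pick whole words to mask."""
    # A token starting with "##" continues the previous word, so it joins that group.
    cand_indexes = []
    for i, token in enumerate(tokens):
        if token in ("[CLS]", "[SEP]"):
            continue
        if cand_indexes and token.startswith("##"):
            cand_indexes[-1].append(i)
        else:
            cand_indexes.append([i])

    random.shuffle(cand_indexes)
    num_to_mask = max(1, int(round(len(tokens) * mask_prob)))

    masked = set()
    for word in cand_indexes:
        if len(masked) >= num_to_mask:
            break
        # Mask every sub-token of the chosen word together.
        masked.update(word)
    return sorted(masked)

tokens = ["[CLS]", "un", "##break", "##able", "glass", "[SEP]"]
print(whole_word_mask_indices(tokens))
```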
- 27 May, 2019 1 commit
  - Ahmad Barqawi authored
    Fix an issue with bert-base-multilingual and add support for the uncased multilingual model.
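For reference, a usage sketch of the uncased multilingual weights this commit adds; it assumes the `pytorch_pretrained_bert` package and shortcut model names of that era, so treat the exact identifiers as illustrative:

```python
from pytorch_pretrained_bert import BertModel, BertTokenizer

# "bert-base-multilingual-uncased" is the uncased multilingual checkpoint;
# do_lower_case=True keeps tokenization consistent with the uncased vocabulary.
tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-uncased", do_lower_case=True)
model = BertModel.from_pretrained("bert-base-multilingual-uncased")
```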
- 09 May, 2019 2 commits
  - burcturkoglu authored
  - burcturkoglu authored
- 02 May, 2019 1 commit
  - MottoX authored
- 23 Apr, 2019 1 commit
  - thomwolf authored
- 22 Apr, 2019 1 commit
  - Matthew Carrigan authored
- 12 Apr, 2019 1 commit
  - Matthew Carrigan authored
    error for users whose corpus is just one giant document.
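The fine-tuning data pipeline samples "random" sentences from other documents for next-sentence prediction, so a corpus that is one giant document cannot produce negative examples; presumably that is what the error above guards against. A hedged sketch of such a check, assuming documents are separated by blank lines (the function name and message are illustrative, not the script's):

```python
def load_documents(corpus_path):
    """Read a corpus where documents are separated by blank lines."""
    docs, current = [], []
    with open(corpus_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                current.append(line)
            elif current:
                docs.append(current)
                current = []
    if current:
        docs.append(current)
    if len(docs) <= 1:
        # Next-sentence prediction needs a second document to sample "random" sentences from.
        raise ValueError("The corpus contains only one document; multiple documents "
                         "(separated by blank lines) are required.")
    return docs
```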
- 11 Apr, 2019 1 commit
  - thomwolf authored
- 09 Apr, 2019 1 commit
  - Yaroslav Bulatov authored
    Fix for:
    ```
    04/09/2019 21:39:38 - INFO - __main__ - device: cuda n_gpu: 1, distributed training: False, 16-bits training: False
    Traceback (most recent call last):
      File "/home/ubuntu/pytorch-pretrained-BERT/examples/lm_finetuning/simple_lm_finetuning.py", line 642, in <module>
        main()
      File "/home/ubuntu/pytorch-pretrained-BERT/examples/lm_finetuning/simple_lm_finetuning.py", line 502, in main
        raise ValueError("Training is currently the only implemented execution option. Please set `do_train`.")
    ValueError: Training is currently the only implemented execution option. Please set `do_train`.
    ```
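The traceback comes from an argument guard in `simple_lm_finetuning.py`: training is the only implemented mode, so the script raises unless `--do_train` is passed. A minimal sketch of that kind of guard (not the script's exact code):

```python
import argparse

parser = argparse.ArgumentParser()
# The script only implements training, so this flag is effectively required.
parser.add_argument("--do_train", action="store_true",
                    help="Run language-model fine-tuning (the only supported mode).")
args = parser.parse_args()

if not args.do_train:
    # Mirrors the ValueError raised when --do_train is omitted.
    raise ValueError("Training is currently the only implemented execution option. "
                     "Please set `do_train`.")
```

Invoking the script with `--do_train` (plus its other required arguments) avoids the error.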
- 30 Mar, 2019 1 commit
  - jeonsworld authored
    If randint returns the value of rand_end, np.searchsorted returns a sampled_doc_index equal to current_idx.
    Example: doc_lengths = [5, 2, 4, 8, 11], doc_cumsum = [5, 7, 11, 19, 30], cumsum_max = 30.
    With current_idx = 1: rand_start = 7, rand_end = 35, and sentence_index = randint(7, 35) % cumsum_max.
    If randint returns 35, sentence_index becomes 5, and np.searchsorted then returns 1, which equals current_idx.
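A small, runnable illustration of the off-by-one described above, using the same numbers. Python's `random.randint` is inclusive of its upper bound, which is the root cause; `side="right"` is assumed for `np.searchsorted` here to reproduce the index reported in the message:

```python
import numpy as np

doc_lengths = [5, 2, 4, 8, 11]
doc_cumsum = np.cumsum(doc_lengths)   # array([ 5,  7, 11, 19, 30])
cumsum_max = int(doc_cumsum[-1])      # 30

current_idx = 1
rand_start = int(doc_cumsum[current_idx])                        # 7
rand_end = rand_start + cumsum_max - doc_lengths[current_idx]    # 35

# random.randint(rand_start, rand_end) can return rand_end itself; take that worst case:
sentence_index = rand_end % cumsum_max                           # 35 % 30 == 5
sampled_doc_index = int(np.searchsorted(doc_cumsum, sentence_index, side="right"))
print(sampled_doc_index == current_idx)  # True: the "other" document is the current one

# Excluding the upper bound (e.g. random.randrange(rand_start, rand_end)) avoids this.
```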
- 27 Mar, 2019 2 commits
- 25 Mar, 2019 2 commits
  - Matthew Carrigan authored
  - Matthew Carrigan authored
- 21 Mar, 2019 9 commits
  - Matthew Carrigan authored
    order.
  - Matthew Carrigan authored
    data on disc as a memmap rather than in memory
  - Matthew Carrigan authored
    data on disc as a memmap rather than in memory
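The memmap commits above keep the pregenerated training data on disk rather than in RAM. A minimal numpy sketch of the general technique; the file name, dtype, and shapes are illustrative, not the script's actual layout:

```python
import numpy as np

num_examples, seq_len = 10_000, 128

# Write: create a disk-backed array and fill it incrementally instead of holding
# everything in memory.
input_ids = np.memmap("input_ids.memmap", dtype=np.int32, mode="w+",
                      shape=(num_examples, seq_len))
for i in range(num_examples):
    input_ids[i] = 0  # in the real script this would be a tokenized training example
input_ids.flush()

# Read: re-open the same file lazily; slices are paged in from disk on access.
input_ids = np.memmap("input_ids.memmap", dtype=np.int32, mode="r",
                      shape=(num_examples, seq_len))
print(input_ids[42].shape)  # (128,)
```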
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
    out on the fly without shuffling - the Sampler in the finetuning script will shuffle for us.
  - Matthew Carrigan authored
    out on the fly without shuffling - the Sampler in the finetuning script will shuffle for us.
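These commits write examples out in generation order and leave shuffling to the DataLoader at training time. A hedged PyTorch sketch of that pattern; the dataset class here is a stand-in, not the repository's:

```python
import torch
from torch.utils.data import DataLoader, Dataset, RandomSampler

class PregeneratedDataset(Dataset):
    """Stand-in for a dataset whose examples were written to disk in order."""
    def __init__(self, num_examples=1000, seq_len=128):
        self.data = torch.zeros(num_examples, seq_len, dtype=torch.long)

    def __len__(self):
        return self.data.size(0)

    def __getitem__(self, idx):
        return self.data[idx]

dataset = PregeneratedDataset()
# No shuffling when the data was generated; RandomSampler randomizes order at load time.
loader = DataLoader(dataset, sampler=RandomSampler(dataset), batch_size=32)
for batch in loader:
    pass  # training step would go here
```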
  - Matthew Carrigan authored
- 20 Mar, 2019 4 commits
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored
  - Matthew Carrigan authored