- 05 Feb, 2019 1 commit

thomwolf authored
- 19 Jan, 2019 1 commit

liangtaiwan authored
- 16 Jan, 2019 2 commits

Thomas Wolf authored: (very) minor update to README

Davide Fiocco authored
- 14 Jan, 2019 5 commits

Thomas Wolf authored: Fix importing unofficial TF models

Thomas Wolf authored: lm_finetuning compatibility with Python 3.5

Thomas Wolf authored: Fix documentation (missing backslashes)

Thomas Wolf authored: [bug fix] args.do_lower_case is always True

nhatchan authored: Importing unofficial TF models seems to be working well, at least for me. This PR resolves #50.
- 13 Jan, 2019 3 commits

nhatchan authored: This PR adds the missing backslashes in the LM Fine-tuning subsection of README.md.

nhatchan authored: dicts are not ordered in Python 3.5 or earlier, which is the cause of #175. This PR replaces one with a list to preserve its order.
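A minimal, self-contained sketch of the underlying issue (illustrative names only, not the repository's actual code): on Python 3.5 and earlier a plain dict does not guarantee insertion order, so any id assignment derived from dict iteration can change between runs, while a list keeps the order stable.

```python
tokens = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "the", "dog"]

# Fragile on Python <= 3.5: iteration order of a plain dict is unspecified,
# so the ids below may differ between interpreter runs or versions.
vocab_dict = dict.fromkeys(tokens)
ids_from_dict = {tok: i for i, tok in enumerate(vocab_dict)}

# Order-preserving alternative: keep the tokens in a list and enumerate it.
vocab_list = list(tokens)
ids_from_list = {tok: i for i, tok in enumerate(vocab_list)}

print(ids_from_list["the"])  # always 4, on every Python version
```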
Li Dong authored: The `default=True` makes `args.do_lower_case` always True:

```python
parser.add_argument("--do_lower_case", default=True, action='store_true')
```
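A minimal sketch of the fix (relying on standard argparse behaviour; not necessarily the exact patch that was merged): with `action='store_true'` the default is already False, so the explicit `default=True` should simply be dropped.

```python
import argparse

parser = argparse.ArgumentParser()
# Buggy: default=True combined with store_true means the flag can never be False.
# parser.add_argument("--do_lower_case", default=True, action='store_true')

# Fixed: store_true defaults to False, so the flag is True only when passed.
parser.add_argument("--do_lower_case", action='store_true')

print(parser.parse_args([]).do_lower_case)                   # False
print(parser.parse_args(["--do_lower_case"]).do_lower_case)  # True
```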
- 11 Jan, 2019 2 commits

Thomas Wolf authored: add do_lower_case arg and adjust model saving for lm finetuning.

tholor authored
- 10 Jan, 2019 3 commits

Thomas Wolf authored: Added SQuAD 2.0

Thomas Wolf authored: Fix it to run properly even without the `--do_train` param.

Sang-Kil Park authored: It was modified to be similar to `run_classifier.py`, and fixed to run properly even without the `--do_train` param.
- 09 Jan, 2019 1 commit

Thomas Wolf authored: Never split some texts.
- 08 Jan, 2019 3 commits

- 07 Jan, 2019 11 commits

thomwolf authored

thomwolf authored

Thomas Wolf authored: LayerNorm initialization

Thomas Wolf authored: Fix error when the `bert_model` param is a path or URL.

Thomas Wolf authored: Allow do_eval to be used without do_train and to use the pretrained model in the output folder

Thomas Wolf authored: Adding the new pretrained model to the help of the `bert_model` argument.

Thomas Wolf authored: Correct the wrong note

Thomas Wolf authored: loading saved model when n_classes != 2

Thomas Wolf authored: Fixing various class documentations.

Thomas Wolf authored: Add example for fine-tuning the BERT language model

Li Dong authored: The LayerNorm gamma and beta should be initialized by `.fill_(1.0)` and `.zero_()`. Reference links: https://github.com/tensorflow/tensorflow/blob/989e78c412a7e0f5361d4d7dfdfb230c8136e749/tensorflow/contrib/layers/python/layers/layers.py#L2298 and https://github.com/tensorflow/tensorflow/blob/989e78c412a7e0f5361d4d7dfdfb230c8136e749/tensorflow/contrib/layers/python/layers/layers.py#L2308
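A minimal sketch of that initialization (using `torch.nn.LayerNorm` for illustration; the repository defines its own `BertLayerNorm`, which is not reproduced here):

```python
import torch
from torch import nn

hidden_size = 768
layer_norm = nn.LayerNorm(hidden_size)

# gamma (weight) -> 1.0 and beta (bias) -> 0.0, matching the TensorFlow defaults
# referenced above. nn.LayerNorm already uses these values by default; they are
# shown explicitly here to mirror the fix.
layer_norm.weight.data.fill_(1.0)
layer_norm.bias.data.zero_()

x = torch.randn(2, hidden_size)
print(layer_norm(x).shape)  # torch.Size([2, 768])
```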
- 05 Jan, 2019 1 commit

Sang-Kil Park authored: An error occurs when the `bert_model` param is a path or URL. Therefore, if it is a path, use only its last component to prevent the error.
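A minimal sketch of the idea (illustrative only, not the exact patch; the path is hypothetical): keep just the last component when `bert_model` points to a local directory.

```python
import os

bert_model = "/home/user/models/bert-base-uncased"  # hypothetical local path
model_name = os.path.basename(bert_model.rstrip("/"))
print(model_name)  # bert-base-uncased
```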
- 03 Jan, 2019 4 commits

Jade Abbott authored

Jade Abbott authored

Jade Abbott authored

Jade Abbott authored

- 02 Jan, 2019 1 commit

Grégory Châtel authored
- 22 Dec, 2018 1 commit

wlhgtc authored

- 20 Dec, 2018 1 commit

Jasdeep Singh authored: Required to fix the "Assertion `t >= 0 && t < n_classes` failed" error when your number of classes is not 2.
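A minimal sketch of matching the classifier head to a non-binary label set (assuming the pytorch_pretrained_bert API of that era; the value of num_labels is illustrative and must equal the number of classes used at training time):

```python
from pytorch_pretrained_bert import BertForSequenceClassification

num_labels = 5  # illustrative; must match the fine-tuned checkpoint
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=num_labels
)
print(model.classifier.out_features)  # 5
```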