- 15 Oct, 2020 4 commits
-
-
Lysandre authored
-
Nicolas Patry authored
- Add TFAutoModelForCausalLM, TFAutoModelForMaskedLM and TFAutoModelForSeq2SeqLM, as per the deprecation warning. No tests, as the change simply removes current warnings from the tests.
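A minimal sketch of loading models through these task-specific TF auto classes; the checkpoint names below are illustrative.

```python
# Load one model per task through the task-specific TF auto classes.
from transformers import (
    TFAutoModelForCausalLM,
    TFAutoModelForMaskedLM,
    TFAutoModelForSeq2SeqLM,
)

causal_lm = TFAutoModelForCausalLM.from_pretrained("gpt2")               # left-to-right LM
masked_lm = TFAutoModelForMaskedLM.from_pretrained("bert-base-uncased")  # masked LM
seq2seq_lm = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")         # encoder-decoder LM
```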
-
Sylvain Gugger authored
-
Nicolas Patry authored
* Improving Pipelines by defaulting to framework='tf' when PyTorch seems unavailable. * Actually changing the default resolution order to account for model defaults. Adding a new test for each pipeline to check that pipeline(task) also works without manually specifying the framework.
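A minimal sketch of what the new resolution order enables: building a pipeline from the task name alone and letting the framework be picked from what is installed and from the model's default.

```python
from transformers import pipeline

# No framework= argument: the pipeline resolves PyTorch vs. TensorFlow automatically.
classifier = pipeline("sentiment-analysis")
print(classifier("Pipelines now pick the framework for me."))
```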
-
- 14 Oct, 2020 12 commits
-
-
Julien Plu authored
* Remove wrong parameter. * Same in Longformer
-
Nils Reimers authored
* Create README.md * Update model_cards/sentence-transformers/LaBSE/README.md Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
sarahlintang authored
* Create README.md * Update model_cards/sarahlintang/IndoBERT/README.md Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
Julien Chaumond authored
-
Zhuosheng Zhang authored
-
Sagor Sarker authored
-
Sylvain Gugger authored
* Don't use `store_xxx` on optional bools * Refine test * Refine test
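A sketch of the idea behind the change, using plain argparse rather than the library's HfArgumentParser: an optional bool needs an explicit-value argument, because a `store_true` action can never receive an explicit False (or stay None) from the command line.

```python
import argparse

def str_to_bool(value: str) -> bool:
    # Accept common spellings of truthy values; anything else counts as False.
    return value.lower() in ("yes", "true", "t", "1")

parser = argparse.ArgumentParser()
# Three-state option: None (unset), True, or False, e.g. --use_feature false
parser.add_argument("--use_feature", type=str_to_bool, default=None)

args = parser.parse_args(["--use_feature", "false"])
print(args.use_feature)  # False, which a store_true flag could not express
```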
-
Sylvain Gugger authored
* Add eval_accumulation_step and clean distributed eval * Add TPU test * Add TPU stuff * Fix arg name * Fix Seq2SeqTrainer * Fix total_size * Update src/transformers/trainer_pt_utils.py Co-authored-by: Lysandre Debut <lysandre@huggingface.co> * Doc and add test to TPU * Add unit test * Adapt name Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
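A minimal sketch of the new option, shown here under the assumption that the final TrainingArguments field is named eval_accumulation_steps: gathered predictions are moved from the device to the CPU every N evaluation steps instead of being held on the accelerator for the whole pass.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",
    per_device_eval_batch_size=8,
    eval_accumulation_steps=20,  # offload accumulated predictions to CPU every 20 eval steps
)
# trainer = Trainer(model=model, args=training_args, eval_dataset=eval_dataset)
# metrics = trainer.evaluate()
```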
-
Sam Shleifer authored
-
XiaoqiJiao authored
-
Jonathan Chang authored
* Add support for gpt2 batch inferencing * add test * remove typo Co-authored-by: patrickvonplaten <patrick.v.platen@gmail.com>
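A minimal sketch of batched GPT-2 generation; left padding plus an attention mask is what makes batching work, since GPT-2 has no pad token of its own. The prompts and lengths are illustrative.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.padding_side = "left"            # pad on the left so generation continues the prompt
tokenizer.pad_token = tokenizer.eos_token  # reuse EOS as the padding token

model = GPT2LMHeadModel.from_pretrained("gpt2", pad_token_id=tokenizer.eos_token_id)

inputs = tokenizer(["Hello, my dog", "Today the weather is"], return_tensors="pt", padding=True)
outputs = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_length=20,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```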
-
Quentin Lhoest authored
* fix bert position ids in DPR convert script * style
-
- 13 Oct, 2020 10 commits
-
-
Sylvain Gugger authored
-
Sam Shleifer authored
-
François Lagunas authored
* Adding optional trial argument to model_init Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
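A minimal sketch of a model_init that uses the new trial argument, assuming optuna as the hyperparameter-search backend; the dropout search space is purely illustrative.

```python
from transformers import AutoConfig, AutoModelForSequenceClassification

def model_init(trial=None):
    config = AutoConfig.from_pretrained("bert-base-uncased", num_labels=2)
    if trial is not None:  # an optuna.Trial during hyperparameter_search, None otherwise
        config.hidden_dropout_prob = trial.suggest_float("hidden_dropout_prob", 0.0, 0.3)
    return AutoModelForSequenceClassification.from_config(config)

# trainer = Trainer(model_init=model_init, args=training_args, train_dataset=train_dataset)
# best_run = trainer.hyperparameter_search(direction="minimize", n_trials=10)
```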
-
Tiger authored
-
Noam Wies authored
* use DDP no_sync when possible * fix is_nlp_available addition mistake * reformat trainer.py * reformat trainer.py * drop support for pytorch < 1.2 * return support for pytorch < 1.2
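A sketch of the DDP no_sync pattern in plain PyTorch, not the Trainer's exact code: skip the gradient all-reduce on intermediate accumulation steps and only synchronize on the step that will actually call optimizer.step().

```python
import contextlib

def backward_with_optional_sync(ddp_model, loss, will_step_optimizer):
    # DistributedDataParallel.no_sync() suppresses gradient synchronization for this
    # backward pass; the buffered gradients are reduced on the next synchronized step.
    context = contextlib.nullcontext() if will_step_optimizer else ddp_model.no_sync()
    with context:
        loss.backward()
```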
-
Lysandre Debut authored
* Do not softmax when num_labels==1 * Update src/transformers/pipelines.py Co-authored-by: Funtowicz Morgan <mfuntowicz@users.noreply.github.com>
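A sketch of the post-processing idea, not the pipeline's exact code: softmax over a single logit always yields 1.0, so a one-label, regression-style head should return the raw score instead.

```python
import numpy as np

def postprocess(logits: np.ndarray, num_labels: int) -> np.ndarray:
    if num_labels == 1:
        return logits  # regression-style output: keep the raw value
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)  # standard softmax over the labels
```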
-
Patrick von Platen authored
* fix rag * Update tokenizer save_pretrained Co-authored-by: Thomas Wolf <thomwolf@users.noreply.github.com>
-
Patrick von Platen authored
Putting my name on a couple more issues to directly redirect them to me
-
Felipe Curti authored
* Add Documentation for GPT-1 Classification * Add GPT-1 with Classification head * Add tests for GPT-1 Classification * Add GPT-1 For Classification to auto models * Remove authorized missing keys, change checkpoint to openai-gpt
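A minimal sketch of using the new classification head; the class name OpenAIGPTForSequenceClassification follows the library's naming pattern and is assumed here, as is the two-label setup.

```python
from transformers import OpenAIGPTForSequenceClassification, OpenAIGPTTokenizer

tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
model = OpenAIGPTForSequenceClassification.from_pretrained("openai-gpt", num_labels=2)

inputs = tokenizer("a charming and often affecting journey", return_tensors="pt")
outputs = model(**inputs)
logits = outputs[0]  # indexing works for both tuple and ModelOutput returns
print(logits.argmax(dim=-1))
```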
-
Lysandre Debut authored
-
- 12 Oct, 2020 11 commits
-
-
Sam Shleifer authored
-
Alex Combessie authored
-
Lysandre Debut authored
-
Julien Plu authored
* Fix test * fix generic text classification * fix test * Fix tests
-
sgugger authored
-
Jonathan Chang authored
Fix a bug that happens when subclassing Trainer and overriding evaluate() without calling prediction_loop()
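A sketch of the subclassing pattern the fix covers; the subclass and metric below are illustrative. The point is that evaluate() can be overridden without ever calling prediction_loop(), so the Trainer must not assume prediction_loop() has run and filled its internal state.

```python
from transformers import Trainer

class PerplexityTrainer(Trainer):
    def evaluate(self, eval_dataset=None, **kwargs):
        # Custom evaluation that never goes through prediction_loop().
        eval_dataset = eval_dataset if eval_dataset is not None else self.eval_dataset
        metrics = {"eval_perplexity": self._compute_perplexity(eval_dataset)}
        self.log(metrics)
        return metrics

    def _compute_perplexity(self, eval_dataset):
        # Placeholder for a user-defined metric computed without prediction_loop().
        raise NotImplementedError
```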
-
Kelvin authored
Very often, splitting large files into smaller files can prevent the tokenizer from going out of memory in environments like Colab that do not have swap memory.
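A sketch of the suggested workaround: split one large text file into smaller chunks so the tokenizer never has to hold the whole corpus in memory at once. The chunk size and file names are illustrative.

```python
def split_file(path, lines_per_chunk=100_000):
    with open(path, encoding="utf-8") as source:
        chunk, index = [], 0
        for line in source:
            chunk.append(line)
            if len(chunk) >= lines_per_chunk:
                with open(f"{path}.part{index}", "w", encoding="utf-8") as out:
                    out.writelines(chunk)
                chunk, index = [], index + 1
        if chunk:  # write the final partial chunk
            with open(f"{path}.part{index}", "w", encoding="utf-8") as out:
                out.writelines(chunk)

# split_file("train.txt")
```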
-
AndreaSottana authored
Minor spelling corrections in docstrings. "information" is uncountable in English and has no plural.
-
fteufel authored
Added is_torch_tpu_available() to the condition for saving a model as an XLA model. The "xla_device" property of the config can also be True on a non-XLA device, when loading a checkpoint that was trained on XLA before. Resolves #7695
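A sketch of the idea behind the fix, not the Trainer's exact code; the import path for the helper is assumed to be transformers.file_utils, where it lived at the time.

```python
from transformers.file_utils import is_torch_tpu_available

def should_save_as_xla(config) -> bool:
    # config.xla_device can still be True after loading a checkpoint that was trained
    # on XLA elsewhere, so also require an actual TPU runtime before taking the XLA path.
    return bool(getattr(config, "xla_device", False)) and is_torch_tpu_available()
```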
-
Sylvain Gugger authored
-
Berowne authored
Replace 'men_len' with 'mem_len' to match the parameter name
-
- 11 Oct, 2020 3 commits
-
-
Miguel Victor authored
-
Sam Shleifer authored
-
Alexandr Maslov authored
-