"INSTALL/vscode:/vscode.git/clone" did not exist on "fd0d335eb69950ceaed69adeac72064987cd79b9"
- 06 May, 2020 1 commit
Clement authored
- 30 Apr, 2020 1 commit
Jared T Nielsen authored
- 23 Apr, 2020 1 commit
Clement authored
- 22 Apr, 2020 1 commit
Julien Chaumond authored
* doc
* [tests] Add sample files for a regression task
* [HUGE] Trainer
* Feedback from @sshleifer
* Feedback from @thomwolf + logging tweak
* [file_utils] when downloading concurrently, get_from_cache will use the cached file for subsequent processes
* [glue] Use default max_seq_length of 128 like before
* [glue] move DataTrainingArguments around
* [ner] Change interface of InputExample, and align run_{tf,pl}
* Re-align the pl scripts a little bit
* ner
* [ner] Add integration test
* Fix language_modeling with API tweak
* [ci] Tweak loss target
* Don't break console output
* amp.initialize: model must be on right device before
* [multiple-choice] update for Trainer
* Re-align to 827d6d6e
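For context on the "[HUGE] Trainer" item: a minimal sketch of the Trainer API, assuming a recent transformers release. The distilbert-base-uncased checkpoint, the toy two-sentence dataset, and the hyperparameters below are illustrative, not part of this commit.

```python
import torch
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")

# Toy two-example dataset, wrapped as the torch Dataset that Trainer expects.
texts, labels = ["I loved it", "I hated it"], [1, 0]
encodings = tokenizer(texts, truncation=True, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

    def __len__(self):
        return len(self.labels)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="trainer_out", num_train_epochs=1),
    train_dataset=ToyDataset(encodings, labels),
)
trainer.train()
```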
- 18 Apr, 2020 1 commit
Patrick von Platen authored
- 16 Apr, 2020 1 commit
Patrick von Platen authored
* add dialoGPT
* update README.md
* fix conflict
* update readme
* add code links to docs
* Update README.md
* Update dialo_gpt2.rst
* Update pretrained_models.rst
* Update docs/source/model_doc/dialo_gpt2.rst
  Co-Authored-By: Julien Chaumond <chaumond@gmail.com>
* change filename of dialogpt

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
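A minimal sketch of talking to the DialoGPT checkpoints this commit documents, assuming a recent transformers release; the microsoft/DialoGPT-small checkpoint name and the generation settings are illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# DialoGPT separates dialogue turns with the EOS token.
input_ids = tokenizer.encode(
    "Hello, how are you?" + tokenizer.eos_token, return_tensors="pt"
)
reply_ids = model.generate(
    input_ids, max_length=50, pad_token_id=tokenizer.eos_token_id
)
# Decode only the newly generated tokens, i.e. the model's reply.
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```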
- 10 Apr, 2020 1 commit
Julien Chaumond authored
- 08 Apr, 2020 1 commit
Julien Chaumond authored
- 03 Apr, 2020 1 commit
Lysandre Debut authored
* Electra wip
* helpers
* Electra wip
* Electra v1
* ELECTRA may be saved/loaded
* Generator & Discriminator
* Embedding size instead of halving the hidden size
* ELECTRA Tokenizer
* Revert BERT helpers
* ELECTRA Conversion script
* Archive maps
* PyTorch tests
* Start fixing tests
* Tests pass
* Same configuration for both models
* Compatible with base + large
* Simplification + weight tying
* Archives
* Auto + Renaming to standard names
* ELECTRA is uncased
* Tests
* Slight API changes
* Update tests
* wip
* ElectraForTokenClassification
* temp
* Simpler arch + tests
  Removed ElectraForPreTraining which will be in a script
* Conversion script
* Auto model
* Update links to S3
* Split ElectraForPreTraining and ElectraForTokenClassification
* Actually test PreTraining model
* Remove num_labels from configuration
* wip
* wip
* From discriminator and generator to electra
* Slight API changes
* Better naming
* TensorFlow ELECTRA tests
* Accurate conversion script
* Added to conversion script
* Fast ELECTRA tokenizer
* Style
* Add ELECTRA to README
* Modeling Pytorch Doc + Real style
* TF Docs
* Docs
* Correct links
* Correct model initialized
* random fixes
* style
* Addressing Patrick's and Sam's comments
* Correct links in docs
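A minimal sketch of the discriminator half of the new ELECTRA classes, assuming a recent transformers release; the google/electra-small-discriminator checkpoint and the example sentence are illustrative.

```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizer

tokenizer = ElectraTokenizer.from_pretrained("google/electra-small-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")

# The discriminator scores each token: original, or replaced by the generator?
inputs = tokenizer("The quick brown fox jumps over the lazy dog", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch_size, sequence_length)

print(torch.round(torch.sigmoid(logits)))  # 1.0 marks a token predicted as replaced
```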
- 17 Mar, 2020 1 commit
Thomas Wolf authored
* memory benchmark rss
* have both forward pass and line-by-line mem tracing
* cleaned up tracing
* refactored and cleaning up API
* no f-strings yet...
* add GPU mem logging
* fix GPU memory monitoring
* style and quality
* clean up and doc
* update with comments
* Switching to python 3.6+
* fix quality
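The tracing utilities this commit adds are not reproduced here; as a rough sketch of the line-by-line idea using only the standard library, tracemalloc attributes live Python allocations to the source lines that made them.

```python
import tracemalloc

tracemalloc.start()

buffers = [bytes(1024) for _ in range(10_000)]  # allocate roughly 10 MB

# Group live allocations by source line and print the top offenders.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)  # e.g. "trace_demo.py:5: size=10.0 MiB, count=10001"

tracemalloc.stop()
```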
- 12 Mar, 2020 1 commit
Sam Shleifer authored
- 10 Mar, 2020 2 commits
Julien Chaumond authored
Co-Authored-By: Thomas Wolf <thomwolf@users.noreply.github.com>
Julien Chaumond authored
- 20 Feb, 2020 1 commit
Santiago Castro authored
- 19 Feb, 2020 1 commit
Lysandre authored
- 07 Feb, 2020 1 commit
VictorSanh authored
- 06 Feb, 2020 1 commit
Clement authored
powered by https://github.com/sourcerer-io/hall-of-fame
- 05 Feb, 2020 1 commit
Julien Chaumond authored
cc @lysandrejik @thomwolf
- 31 Jan, 2020 2 commits
- 30 Jan, 2020 2 commits
Julien Chaumond authored
* fill_mask helper
* [poc] FillMaskPipeline
* Revert "[poc] FillMaskPipeline"
  This reverts commit 67eeea55b0f97b46c2b828de0f4ee97d87338335.
* Revert "fill_mask helper"
  This reverts commit cacc17b884e14bb6b07989110ffe884ad9e36eaa.
* README: clarify that Pipelines can also do text-classification
  cf. question at the AI&ML meetup last week, @mfuntowicz
* Fix test: test feature-extraction pipeline
* Test tweaks
* Slight refactor of existing pipeline (in preparation of new FillMaskPipeline)
* Extraneous doc
* More robust way of doing this @mfuntowicz as we don't rely on the model name anymore (see AutoConfig)
* Also add RobertaConfig as a quickfix for wrong token_type_ids
* cs
* [BIG] FillMaskPipeline
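A minimal sketch of the new fill-mask pipeline, assuming a recent transformers release; the distilroberta-base checkpoint is illustrative, and the mask token (<mask> here) depends on the tokenizer.

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="distilroberta-base")

# Returns the top candidates for the masked position, highest score first.
for prediction in fill_mask("The capital of France is <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```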
Hang Le authored
- 23 Jan, 2020 1 commit
Julien Chaumond authored
- 06 Jan, 2020 2 commits
alberduris authored
alberduris authored
- 05 Jan, 2020 2 commits
Julien Chaumond authored
Clement authored
- 28 Dec, 2019 1 commit
Julien Chaumond authored
- 27 Dec, 2019 1 commit
Aymeric Augustin authored
This ensures compatibility with zsh. Fix #2316.
- 24 Dec, 2019 1 commit
Aymeric Augustin authored
Use -e only in docs targeted at contributors. If a user copy-pastes a command line with [--editable], they will hit an error. If they don't know the --editable option, we're giving them a choice to make before they can move forward, but this isn't a choice they need to make right now.
- 23 Dec, 2019 1 commit
Aymeric Augustin authored
Also provide shortcuts in a Makefile.
- 22 Dec, 2019 5 commits
Aymeric Augustin authored
Aymeric Augustin authored
Aymeric Augustin authored
Aymeric Augustin authored
Aymeric Augustin authored
- 20 Dec, 2019 3 commits
Lysandre authored
thomwolf authored
Morgan Funtowicz authored
- 18 Dec, 2019 1 commit
Stefan Schweter authored