- 04 Feb, 2020 4 commits
- 03 Feb, 2020 5 commits
-
-
Julien Chaumond authored
cc @mfuntowicz does this seem correct?
-
Lysandre authored
-
Lysandre authored
Masked indices should have -1 and not -100. Updating documentation + scripts that were forgotten
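The sentinel value matters because loss computations exclude positions whose label equals a designated ignore index; labels built with the wrong sentinel are silently included in the loss. A toy sketch of the mechanism (pure Python, made-up numbers; the concrete sentinel must match what the loss function expects):

```python
# Hypothetical sketch: positions whose label equals the ignore index are
# dropped from the loss. If labels use one sentinel (-1) but the loss
# expects another (-100), nothing gets masked out.
def active_loss_terms(labels, per_token_loss, ignore_index):
    # keep only loss terms whose label is not the sentinel
    return [l for lab, l in zip(labels, per_token_loss) if lab != ignore_index]

labels = [7, -100, 3, -100]
losses = [0.5, 1.0, 0.25, 2.0]
print(active_loss_terms(labels, losses, -100))  # [0.5, 0.25]
print(active_loss_terms(labels, losses, -1))    # all four terms survive
```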
-
Martin Malmsten authored
-
Julien Plu authored
-
- 01 Feb, 2020 2 commits
-
-
Antonio Carlos Falcão Petri authored
"%s-%d".format() -> "{}-{}".format() -
Bram Vanroy authored
* add "info" command to CLI As a convenience, add the info directive to CLI. Running `python transformers-cli info` will return a string containing the transformers version, platform, python version, PT/TF version and GPU support * Swap f-strings for .format Still supporting 3.5 so can't use f-strings (sad face) * Add reference in issue to CLI * Add the expected fields to issue template This way, people can still add the information manually if they want. (Though I fear they'll just ignore it.) * Remove heading from output * black-ify * order of imports Should ensure isort test passes * use is_X_available over import..pass * style * fix copy-paste bug * Rename command info -> env Also adds the command to CONTRIBUTING.md in "Did you find a bug" section
-
- 31 Jan, 2020 18 commits
-
-
Julien Chaumond authored
Co-Authored-By: Stefan Schweter <stefan-it@users.noreply.github.com>
-
Julien Chaumond authored
Co-Authored-By: HenrykBorzymowski <henrykborzymowski@users.noreply.github.com>
-
Julien Chaumond authored
Co-Authored-By: Loreto Parisi <loretoparisi@gmail.com>
Co-Authored-By: Simone Francia <francia.simone1@gmail.com>
-
Julien Chaumond authored
-
Lysandre authored
-
Lysandre authored
cc @julien-c
-
Lysandre authored
-
Lysandre authored
The FlauBERT configuration class inherits from XLMConfig, so when loading through AutoModels it is recognized as an XLM configuration: XLMConfig is checked before FlaubertConfig. Changing the order solves the problem, but a test should be added.
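Because a FlaubertConfig instance is also an XLMConfig instance, any first-match lookup over config classes must test the subclass first. A minimal sketch of the ordering bug (toy classes and a made-up `resolve` helper mirroring the real names):

```python
# Toy reproduction: FlaubertConfig subclasses XLMConfig, so an isinstance
# check against XLMConfig matches both and shadows the subclass if it
# comes first in the lookup order.
class XLMConfig:
    pass

class FlaubertConfig(XLMConfig):
    pass

def resolve(config, ordered_checks):
    for cls, name in ordered_checks:
        if isinstance(config, cls):
            return name

wrong_order = [(XLMConfig, "xlm"), (FlaubertConfig, "flaubert")]
right_order = [(FlaubertConfig, "flaubert"), (XLMConfig, "xlm")]

cfg = FlaubertConfig()
print(resolve(cfg, wrong_order))  # xlm -- the bug described above
print(resolve(cfg, right_order))  # flaubert -- subclass checked first
```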
-
Lysandre authored
-
Arnaud authored
-
Lysandre authored
-
Lysandre authored
-
Lysandre authored
-
Julien Chaumond authored
* [Umberto] model shortcuts
  cc @loretoparisi @simonefrancia see #2485
* Ensure that tokenizers will be correctly configured
-
Julien Chaumond authored
-
Julien Chaumond authored
-
Julien Chaumond authored
-
Julien Chaumond authored
cc @lysandrejik
-
- 30 Jan, 2020 11 commits
-
-
Jared Nielsen authored
-
Lysandre authored
-
Julien Chaumond authored
* fill_mask helper
* [poc] FillMaskPipeline
* Revert "[poc] FillMaskPipeline"
  This reverts commit 67eeea55b0f97b46c2b828de0f4ee97d87338335.
* Revert "fill_mask helper"
  This reverts commit cacc17b884e14bb6b07989110ffe884ad9e36eaa.
* README: clarify that Pipelines can also do text-classification
  cf. question at the AI&ML meetup last week, @mfuntowicz
* Fix test: test feature-extraction pipeline
* Test tweaks
* Slight refactor of existing pipeline (in preparation of new FillMaskPipeline)
* Extraneous doc
* More robust way of doing this
  @mfuntowicz as we don't rely on the model name anymore (see AutoConfig)
* Also add RobertaConfig as a quickfix for wrong token_type_ids
* cs
* [BIG] FillMaskPipeline
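Conceptually, a fill-mask pipeline scores candidate tokens for a masked slot and splices the top candidates back into the text. A toy sketch of that idea (pure Python; the function, scores, and mask token here are made up, while the real pipeline gets scores from a masked language model):

```python
# Hypothetical sketch of the fill-mask idea: rank candidates for the
# masked slot by score and return the completed sentences.
def fill_mask(text, mask_token, scored_candidates, top_k=2):
    ranked = sorted(scored_candidates.items(), key=lambda kv: kv[1], reverse=True)
    return [(text.replace(mask_token, tok), score) for tok, score in ranked[:top_k]]

scores = {"capital": 0.91, "heart": 0.05, "center": 0.03}
print(fill_mask("Paris is the <mask> of France.", "<mask>", scores))
```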
-
Hang Le authored
-
Lysandre authored
-
Lysandre authored
-
Lysandre authored
-
Lysandre authored
-
Lysandre authored
-
Hang Le authored
-
Peter Izsak authored
-