"tests/test_modeling_distilbert.py" did not exist on "9d0a11a68c8ec41a36de3bd5f20b5f083ea4c59e"
- 31 Jan, 2020: 5 commits

Julien Chaumond authored
* [Umberto] model shortcuts (cc @loretoparisi @simonefrancia; see #2485)
* Ensure that tokenizers will be correctly configured
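As context for the shortcut commit above, here is a minimal sketch of loading an Umberto checkpoint through the Auto classes; the model identifier below is an assumption for illustration, not taken from the commit itself.

```python
# A minimal sketch, assuming the Umberto shortcuts follow the usual
# Auto* pattern; the model id below is an assumption, not from the commit.
from transformers import AutoModel, AutoTokenizer

model_id = "Musixmatch/umberto-commoncrawl-cased-v1"  # assumed shortcut name
tokenizer = AutoTokenizer.from_pretrained(model_id)  # picks up the shipped tokenizer config
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Umberto scrive in italiano.", return_tensors="pt")
outputs = model(**inputs)  # hidden states for the encoded sentence
```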

Julien Chaumond authored

Julien Chaumond authored

Julien Chaumond authored

Julien Chaumond authored
cc @lysandrejik

- 30 Jan, 2020: 11 commits

Jared Nielsen authored

Lysandre authored

Julien Chaumond authored
* fill_mask helper
* [poc] FillMaskPipeline
* Revert "[poc] FillMaskPipeline" (reverts commit 67eeea55b0f97b46c2b828de0f4ee97d87338335)
* Revert "fill_mask helper" (reverts commit cacc17b884e14bb6b07989110ffe884ad9e36eaa)
* README: clarify that Pipelines can also do text-classification (cf. a question at the AI & ML meetup last week, @mfuntowicz)
* Fix test: test the feature-extraction pipeline
* Test tweaks
* Slight refactor of the existing pipeline (in preparation for the new FillMaskPipeline)
* Extraneous doc
* More robust way of doing this, @mfuntowicz, as we no longer rely on the model name (see AutoConfig)
* Also add RobertaConfig as a quick fix for wrong token_type_ids
* cs
* [BIG] FillMaskPipeline
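Since this commit introduces FillMaskPipeline, a short usage sketch may help; the checkpoint name is an example, and the exact keys of the returned dicts have varied across library versions.

```python
# A minimal sketch of the fill-mask pipeline introduced above; the
# checkpoint is an example, any masked-LM checkpoint should work.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="distilroberta-base")

# The input must contain the tokenizer's mask token (here "<mask>");
# the pipeline returns the top candidate fills with scores.
for candidate in fill_mask("Paris is the <mask> of France."):
    print(candidate["score"], candidate["sequence"])
```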

Hang Le authored

Lysandre authored

Lysandre authored

Lysandre authored

Lysandre authored

Lysandre authored

Hang Le authored

Peter Izsak authored

- 29 Jan, 2020: 14 commits

Bram Vanroy authored
Requesting pad_token_id would trigger an error message when it is None. Use the private _pad_token attribute instead.

BramVanroy authored
In batch_encode_plus we have to ensure that the tokenizer has a pad_token_id, so that no None values are added as padding. That would otherwise happen with gpt2, openai-gpt, and transfo-xl, which define no padding token. Closes https://github.com/huggingface/transformers/issues/2640
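A sketch of the guard the two commits above describe; this is an illustration of the idea, not the actual code of batch_encode_plus.

```python
# Illustrative sketch of the padding guard described above, not the
# actual implementation in the library.
def pad_batch(tokenizer, batch_ids):
    # Check the private attribute: requesting tokenizer.pad_token_id when
    # no pad token is set would log an error and yield None.
    if tokenizer._pad_token is None:
        raise ValueError(
            "This tokenizer has no padding token (e.g. gpt2, openai-gpt, "
            "transfo-xl); set tokenizer.pad_token before padding."
        )
    max_len = max(len(ids) for ids in batch_ids)
    pad_id = tokenizer.pad_token_id
    return [ids + [pad_id] * (max_len - len(ids)) for ids in batch_ids]
```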

Lysandre authored

Lysandre authored

Jared Nielsen authored

Lysandre authored

Julien Plu authored

Julien Plu authored

Julien Plu authored

Julien Plu authored

Lysandre authored

Lysandre authored

Julien Plu authored

Julien Plu authored

- 28 Jan, 2020: 10 commits

BramVanroy authored
- Mostly stylistic streamlining
- Removed "additional context" sections: they seem to be rarely used and might cause confusion. If more details are needed, users can add them to the "details" section.

BramVanroy authored

BramVanroy authored

BramVanroy authored
Motivate users to @-tag the authors of models to increase visibility and help grow the community.

BramVanroy authored
- Change references to pytorch-transformers to transformers
- Link to the code formatting guidelines

BramVanroy authored
- Add a "your contribution" section
- Add a code formatting link to "additional context"

BramVanroy authored
Prefer that general questions be asked on Stack Overflow.

BramVanroy authored
Streamline usages of pytorch-transformers and pytorch-pretrained-bert, and add a link to the README for the migration guide.

Lysandre authored

Lysandre authored
cc @julien-c @thomwolf