- 30 Jan, 2020 8 commits
- 29 Jan, 2020 14 commits
-
Bram Vanroy authored
Requesting pad_token_id would log an error message when it is None. Use the private _pad_token instead.
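A minimal sketch of the behavior this works around (assuming the transformers 2.x tokenizer API; the gpt2 checkpoint is just an example):

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# GPT-2 ships without a pad token. The public pad_token_id property
# logs an error when the token is unset, so internal code can check
# the private _pad_token attribute instead, which stays silent.
if tokenizer._pad_token is None:
    print("pad token not set; skip padding logic")
else:
    print("pad token id:", tokenizer.pad_token_id)
```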
-
BramVanroy authored
In batch_encode_plus we have to ensure that the tokenizer has a pad_token_id so that, when padding, no None values are added as padding. That would happen with gpt2, openai-gpt, and transfo-xl. Closes https://github.com/huggingface/transformers/issues/2640
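A sketch of the failure mode and the usual explicit workaround (assuming the batch_encode_plus API as of transformers 2.x; the model name and sentences are illustrative):

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# GPT-2 defines no pad token, so padded positions would previously
# have been filled with None ids. A common explicit fix is to reuse
# the EOS token as the pad token before asking for padding:
tokenizer.pad_token = tokenizer.eos_token

batch = tokenizer.batch_encode_plus(
    ["a short sentence", "a noticeably longer sentence that forces padding"],
    pad_to_max_length=True,
)
print(batch["input_ids"])
```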
-
Lysandre authored
-
Lysandre authored
-
Jared Nielsen authored
-
Lysandre authored
-
Julien Plu authored
-
Julien Plu authored
-
Julien Plu authored
-
Julien Plu authored
-
Lysandre authored
-
Lysandre authored
-
Julien Plu authored
-
Julien Plu authored
- 28 Jan, 2020 15 commits
-
BramVanroy authored
- Mostly stylistic streamlining.
- Removed 'additional context' sections: they seem to be rarely used and might cause confusion. If more details are needed, users can add them to the 'details' section.
-
BramVanroy authored
-
BramVanroy authored
-
BramVanroy authored
Motivate users to @-tag authors of models to increase visibility and expand the community
-
BramVanroy authored
- Change references to pytorch-transformers to transformers.
- Link to the code formatting guidelines.
-
BramVanroy authored
- Add a 'your contribution' section.
- Add the code formatting link to 'additional context'.
-
BramVanroy authored
Prefer that general questions are asked on Stack Overflow
-
BramVanroy authored
Streamlines usages of pytorch-transformers and pytorch-pretrained-bert. Adds a link to the migration guide in the README.
-
Lysandre authored
-
Lysandre authored
cc @julien-c @thomwolf
-
Wietse de Vries authored
-
Wietse de Vries authored
-
Julien Chaumond authored
ping @lysandrejik
-
Julien Chaumond authored
-
Julien Chaumond authored
- 27 Jan, 2020 3 commits