- 16 Jul, 2020 5 commits
- Patrick von Platen authored
- Patrick von Platen authored
- Martin Müller authored
- Julien Chaumond authored
- HuYong authored
  * ERNIE model card
  * Update Readme.md
  * Update Readme.md
  * Update Readme.md
  * Rename Readme.md to README.md
  * Update README.md
  * Update Readme.md
  * Update README.md
  * Rename Readme.md to README.md
  * Update Readme.md
  * Update Readme.md
  * Rename Readme.md to README.md
  * Update and rename Readme.md to README.md
  Co-authored-by: Kevin Canwen Xu <canwenxu@126.com>
- 15 Jul, 2020 14 commits
- Clement authored
- Clement authored
- Clement authored
- Manuel Romero authored
  Add cherry-picked example for the widget
  Co-authored-by: Julien Chaumond <chaumond@gmail.com>
- Manuel Romero authored
  * Create README.md
  * Update model_cards/mrm8488/RoBasquERTa/README.md
  Co-authored-by: Julien Chaumond <chaumond@gmail.com>
- Manuel Romero authored
  * Create README.md
  * Apply suggestions from code review
  Co-authored-by: Julien Chaumond <chaumond@gmail.com>
- Julien Chaumond authored
- Pierre Guillou authored
  * metadata
  * Update model_cards/pierreguillou/gpt2-small-portuguese/README.md
  Co-authored-by: Julien Chaumond <chaumond@gmail.com>
- Julien Chaumond authored
  cc @rodrigonogueira4 @abiocapsouza @robertoalotufo Also cc @piegu. Thank you :)
- Sam Shleifer authored
- Julien Chaumond authored
- Funtowicz Morgan authored
  Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>
- Sam Shleifer authored
- Patrick von Platen authored
  * fix auto model causal lm
  * leverage given functionality
  * apply unused kwargs to all auto models
- 14 Jul, 2020 14 commits
- Julien Chaumond authored
- Bashar Talafha authored
- Bashar Talafha authored
- Joe Davison authored
- Manuel Romero authored
  * Customize inference widget input
  * Update model_cards/mrm8488/RuPERTa-base/README.md
  Co-authored-by: Kevin Canwen Xu <canwenxu@126.com>
- dartrevan authored
- Doron Adler authored
  Model card for hewiki-articles-distilGPT2py-il, a tiny GPT2 model for generating Hebrew text
- Pierre Guillou authored
- Sam Shleifer authored
- Sam Shleifer authored
- Boris Dayma authored
  * docs(wandb): explain how to use W&B integration, fix #5262
  * Also mention TensorBoard
  Co-authored-by: Julien Chaumond <chaumond@gmail.com>
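The W&B docs commit above covers configuring the integration for `Trainer`-based scripts, which is driven by environment variables. A minimal sketch, assuming the variable names documented for the integration around this release (the project name below is purely illustrative):

```python
import os

# The transformers W&B integration reads its configuration from the environment.
os.environ["WANDB_PROJECT"] = "my-project"   # project to log runs under (illustrative name)
os.environ["WANDB_WATCH"] = "gradients"      # also log model gradients during training
# os.environ["WANDB_DISABLED"] = "true"      # uncomment to turn the integration off entirely
```

These must be set before the training script starts so the logging callback picks them up; verify the exact variable names against the current documentation.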
- Gunnlaugur Thor Briem authored
- as-stevens authored
  [Reformer classification head] Implement the reformer model classification head for text classification (#5198)
  * Reformer model head classification implementation for text classification
  * Reformat the reformer model classification code
  * PR review comments, and test case implementation for reformer for classification head changes
  * CI/CD reformer for classification head test import error fix
  * CI/CD test case implementation added ReformerForSequenceClassification to all_model_classes
  * Code formatting fixed
  * Normal test cases added for reformer classification head
  * Fix test cases implementation for the reformer classification head
  * Removed token_type_id parameter from the reformer classification head
  * Fixed the test case for reformer classification head
  * Merge conflict with master fixed
  * Merge conflict, changed reformer classification to accept the choice_label parameter added in latest code
  * Refactored the reformer classification head test code
  * Reformer classification head, common transform test cases fixed
  * Final set of review comments: rearranged the reformer classes and added a docstring to the classification forward method
  * Fixed the compilation error and test case fix for reformer classification head
  * Apply suggestions from code review: remove unnecessary dup
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- Gaurav Mishra authored
  Minor doc fix.
- 13 Jul, 2020 7 commits
- Sam Shleifer authored
- Stas Bekman authored
  * implement FlaubertForTokenClassification as a subclass of XLMForTokenClassification
  * fix mapping order
  * add the doc
  * add common tests
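The Flaubert commit above uses a pattern common in the library: the new class subclasses the XLM implementation, overrides only the config class, and inherits the forward pass unchanged. A purely illustrative sketch of that pattern (the class bodies here are stand-ins, not the library's real code):

```python
class XLMForTokenClassification:
    """Stand-in for the parent model carrying the full implementation."""
    config_class = "XLMConfig"

    def forward(self, tokens):
        # the parent implements the token-classification head once;
        # here we just tag every token with a placeholder label
        return [(tok, "O") for tok in tokens]


class FlaubertForTokenClassification(XLMForTokenClassification):
    """Subclass reuses forward() unchanged; only the config type differs."""
    config_class = "FlaubertConfig"


# the subclass behaves identically to the parent apart from its config class
model = FlaubertForTokenClassification()
```

The payoff of this design is that bug fixes to the parent's forward pass automatically reach the subclass.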
- Patrick von Platen authored
  * fix longformer global attention output
  * fix multi gpu problem
  * replace -10000 with 0
  * better comment
  * make attention output equal local and global
  * Update src/transformers/modeling_longformer.py
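The longformer commit above adjusts how masked attention scores are handled before the softmax. As a general illustration of the additive-masking idea it touches (not the library's actual code): masked positions receive a large negative offset such as -10000 so they end up with near-zero probability after the softmax.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of floats
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def masked_softmax(scores, mask, neg=-10000.0):
    # positions where mask is 0 get a large negative offset added,
    # so they receive (near-)zero attention probability
    shifted = [s if keep else s + neg for s, keep in zip(scores, mask)]
    return softmax(shifted)

probs = masked_softmax([2.0, 1.0, 3.0], [1, 1, 0])
# the masked third position gets essentially zero weight
```

The commit's "replace -10000 with 0" refers to post-processing of attention *outputs* rather than this pre-softmax offset; the sketch only shows why such sentinel values appear in attention code at all.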
- Sylvain Gugger authored
  * Fix Trainer in DataParallel setting
  * Fix typo
  Co-authored-by: Sam Shleifer <sshleifer@gmail.com>
- Stas Bekman authored
- Stas Bekman authored
- onepointconsulting authored
  Added a general description, information about the tags, and some example usage code.