- 15 Jul, 2020 (3 commits)
-
Funtowicz Morgan authored
Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>
-
Sam Shleifer authored
-
Patrick von Platen authored
* fix auto model causal lm
* leverage given functionality
* apply unused kwargs to all auto models
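For context, a minimal sketch of the auto causal-LM path in use (the gpt2 checkpoint and the output_attentions kwarg are illustrative assumptions, not taken from the commit); keyword arguments that from_pretrained does not consume are forwarded to the underlying config:

```python
# Hedged sketch only: load a causal LM through the auto classes; extra kwargs
# such as output_attentions are passed through to the model's config.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed checkpoint
model = AutoModelForCausalLM.from_pretrained("gpt2", output_attentions=True)

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
```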
-
- 14 Jul, 2020 (14 commits)
-
Julien Chaumond authored
-
Bashar Talafha authored
-
Bashar Talafha authored
-
Joe Davison authored
-
Manuel Romero authored
* Customize inference widget input
* Update model_cards/mrm8488/RuPERTa-base/README.md
Co-authored-by: Kevin Canwen Xu <canwenxu@126.com>
-
dartrevan authored
-
Doron Adler authored
Model card for hewiki-articles-distilGPT2py-il
A tiny GPT2 model for generating Hebrew text
-
Pierre Guillou authored
-
Sam Shleifer authored
-
Sam Shleifer authored
-
Boris Dayma authored
* docs(wandb): explain how to use W&B integration (fix #5262)
* Also mention TensorBoard
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
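As a rough sketch of the workflow those docs describe (the project name below is a placeholder and the exact knobs may vary by version): with wandb installed, the Trainer reports metrics automatically, and a few environment variables control the run.

```python
# Hedged sketch, not copied from the docs: environment variables commonly used by
# the W&B integration; the project name is hypothetical.
import os

os.environ["WANDB_PROJECT"] = "my-transformers-runs"  # hypothetical project name
os.environ["WANDB_WATCH"] = "gradients"               # also log gradient histograms
# os.environ["WANDB_DISABLED"] = "true"               # uncomment to opt out entirely

from transformers import Trainer, TrainingArguments

args = TrainingArguments(output_dir="out", logging_steps=50)
# trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
# trainer.train()  # metrics then appear in the W&B dashboard
```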
-
Gunnlaugur Thor Briem authored
-
as-stevens authored
[Reformer classification head] Implement the reformer model classification head for text classification (#5198)
* Reformer model classification head implementation for text classification
* Reformat the reformer model classification code
* PR review comments, and test case implementation for reformer classification head changes
* CI/CD reformer classification head test import error fix
* CI/CD test case implementation: added ReformerForSequenceClassification to all_model_classes
* Code formatting fixed
* Normal test cases added for reformer classification head
* Fix test case implementation for the reformer classification head
* removed token_type_id parameter from the reformer classification head
* fixed the test case for reformer classification head
* merge conflict with master fixed
* merge conflict: changed reformer classification to accept the choice_label parameter added in latest code
* refactored the reformer classification head test code
* reformer classification head, common transform test cases fixed
* final set of review comments: rearranged the reformer classes and added a docstring to the classification forward method
* fixed the compilation error and test case fix for reformer classification head
* Apply suggestions from code review: remove unnecessary dup
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
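A minimal usage sketch of the class this PR adds; the base checkpoint and num_labels are assumptions for illustration, and the classification head stays randomly initialized until it is fine-tuned:

```python
# Hedged sketch: ReformerForSequenceClassification comes from #5198; the checkpoint
# and label count below are illustrative assumptions.
import torch
from transformers import ReformerForSequenceClassification, ReformerTokenizer

model_name = "google/reformer-crime-and-punishment"  # assumed base checkpoint
tokenizer = ReformerTokenizer.from_pretrained(model_name)
model = ReformerForSequenceClassification.from_pretrained(model_name, num_labels=2)
model.eval()

inputs = tokenizer("It was the best of times.", return_tensors="pt")
with torch.no_grad():
    # note: the head takes no token_type_ids, per the commit above
    outputs = model(input_ids=inputs["input_ids"], attention_mask=inputs["attention_mask"])
```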
-
Gaurav Mishra authored
Minor doc fix.
-
- 13 Jul, 2020 (7 commits)
-
Sam Shleifer authored
-
Stas Bekman authored
* implement FlaubertForTokenClassification as a subclass of XLMForTokenClassification
* fix mapping order
* add the doc
* add common tests
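Roughly the subclassing pattern the first bullet describes (paraphrased, not the exact source): Flaubert reuses the XLM token-classification head and only swaps in its own config class and encoder.

```python
# Hedged sketch of the pattern; attribute names follow the XLM/Flaubert modules but
# are paraphrased rather than copied from the commit.
from transformers import FlaubertConfig, FlaubertModel, XLMForTokenClassification


class FlaubertForTokenClassification(XLMForTokenClassification):
    config_class = FlaubertConfig

    def __init__(self, config):
        super().__init__(config)
        self.transformer = FlaubertModel(config)  # swap the XLM encoder for Flaubert's
        self.init_weights()
```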
-
Patrick von Platen authored
* fix longformer global attention output
* fix multi gpu problem
* replace -10000 with 0
* better comment
* make attention output equal local and global
* Update src/transformers/modeling_longformer.py
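For orientation, a minimal sketch of requesting global attention and attention outputs from Longformer (the checkpoint and the choice of globally attending token are illustrative assumptions, not taken from the commit):

```python
# Hedged sketch: mark the first token as globally attending and ask for attentions.
import torch
from transformers import LongformerModel, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer("Long documents go here.", return_tensors="pt")
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1  # give the <s> token global attention

outputs = model(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    output_attentions=True,
)
```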
-
Sylvain Gugger authored
* Fix Trainer in DataParallel setting
* Fix typo
Co-authored-by: Sam Shleifer <sshleifer@gmail.com>
-
Stas Bekman authored
-
Stas Bekman authored
-
onepointconsulting authored
Added a general description, information about the tags, and some example usage code.
-
- 12 Jul, 2020 (1 commit)
-
Kevin Canwen Xu authored
* Add model type check for pipelines
* Add model type check for pipelines
* rename func
* Fix the init parameters
* Fix format
* rollback unnecessary refactor
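Illustratively, this is the call path the check guards (the task and checkpoint below are standard examples, not taken from the commit): pipeline() now verifies that the supplied model's type actually supports the requested task.

```python
# Hedged sketch: a model whose architecture matches the task loads normally; an
# incompatible model type is rejected by the pipeline factory.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed checkpoint
)
print(classifier("Pipelines now validate the model type."))
```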
-
- 11 Jul, 2020 (1 commit)
-
Kevin Canwen Xu authored
* Add Microsoft's CodeBERT
* link style
* single modal
* unused import
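A possible usage sketch for the checkpoint the model card covers (not taken from the card itself; the example input is arbitrary):

```python
# Hedged sketch: load the CodeBERT encoder through the auto classes and embed a snippet.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")

inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")
outputs = model(**inputs)  # last hidden states for the code tokens
```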
-
- 10 Jul, 2020 (14 commits)
-
Sylvain Gugger authored
* Document model outputs
* Update docs/source/main_classes/output.rst
Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
-
Sylvain Gugger authored
-
Tomo Lazovich authored
-
Patrick von Platen authored
-
Julien Chaumond authored
Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
-
Bashar Talafha authored
* Update README.md
* Update README.md
-
Manuel Romero authored
-
kolk authored
-
Txus authored
-
Teven authored
Fixed use of memories in XLNet (caching for language generation + warning when loading an improper memoryless model) (#5632)
* Pytorch gpu => cpu proper device
* Memoryless XLNet warning + fixed memories during generation
* Revert "Pytorch gpu => cpu proper device" (reverts commit 93489b36)
* made black happy
* TF generation with memories
* dim => axis
* added padding_text to TF XL models
* Added comment, added TF
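A brief sketch of what enabling XLNet's memory caching looks like from the user side (the checkpoint, mem_len value, and prompt are illustrative assumptions, not taken from the commit):

```python
# Hedged sketch: set mem_len on the config so generation can reuse cached memories.
from transformers import XLNetLMHeadModel, XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetLMHeadModel.from_pretrained("xlnet-base-cased", mem_len=1024)

inputs = tokenizer("The history of natural language processing", return_tensors="pt")
generated = model.generate(inputs["input_ids"], max_length=50, do_sample=True)
print(tokenizer.decode(generated[0]))
```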
-
Manuel Romero authored
-
Manuel Romero authored
Create model card for T5-small fine-tuned on SQuAD v2
-
Nils Reimers authored
Model card for sentence-transformers/bert-base-nli-cls-token
-
Nils Reimers authored
Model card for sentence-transformers/bert-base-nli-max-tokens
-