- 29 May, 2020 (2 commits)
Patrick von Platen authored
* better api
* improve automatic setting of global attention mask
* fix longformer bug
* fix global attention mask in test
* fix global attn mask flatten
* fix slow tests
* update docstring
* update docs and make more robust
* improve attention mask
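A minimal usage sketch of the global attention behavior this commit refines, assuming the public Longformer interface (LongformerModel, LongformerTokenizer and a `global_attention_mask` keyword where 1 marks globally attending tokens and 0 keeps the local sliding-window pattern); the checkpoint name is illustrative and this is not the commit's own code:

```python
import torch
from transformers import LongformerModel, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer("Long documents need sparse attention patterns.", return_tensors="pt")

# 0 = sliding-window (local) attention, 1 = global attention for that token.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1  # e.g. give the <s> token global attention

outputs = model(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
)
sequence_output = outputs[0]
```

Tokens marked 1 attend to, and are attended by, every position, while all other tokens only see their local window.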
Patrick von Platen authored
* add multiple choice for longformer
* add models to docs
* adapt docstring
* add test to longformer
* add longformer for mc in init and modeling auto
* fix tests
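A hedged usage sketch for the new multiple-choice head, assuming the usual convention that LongformerForMultipleChoice expects inputs of shape (batch_size, num_choices, seq_len); the prompt, choices and label are placeholders:

```python
import torch
from transformers import LongformerForMultipleChoice, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerForMultipleChoice.from_pretrained("allenai/longformer-base-4096")

prompt = "Longformer handles long documents because"
choices = [
    "it uses sliding-window plus global attention.",
    "it truncates every input to 512 tokens.",
]

# Each choice is paired with the prompt; the head scores the choices jointly.
encoding = tokenizer([prompt] * len(choices), choices, return_tensors="pt", padding=True)
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}  # add the batch dimension

labels = torch.tensor([0])  # hypothetical: the first choice is correct
outputs = model(**inputs, labels=labels)
loss, logits = outputs[0], outputs[1]
```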
- 28 May, 2020 (1 commit)
Suraj Patil authored
- 27 May, 2020 (1 commit)
Suraj Patil authored
* LongformerForSequenceClassification
* better naming x => hidden_states, fix typo in doc
* Update src/transformers/modeling_longformer.py
* Update src/transformers/modeling_longformer.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
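A hedged usage sketch for the LongformerForSequenceClassification head added here, assuming the standard from_pretrained / labels interface; the checkpoint, num_labels and label value are placeholders, not anything fixed by the commit:

```python
import torch
from transformers import LongformerForSequenceClassification, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerForSequenceClassification.from_pretrained(
    "allenai/longformer-base-4096", num_labels=2  # hypothetical binary task
)

inputs = tokenizer("A very long document to classify ...", return_tensors="pt")
labels = torch.tensor([1])  # hypothetical gold label for the single example

outputs = model(**inputs, labels=labels)
loss, logits = outputs[0], outputs[1]
```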
- 25 May, 2020 (1 commit)
Suraj Patil authored
* added LongformerForQuestionAnswering
* add LongformerForQuestionAnswering
* fix import for LongformerForMaskedLM
* add LongformerForQuestionAnswering
* hardcoded sep_token_id
* compute attention_mask if not provided
* combine global_attention_mask with attention_mask when provided
* update example in docstring
* add assert error messages, better attention combine
* add test for LongformerForQuestionAnswering
* typo
* cast global_attention_mask to long
* make style
* Update src/transformers/configuration_longformer.py
* Update src/transformers/configuration_longformer.py
* fix the code quality
* Merge branch 'longformer-for-question-answering' of https://github.com/patil-suraj/transformers into longformer-for-question-answering

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
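A hedged sketch of the question-answering usage these bullets describe: per the commit, the hard-coded sep_token_id lets the model place global attention on the question tokens itself when no global_attention_mask is supplied. The TriviaQA-finetuned checkpoint name is an assumption, not something stated in the commit:

```python
import torch
from transformers import LongformerForQuestionAnswering, LongformerTokenizer

ckpt = "allenai/longformer-large-4096-finetuned-triviaqa"  # assumed checkpoint
tokenizer = LongformerTokenizer.from_pretrained(ckpt)
model = LongformerForQuestionAnswering.from_pretrained(ckpt)

question = "What replaces the quadratic attention pattern?"
context = (
    "Longformer replaces the quadratic self-attention of BERT with a sliding "
    "window plus a handful of globally attending tokens."
)

# Question and context are encoded as a pair; no global_attention_mask is passed,
# so the model builds one over the question tokens on its own.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
start_logits, end_logits = outputs[0], outputs[1]

start = int(start_logits.argmax())
end = int(end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0, start:end + 1])
print(answer)
```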
- 19 May, 2020 (2 commits)
Patrick von Platen authored
* fix gpu slow tests in pytorch
* change model to device syntax
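For reference, the `.to(device)` idiom the slow GPU tests are moved to, sketched in plain PyTorch with an assumed checkpoint and dummy input rather than the test suite's actual code:

```python
import torch
from transformers import LongformerModel

torch_device = "cuda" if torch.cuda.is_available() else "cpu"

# Move both the model and its inputs to the same device before the forward pass.
model = LongformerModel.from_pretrained("allenai/longformer-base-4096").to(torch_device)
input_ids = torch.tensor([[0, 20920, 2]]).to(torch_device)  # dummy token ids

with torch.no_grad():
    outputs = model(input_ids)
```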
Iz Beltagy authored
* first commit
* bug fixes
* better examples
* undo padding
* remove wrong VOCAB_FILES_NAMES
* License
* make style
* make isort happy
* unit tests
* integration test
* make `black` happy by undoing `isort` changes!!
* lint
* no need for the padding value
* batch_size not bsz
* remove unused type casting
* seqlen not seq_len
* staticmethod
* `bert` selfattention instead of `n2`
* uint8 instead of bool + lints
* pad inputs_embeds using embeddings not a constant
* black
* unit test with padding
* fix unit tests
* remove redundant unit test
* upload model weights
* resolve todo
* simpler _mask_invalid_locations without lru_cache + backward compatible masked_fill_
* increase unittest coverage
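Several of these items ("undo padding", "no need for the padding value", "pad inputs_embeds using embeddings not a constant") revolve around padding sequences up to a multiple of the attention window so that the sliding-window chunks ("`bert` selfattention instead of `n2`") line up. A hypothetical standalone helper illustrating that idea, not the model's internal implementation:

```python
import torch
import torch.nn.functional as F

def pad_to_window_multiple(input_ids, attention_mask, attention_window, pad_token_id):
    """Hypothetical helper: pad so seq_len is a multiple of the sliding-attention window."""
    seq_len = input_ids.shape[1]
    padding_len = (attention_window - seq_len % attention_window) % attention_window
    if padding_len > 0:
        # Padded positions get pad_token_id and attention_mask == 0, i.e. no attention.
        input_ids = F.pad(input_ids, (0, padding_len), value=pad_token_id)
        attention_mask = F.pad(attention_mask, (0, padding_len), value=0)
    return input_ids, attention_mask

# Example: a 700-token batch padded up to the next multiple of a 512-token window.
ids = torch.randint(5, 1000, (1, 700))
mask = torch.ones_like(ids)
ids, mask = pad_to_window_multiple(ids, mask, attention_window=512, pad_token_id=1)
print(ids.shape)  # torch.Size([1, 1024])
```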