- 01 Jun, 2020 5 commits
-
Sylvain Gugger authored
-
Lysandre authored
-
Julien Chaumond authored
-
Julien Chaumond authored
Fixes the bug reported in https://github.com/huggingface/transformers/issues/4669. See #3967 for context.
-
Rens authored
* pass on tokenizer to pipeline
* order input names when converting to ONNX
* update style
* remove unused imports
* make the ordered inputs list mutable
* add test with a custom bert model
* remove unused imports
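A minimal sketch of the input-ordering idea, with a hypothetical helper name (not the exact code merged): ONNX export passes inputs positionally, so the tokenizer's dict output is reordered to match the model's forward signature, and the result stays mutable so the export code can adjust it.

```python
import inspect

def ordered_onnx_inputs(model, encoded):
    """Order tokenizer outputs to match the model's forward signature (illustrative helper)."""
    forward_params = inspect.signature(model.forward).parameters
    # Keep only inputs the model actually accepts, in signature order; return a
    # mutable list so callers can append or drop entries before export.
    names = [name for name in forward_params if name in encoded]
    return names, [encoded[name] for name in names]
```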
-
- 29 May, 2020 7 commits
-
Patrick von Platen authored
* fix bug * add more tests
-
Wei Fang authored
* Fix longformer attention mask casting when using apex * remove extra type casting
-
Patrick von Platen authored
* better api
* improve automatic setting of global attention mask
* fix longformer bug
* fix global attention mask in test
* fix global attn mask flatten
* fix slow tests
* update docstring
* update docs and make more robust
* improve attention mask
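A hedged usage sketch of the API described above (checkpoint name and token choice are just examples): tokens marked with 1 in global_attention_mask attend to, and are attended by, the whole sequence, while the rest use local sliding-window attention.

```python
import torch
from transformers import LongformerModel, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer.encode_plus("Hello world!", return_tensors="pt")
# 0 = local (sliding window) attention, 1 = global attention
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1  # e.g. give the first (<s>) token global attention

outputs = model(inputs["input_ids"], global_attention_mask=global_attention_mask)
```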
-
Simon Böhm authored
Change the example code to use encode_plus, since the token_type_ids weren't being set correctly.
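A short illustration of why encode_plus matters here (checkpoint name is just an example): unlike plain encode, it returns the token_type_ids that distinguish the two segments of a sentence pair.

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer.encode_plus("How old are you?", "I'm six.")
print(encoded["input_ids"])       # ids for [CLS] question [SEP] answer [SEP]
print(encoded["token_type_ids"])  # 0s for the first segment, 1s for the second
```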
-
Zhangyx authored
-
Patrick von Platen authored
* add multiple choice for longformer
* add models to docs
* adapt docstring
* add test to longformer
* add longformer for mc in init and modeling auto
* fix tests
-
Iz Beltagy authored
* fix longformer model names in examples * a better name for the notebook
-
- 28 May, 2020 4 commits
-
flozi00 authored
* gpt2 typo * Add files via upload
-
Anthony MOI authored
-
Suraj Patil authored
-
Iz Beltagy authored
* adding freeze roberta models * model cards * lint
-
- 27 May, 2020 3 commits
-
Patrick von Platen authored
* improve memory benchmarking
* correct typo
* fix current memory
* check torch memory allocated
* better pytorch function
* add total cached gpu memory
* add total gpu required
* improve torch gpu usage
* update memory usage
* finalize memory tracing
* save intermediate benchmark class
* fix conflict
* improve benchmark
* improve benchmark
* finalize
* make style
* improve benchmarking
* correct typo
* make train function more flexible
* fix csv save
* better repr of bytes
* better print
* fix __repr__ bug
* finish plot script
* rename plot file
* delete csv and small improvements
* fix in plot
* fix in plot
* correct usage of timeit
* remove redundant line
* remove redundant line
* fix bug
* add hf parser tests
* add versioning and platform info
* make style
* add gpu information
* ensure backward compatibility
* finish adding all tests
* Update src/transformers/benchmark/benchmark_args.py
Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
* Update src/transformers/benchmark/benchmark_args_utils.py
Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
* delete csv files
* fix isort ordering
* add out of memory handling
* add better train memory handling
Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
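A rough sketch (an assumed helper, not the benchmark utilities' actual code) of the torch-native memory check the bullets above mention: reset the peak counter, run the workload, and read back the high-water mark PyTorch recorded.

```python
import torch

def peak_gpu_memory_mb(fn):
    """Run fn on GPU and return the peak memory PyTorch allocated, in MB."""
    torch.cuda.reset_peak_memory_stats()
    fn()
    torch.cuda.synchronize()  # make sure all kernels finished before reading
    return torch.cuda.max_memory_allocated() / 1024 ** 2
```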
-
Suraj Patil authored
* LongformerForSequenceClassification
* better naming x => hidden_states, fix typo in doc
* Update src/transformers/modeling_longformer.py
* Update src/transformers/modeling_longformer.py
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
Lysandre Debut authored
* per_device instead of per_gpu / error thrown when argument unknown
* [docs] Restore examples.md symlink
* Correct absolute links so that symlink to the doc works correctly
* Update src/transformers/hf_argparser.py
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
* Warning + reorder
* Docs
* Style
* not for squad
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
- 26 May, 2020 5 commits
-
Patrick von Platen authored
* revert convenience method * clean docs a bit
-
Bram Vanroy authored
* make transformers-cli cross-platform

Using "scripts" is a useful option in setup.py, particularly when you want access to non-Python scripts. However, in this case we want an entry point into some of our own Python scripts. To do this in a concise, cross-platform way, we can use entry_points.console_scripts. This change is necessary to provide the CLI on different platforms, which "scripts" does not ensure. Usage remains the same, but the "transformers-cli" script has to be moved (become part of the library) and renamed (underscore + extension).

* make style & quality
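A minimal setup.py sketch of the mechanism described (the module path is an assumption, not necessarily the one used): console_scripts generates a platform-appropriate transformers-cli launcher at install time.

```python
from setuptools import setup

setup(
    name="transformers",
    # console_scripts generates an executable wrapper on every platform,
    # unlike the "scripts" option, which ships the file as-is.
    # Module path below is illustrative.
    entry_points={
        "console_scripts": [
            "transformers-cli=transformers.commands.transformers_cli:main",
        ]
    },
)
```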
-
Patrick von Platen authored
* add new longformer for question answering model * add new config as well * fix links * fix links part 2
-
ZhuBaohe authored
* fix * fix1
-
ZhuBaohe authored
-
- 25 May, 2020 6 commits
-
Sam Shleifer authored
-
Patrick von Platen authored
* fix reformer num buckets * fix * adapt docs * set num buckets in config
-
Elman Mansimov authored
-
Suraj Patil authored
-
Sho Arora authored
-
Suraj Patil authored
* added LongformerForQuestionAnswering
* add LongformerForQuestionAnswering
* fix import for LongformerForMaskedLM
* add LongformerForQuestionAnswering
* hardcoded sep_token_id
* compute attention_mask if not provided
* combine global_attention_mask with attention_mask when provided
* update example in docstring
* add assert error messages, better attention combine
* add test for LongformerForQuestionAnswering
* typo
* cast global_attention_mask to long
* make style
* Update src/transformers/configuration_longformer.py
* Update src/transformers/configuration_longformer.py
* fix the code quality
* Merge branch 'longformer-for-question-answering' of https://github.com/patil-suraj/transformers into longformer-for-question-answering
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
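A hedged usage sketch of the new head (checkpoint name is illustrative): per the bullets above, the model derives the attention_mask and question-side global attention itself when they are not passed.

```python
import torch
from transformers import LongformerForQuestionAnswering, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerForQuestionAnswering.from_pretrained("allenai/longformer-base-4096")

question, context = "Who was Jim Henson?", "Jim Henson was a nice puppet."
inputs = tokenizer.encode_plus(question, context, return_tensors="pt")

# The head locates sep_token_id itself and puts global attention on the question.
start_logits, end_logits = model(**inputs)[:2]
answer_span = (torch.argmax(start_logits), torch.argmax(end_logits))
```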
-
- 23 May, 2020 1 commit
-
Bharat Raghunathan authored
-
- 22 May, 2020 9 commits
-
Bijay Gurung authored
* Add Type Hints to modeling_utils.py

Closes #3911. Add Type Hints to methods in `modeling_utils.py`. Note: the coverage isn't 100%; mostly skipped internal methods.

* Reformat according to `black` and `isort`
* Use typing.Iterable instead of Sequence
* Parameterize Iterable by its generic type
* Use typing.Optional when None is the default value
* Adhere to style guideline
* Update src/transformers/modeling_utils.py
* Update src/transformers/modeling_utils.py
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
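An illustrative toy function (hypothetical, not from the PR) showing the two conventions named above: a parameterized typing.Iterable instead of a bare Sequence, and typing.Optional wherever None is the default.

```python
from typing import Iterable, Optional

def total_parameters(layer_sizes: Iterable[int], cap: Optional[int] = None) -> int:
    """Toy example of the hinting style: sum layer sizes, optionally capping the result."""
    total = sum(layer_sizes)
    return total if cap is None else min(total, cap)
```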
-
Funtowicz Morgan authored
* Warn the user about max_len being on the path to be deprecated.
* Ensure better backward compatibility when max_len is provided to a tokenizer.
* Make sure to override the parameter and not the actual instance value.
* Format & quality
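A sketch of the backward-compatibility pattern described, with hypothetical names (the tokenizer's real attribute handling may differ): accept the legacy max_len, warn, and map it onto the newer parameter without mutating instance state.

```python
import warnings
from typing import Optional

def resolve_max_length(model_max_length: int, max_len: Optional[int] = None) -> int:
    # Hypothetical helper: names are illustrative, not the tokenizer's own.
    if max_len is not None:
        warnings.warn(
            "`max_len` is deprecated and will be removed; use `model_max_length`.",
            FutureWarning,
        )
        return max_len  # the explicit legacy value wins, preserving old behavior
    return model_max_length
```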
-
Sam Shleifer authored
* Fix pipelines defaults bug * one liner * style
-
Julien Chaumond authored
As discussed w/ @lysandrejik, packaging is maintained by PyPA (the Python Packaging Authority), and should be lightweight and stable.
-
Lysandre authored
-
Anthony MOI authored
-
Lysandre authored
-
Lysandre authored
-
Frankie Liuzzi authored
* added functionality for electra classification head
* unneeded dropout
* Test ELECTRA for sequence classification
* Style
Co-authored-by: Frankie <frankie@frase.io>
Co-authored-by: Lysandre <lysandre.debut@reseau.eseo.fr>
-