12 May, 2020 (6 commits)
-
Viktor Alm authored
* Catch gpu len 1 set to gpu0
* Add MPC to trainer
* Add MPC for TF
* Fix TF automodel for MPC and add Albert
* Apply style
* Fix import
* Note to self: double check
* Make shape None, None for datasetgenerator output shapes
* Add from_pt bool, which doesn't seem to work
* Original checkpoint dir
* Fix docstrings for automodel
* Update readme and apply style
* Colabs should probably not be from users
* Add colab
* Update README.md
* Cleanup __init__
* Cleanup flake8 trailing comma
* Update src/transformers/training_args_tf.py
* Update src/transformers/modeling_tf_auto.py

Co-authored-by: Viktor Alm <viktoralm@pop-os.localdomain>
Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
Levent Serinol authored
Fixed missing `torch` module import in example usage code
-
Jangwon Park authored
-
Lysandre Debut authored
* Pin TF to 2.1
* Pin flake8 as well
-
Julien Chaumond authored
-
Julien Chaumond authored
-
11 May, 2020 (17 commits)
-
Lysandre Debut authored
-
Bram Vanroy authored
* Simplify cache vars and allow for the TRANSFORMERS_CACHE env variable

  As it currently stands, `TRANSFORMERS_CACHE` is not an accepted variable. It seems these variables were not updated when moving from `pytorch_transformers` to `transformers`. In addition, the fallback procedure could be improved and simplified; `pathlib` seems redundant here.
* Update file_utils.py
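The fallback the commit describes amounts to an environment-variable chain. A minimal sketch, assuming the resolution order is `TRANSFORMERS_CACHE` first, then the legacy variables, then a default directory (the legacy variable names and default path are taken from the library's history and may differ from the merged code):

```python
import os

def resolve_cache_dir(environ=os.environ, default="~/.cache/torch/transformers"):
    """Return the first cache directory set in the environment, else the default.

    Sketch of the fallback chain: the newest variable wins, older ones are
    kept for backward compatibility with pytorch_transformers-era setups.
    """
    for var in ("TRANSFORMERS_CACHE",
                "PYTORCH_TRANSFORMERS_CACHE",
                "PYTORCH_PRETRAINED_BERT_CACHE"):
        value = environ.get(var)
        if value:
            return os.path.expanduser(value)
    return os.path.expanduser(default)
```

With this ordering, exporting `TRANSFORMERS_CACHE=/tmp/hf` before importing the library would take precedence over any legacy variable.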
-
Lysandre Debut authored
-
Tianlei Wu authored
* Allow GPT-2 to be exported to a valid ONNX model
* Cast size from int to float explicitly
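The int-to-float cast matters because at ONNX trace time a tensor size such as `size(-1)` is a Python int, and dividing by it can bake an integer operation into the exported graph. A minimal sketch of the idea using plain Python (the function name and list-based scores are hypothetical, not the library's code):

```python
def scale_attention_scores(scores, head_dim):
    """Divide raw attention scores by sqrt(head_dim).

    `head_dim` stands in for a value obtained via tensor.size(-1), which is
    an int at trace time; casting it to float before the division keeps the
    exported graph in floating point throughout.
    """
    return [s / (float(head_dim) ** 0.5) for s in scores]
```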
-
Guo, Quan authored
"Migrating from pytorch-transformers to transformers" is missing in the main document. It is available in the main `readme` thought. Just move it to the document.
-
Lysandre Debut authored
-
fgaim authored
* Add ALBERT to the convert command of transformers-cli
* Document ALBERT TF-to-PyTorch model conversion
-
Stefan Schweter authored
* docs: fix link to token classification (NER) example
* examples: fix links to NER scripts
-
Funtowicz Morgan authored
-
Sam Shleifer authored
-
Levent Serinol authored
* Create README.md
* Update model_cards/lserinol/bert-turkish-question-answering/README.md

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
Julien Plu authored
* Fix the accumulator so it runs properly with TF 2.2
* Apply style
* Fix training_args_tf for TF 2.2
* Fix the TF training args when only one GPU is available
* Remove the fixed version of TF in setup.py
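The accumulator being fixed follows a common pattern: sum per-step gradients over several micro-batches, apply them once, then clear. A framework-agnostic sketch of that pattern in plain Python (the class name mirrors the library's TF utility, but this is a simplified stand-in, not the merged implementation):

```python
class GradientAccumulator:
    """Accumulate per-step gradients and release the running sum on reset.

    Gradients are summed element-wise across calls; `step` counts how many
    micro-batches have been accumulated since the last reset.
    """

    def __init__(self):
        self._gradients = None
        self.step = 0

    def __call__(self, gradients):
        # First call stores the gradients; later calls add element-wise.
        if self._gradients is None:
            self._gradients = list(gradients)
        else:
            self._gradients = [a + g for a, g in zip(self._gradients, gradients)]
        self.step += 1

    @property
    def gradients(self):
        return self._gradients

    def reset(self):
        self._gradients = None
        self.step = 0
```

In a training loop, the optimizer would apply `accumulator.gradients` (optionally divided by `accumulator.step`) every N micro-batches, then call `reset()`.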
-
Savaş Yıldırım authored
-
Julien Plu authored
-
theblackcat102 authored
-
Patrick von Platen authored
* Adapt convert script
* Update convert script
* Finish
* Fix Marian pretrained docs
-
Patrick von Platen authored
-
10 May, 2020 (3 commits)
-
flozi00 authored
-
Sam Shleifer authored
- MarianSentencepieceTokenizer -> MarianTokenizer
- Start using unk token
- Add docs page
- Add better generation params to MarianConfig
- More conversion utilities
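"Start using unk token" refers to the usual tokenizer fallback: out-of-vocabulary tokens map to the unk id instead of raising. A minimal sketch of that lookup, with hypothetical names (`tokens_to_ids`, a plain dict vocab) rather than the MarianTokenizer's actual API:

```python
def tokens_to_ids(tokens, vocab, unk_token="<unk>"):
    """Map tokens to ids, substituting the unk id for any token
    not present in the vocabulary."""
    unk_id = vocab[unk_token]
    return [vocab.get(token, unk_id) for token in tokens]
```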
-
Girishkumar authored
-
08 May, 2020 (9 commits)
-
Julien Chaumond authored
* [TPU] Doc, fix xla_spawn.py, only preprocess dataset once
* Update examples/README.md
* [xla_spawn] Add `_mp_fn` to other Trainer scripts
* [TPU] Fix: eval dataloader was None
-
Julien Chaumond authored
-
Lorenzo De Mattei authored
* Example updated to use generation pipeline
* Update model_cards/LorenzoDeMattei/GePpeTto/README.md

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
-
rmroczkowski authored
-
rmroczkowski authored
-
Manuel Romero authored
-
Savaş Yıldırım authored
* Create README.md
* Add code fence around code block
-
Manuel Romero authored
Model card for my de novo drug discovery model using MLM
-
Patrick von Platen authored
* Fix PR
* Move tests to correct place
-
07 May, 2020 (5 commits)
-
Jared T Nielsen authored
* Add AlbertForPreTraining and TFAlbertForPreTraining models
* PyTorch conversion
* TensorFlow conversion
* Style

Co-authored-by: Lysandre <lysandre.debut@reseau.eseo.fr>
-
Julien Chaumond authored
-
Savaş Yıldırım authored
-
Julien Chaumond authored
-
Julien Chaumond authored
* README
* Update README.md
-