Commit ea1a410d authored by Myle Ott, committed by Facebook Github Bot

RoBERTa now supported on TPU and TensorFlow via transformers library

Summary: Pull Request resolved: https://github.com/pytorch/fairseq/pull/1197

Differential Revision: D17651374

Pulled By: myleott

fbshipit-source-id: 5feb986de1e682eb83c4479f419ad51325718572
parent 1cb267ed
@@ -8,6 +8,7 @@ RoBERTa iterates on BERT's pretraining procedure, including training the model l
### What's New:
- September 2019: TensorFlow and TPU support via the [transformers library](https://github.com/huggingface/transformers).
- August 2019: RoBERTa is now supported in the [pytorch-transformers library](https://github.com/huggingface/pytorch-transformers).
- August 2019: Added [tutorial for finetuning on WinoGrande](https://github.com/pytorch/fairseq/tree/master/examples/roberta/wsc#roberta-training-on-winogrande-dataset).
- August 2019: Added [tutorial for pretraining RoBERTa using your own data](README.pretraining.md).