"test/srt/models/test_nvidia_nemotron_nano_v2.py" did not exist on "ae7ee01a8e59f755d47426c4b08641053b765a89"
roberta-large-mnli-README.md 583 Bytes
Newer Older
1
2
3
---
license: mit
widget:
- text: "I like you. </s></s> I love you."
---


## roberta-large-mnli

Trained by Facebook; see the [original source](https://github.com/pytorch/fairseq/tree/master/examples/roberta).

```bibtex
@article{liu2019roberta,
    title = {RoBERTa: A Robustly Optimized BERT Pretraining Approach},
    author = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and
              Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and
              Luke Zettlemoyer and Veselin Stoyanov},
    journal = {arXiv preprint arXiv:1907.11692},
    year = {2019},
}
```
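The widget example in the frontmatter follows the MNLI sentence-pair format, with the premise and hypothesis joined by the `</s></s>` separator. As a minimal usage sketch (assuming the model is loaded from the `roberta-large-mnli` hub id with the Hugging Face `transformers` library), scoring a premise/hypothesis pair might look like this:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")

# Passing the premise and hypothesis as a pair inserts the </s></s>
# separator shown in the widget example automatically.
inputs = tokenizer("I like you.", "I love you.", return_tensors="pt")
logits = model(**inputs).logits
probs = logits.softmax(dim=-1)[0]

# Label names (contradiction / neutral / entailment) are read from the
# model config rather than hard-coded here.
for i, p in enumerate(probs.tolist()):
    print(f"{model.config.id2label[i]}: {p:.3f}")
```

Because the model is fine-tuned on MNLI, it can also serve as the backbone of the `zero-shot-classification` pipeline in `transformers`.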