RoBERTa
----------------------------------------------------

The RoBERTa model was proposed in `RoBERTa: A Robustly Optimized BERT Pretraining Approach <https://arxiv.org/abs/1907.11692>`_
by Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer,
Veselin Stoyanov. It is based on Google's BERT model released in 2018.

It builds on BERT and modifies key hyperparameters, removing the next-sentence pretraining
objective and training with much larger mini-batches and learning rates.

The abstract from the paper is the following:

*Language model pretraining has led to significant performance gains but careful comparison between different
approaches is challenging. Training is computationally expensive, often done on private datasets of different sizes,
and, as we will show, hyperparameter choices have significant impact on the final results. We present a replication
study of BERT pretraining (Devlin et al., 2019) that carefully measures the impact of many key hyperparameters and
training data size. We find that BERT was significantly undertrained, and can match or exceed the performance of
every model published after it. Our best model achieves state-of-the-art results on GLUE, RACE and SQuAD. These
results highlight the importance of previously overlooked design choices, and raise questions about the source
of recently reported improvements. We release our models and code.*

Tips:

- This implementation is the same as :class:`~transformers.BertModel` with a tiny tweak to the embeddings as well as
  a setup for the RoBERTa pretrained models.
- RoBERTa has the same architecture as BERT but uses byte-level BPE as a tokenizer (the same as GPT-2) and a
  different pretraining scheme.
- RoBERTa doesn't have ``token_type_ids``, so you don't need to indicate which token belongs to which segment. Just
  separate your segments with the separation token ``tokenizer.sep_token`` (or ``</s>``), as sketched after this list.
- `CamemBERT <./camembert.html>`__ is a wrapper around RoBERTa. Refer to its documentation page for usage examples.
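
For example, a sentence pair can be encoded without ``token_type_ids`` as follows (a minimal sketch, assuming the
``roberta-base`` checkpoint):

.. code-block:: python

    from transformers import RobertaTokenizer

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')

    # Passing both segments at once lets the tokenizer insert the separation
    # tokens itself; no token_type_ids are produced or required.
    input_ids = tokenizer.encode("How old are you?", "I'm six years old.")

    # RoBERTa's pair format places two separation tokens between the segments.
    print(tokenizer.decode(input_ids))
    print(tokenizer.sep_token)  # '</s>'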

RobertaConfig
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaConfig
    :members:


RobertaTokenizer
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaTokenizer
    :members:


RobertaModel
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaModel
    :members:


RobertaForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForMaskedLM
    :members:
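
A minimal sketch of filling a masked position with this model (assuming PyTorch and the ``roberta-base`` checkpoint;
the exact prediction is illustrative):

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaForMaskedLM

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaForMaskedLM.from_pretrained('roberta-base')

    # RoBERTa's mask token is available as tokenizer.mask_token ('<mask>')
    text = "The capital of France is " + tokenizer.mask_token + "."
    input_ids = tokenizer.encode(text, return_tensors='pt')

    with torch.no_grad():
        prediction_scores = model(input_ids)[0]  # (batch, seq_len, vocab_size)

    # Highest-scoring replacement for the masked position
    mask_index = (input_ids[0] == tokenizer.mask_token_id).nonzero()[0].item()
    predicted_id = prediction_scores[0, mask_index].argmax().item()
    print(tokenizer.decode([predicted_id]))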


RobertaForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForSequenceClassification
    :members:
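
A minimal sketch of a forward pass with labels, which returns the classification loss alongside the logits (assuming
PyTorch and the ``roberta-base`` checkpoint; the classification head is randomly initialized until fine-tuned):

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaForSequenceClassification

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaForSequenceClassification.from_pretrained('roberta-base')

    input_ids = tokenizer.encode("Hello, my dog is cute", return_tensors='pt')
    labels = torch.tensor([1])  # shape (batch_size,)

    outputs = model(input_ids, labels=labels)
    loss, logits = outputs[0], outputs[1]
    loss.backward()  # e.g. inside a fine-tuning loop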


RobertaForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForTokenClassification
    :members:

TFRobertaModel
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaModel
    :members:
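
A minimal sketch of extracting the last hidden states with the TensorFlow model (assuming the ``roberta-base``
checkpoint):

.. code-block:: python

    import tensorflow as tf
    from transformers import RobertaTokenizer, TFRobertaModel

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = TFRobertaModel.from_pretrained('roberta-base')

    # Add a batch dimension to the encoded ids
    input_ids = tf.constant(tokenizer.encode("Hello, my dog is cute"))[None, :]

    outputs = model(input_ids)
    last_hidden_states = outputs[0]  # (batch_size, sequence_length, hidden_size)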


TFRobertaForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForMaskedLM
    :members:


TFRobertaForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForSequenceClassification
    :members:


TFRobertaForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForTokenClassification
    :members: