RoBERTa
----------------------------------------------------

The RoBERTa model was proposed in `RoBERTa: A Robustly Optimized BERT Pretraining Approach <https://arxiv.org/abs/1907.11692>`_
by Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer
and Veselin Stoyanov. It is based on Google's BERT model released in 2018.

It builds on BERT and modifies key hyperparameters, removing the next-sentence pretraining
objective and training with much larger mini-batches and learning rates.

The abstract from the paper is the following:

*Language model pretraining has led to significant performance gains but careful comparison between different
approaches is challenging. Training is computationally expensive, often done on private datasets of different sizes,
and, as we will show, hyperparameter choices have significant impact on the final results. We present a replication
study of BERT pretraining (Devlin et al., 2019) that carefully measures the impact of many key hyperparameters and
training data size. We find that BERT was significantly undertrained, and can match or exceed the performance of
every model published after it. Our best model achieves state-of-the-art results on GLUE, RACE and SQuAD. These
results highlight the importance of previously overlooked design choices, and raise questions about the source
of recently reported improvements. We release our models and code.*

Tips:

- This implementation is the same as :class:`~transformers.BertModel` with a minor tweak to the embeddings as well
  as a setup for the RoBERTa pretrained models.
- RoBERTa has the same architecture as BERT, but uses a byte-level BPE as its tokenizer (the same as GPT-2) and uses
  a different pretraining scheme.
- RoBERTa doesn't have ``token_type_ids``, so you don't need to indicate which token belongs to which segment. Just
  separate your segments with the separation token ``tokenizer.sep_token`` (or ``</s>``); see the example after this
  list.
- `Camembert <./camembert.html>`__ is a wrapper around RoBERTa. Refer to this page for usage examples.
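
A minimal sketch of the segment handling described in the tips above, using the publicly available
``roberta-base`` checkpoint (any other RoBERTa checkpoint behaves the same way):

.. code-block:: python

    from transformers import RobertaTokenizer

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')

    # RoBERTa has no token_type_ids: encoding a pair of segments simply inserts
    # the separator tokens between them, i.e. <s> A </s></s> B </s>
    input_ids = tokenizer.encode("First segment.", "Second segment.", add_special_tokens=True)
    print(tokenizer.decode(input_ids))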

RobertaConfig
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaConfig
    :members:


RobertaTokenizer
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaTokenizer
    :members: build_inputs_with_special_tokens, get_special_tokens_mask,
        create_token_type_ids_from_sequences, save_vocabulary


RobertaModel
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaModel
    :members:
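
The following is a short usage sketch rather than part of the generated reference; it uses the publicly available
``roberta-base`` checkpoint to extract the last hidden states:

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaModel

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaModel.from_pretrained('roberta-base')

    # Encode a single sentence, adding the <s> ... </s> special tokens.
    input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True)).unsqueeze(0)
    outputs = model(input_ids)
    last_hidden_states = outputs[0]  # shape (batch_size, sequence_length, hidden_size)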


RobertaForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForMaskedLM
    :members:
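
A hedged sketch of masked-token prediction with the ``roberta-base`` checkpoint; the sentence and the way the
predicted token is recovered are only illustrative:

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaForMaskedLM

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaForMaskedLM.from_pretrained('roberta-base')

    # Mask one token and let the language-modelling head predict it.
    text = "The capital of France is %s." % tokenizer.mask_token
    input_ids = torch.tensor(tokenizer.encode(text, add_special_tokens=True)).unsqueeze(0)
    prediction_scores = model(input_ids)[0]  # (batch_size, sequence_length, vocab_size)

    mask_id = tokenizer.convert_tokens_to_ids(tokenizer.mask_token)
    masked_index = input_ids[0].tolist().index(mask_id)
    predicted_id = prediction_scores[0, masked_index].argmax().item()
    print(tokenizer.decode([predicted_id]))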


RobertaForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForSequenceClassification
    :members:
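
A minimal sketch, assuming the generic ``roberta-base`` checkpoint; its classification head is randomly initialised,
so the outputs are only meaningful after fine-tuning (or with an already fine-tuned checkpoint):

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaForSequenceClassification

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaForSequenceClassification.from_pretrained('roberta-base')

    input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True)).unsqueeze(0)
    labels = torch.tensor([1])  # one label per sequence in the batch
    outputs = model(input_ids, labels=labels)
    loss, logits = outputs[:2]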


RobertaForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.RobertaForTokenClassification
    :members:
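
The same caveat applies here: with ``roberta-base`` the token-classification head is randomly initialised, so this
sketch only illustrates the input and output shapes:

.. code-block:: python

    import torch
    from transformers import RobertaTokenizer, RobertaForTokenClassification

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = RobertaForTokenClassification.from_pretrained('roberta-base')

    input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True)).unsqueeze(0)
    scores = model(input_ids)[0]  # (batch_size, sequence_length, num_labels)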

TFRobertaModel
~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaModel
    :members:
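
A TensorFlow counterpart of the PyTorch sketch above, again using the ``roberta-base`` checkpoint:

.. code-block:: python

    import tensorflow as tf
    from transformers import RobertaTokenizer, TFRobertaModel

    tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
    model = TFRobertaModel.from_pretrained('roberta-base')

    input_ids = tf.constant(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True))[None, :]
    outputs = model(input_ids)
    last_hidden_states = outputs[0]  # shape (batch_size, sequence_length, hidden_size)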


TFRobertaForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForMaskedLM
    :members:


TFRobertaForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForSequenceClassification
    :members:


TFRobertaForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFRobertaForTokenClassification
    :members: