T5
-----------------------------------------------------------------------------------------------------------------------

**DISCLAIMER:** This model is still a work in progress. If you see something strange, file a `Github Issue
<https://github.com/huggingface/transformers/issues/new?assignees=&labels=&template=bug-report.md&title>`__.

Overview
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The T5 model was presented in `Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
<https://arxiv.org/pdf/1910.10683.pdf>`_ by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang,
Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.

The abstract from the paper is the following:

*Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream
task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning
has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of
transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a
text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer
approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration
with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering
summarization, question answering, text classification, and more. To facilitate future work on transfer learning for
NLP, we release our dataset, pre-trained models, and code.*

Tips:

- T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, each of
  which is converted into a text-to-text format. T5 works well on a variety of tasks out-of-the-box by prepending a
  different prefix to the input corresponding to each task, e.g., for translation: *translate English to German: ...*,
  for summarization: *summarize: ...*.

  For more information about which prefix to use, it is easiest to look into Appendix D of the `paper
  <https://arxiv.org/pdf/1910.10683.pdf>`__.
- For sequence-to-sequence generation, it is recommended to use :obj:`T5ForConditionalGeneration.generate()` (see the
  example after these tips). This method takes care of feeding the encoded input via cross-attention layers to the
  decoder and auto-regressively generates the decoder output.
- T5 uses relative scalar embeddings. Encoder input padding can be done on the left and on the right.
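
For example, translation with :obj:`T5ForConditionalGeneration.generate()` can look as follows (a minimal sketch; the
``t5-small`` checkpoint is used here, but any T5 checkpoint works the same way):

.. code-block:: python

  from transformers import T5ForConditionalGeneration, T5Tokenizer

  tokenizer = T5Tokenizer.from_pretrained('t5-small')
  model = T5ForConditionalGeneration.from_pretrained('t5-small')

  # the task prefix tells the model which text-to-text task to perform
  input_ids = tokenizer.encode('translate English to German: The house is wonderful. </s>', return_tensors='pt')
  outputs = model.generate(input_ids)
  print(tokenizer.decode(outputs[0]))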

The original code can be found `here <https://github.com/google-research/text-to-text-transfer-transformer>`__.

Training
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

T5 is an encoder-decoder model that converts all NLP problems into a text-to-text format. It is trained using teacher
forcing, which means that for training we always need an input sequence and a target sequence. The input sequence is
fed to the model using :obj:`input_ids`. The target sequence is shifted to the right, i.e., prepended by a
start-sequence token, and fed to the decoder using the :obj:`decoder_input_ids`. In teacher-forcing style, the target
sequence is then appended with the EOS token and corresponds to the :obj:`labels`. The PAD token is hereby used as the
start-sequence token. T5 can be trained / fine-tuned both in a supervised and an unsupervised fashion.
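
The shift to the right can be sketched as follows (a minimal illustration; when only :obj:`labels` are passed, the
model creates the shifted :obj:`decoder_input_ids` internally, so this is normally not done by hand):

.. code-block:: python

  import torch

  # toy token ids for a target sequence ending in the EOS token (id 1 for T5)
  labels = torch.tensor([[6536, 504, 24, 1]])
  pad_token_id = 0  # T5 uses the PAD token as the start-sequence token

  # decoder_input_ids = start token followed by the labels shifted right by one
  decoder_input_ids = torch.cat(
      [torch.full((labels.shape[0], 1), pad_token_id, dtype=torch.long), labels[:, :-1]],
      dim=-1,
  )
  # labels:            [[6536, 504, 24, 1]]
  # decoder_input_ids: [[0, 6536, 504, 24]]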

- Unsupervised denoising training

  In this setup, spans of the input sequence are masked by so-called sentinel tokens (*a.k.a.* unique mask tokens)
  and the output sequence is formed as a concatenation of the same sentinel tokens and the *real* masked tokens.
  Each sentinel token represents a unique mask token for this sentence and should start with :obj:`<extra_id_0>`,
  :obj:`<extra_id_1>`, ... up to :obj:`<extra_id_99>`. By default, 100 sentinel tokens are available in
  :class:`~transformers.T5Tokenizer`.

  For instance, the sentence "The cute dog walks in the park" with the masks put on "cute dog" and "the" should be
  processed as follows:

.. code-block:: python

  from transformers import T5ForConditionalGeneration, T5Tokenizer

  tokenizer = T5Tokenizer.from_pretrained('t5-small')
  model = T5ForConditionalGeneration.from_pretrained('t5-small')

  input_ids = tokenizer.encode('The <extra_id_0> walks in <extra_id_1> park', return_tensors='pt')
  labels = tokenizer.encode('<extra_id_0> cute dog <extra_id_1> the <extra_id_2> </s>', return_tensors='pt')
  # the forward function automatically creates the correct decoder_input_ids
  model(input_ids=input_ids, labels=labels)

- Supervised training

  In this setup, the input and output sequences form a standard sequence-to-sequence mapping. For translation, for
  instance, with the input sequence "The house is wonderful." and the output sequence "Das Haus ist wunderbar.", the
  sentences should be processed as follows:

.. code-block:: python

  from transformers import T5ForConditionalGeneration, T5Tokenizer

  tokenizer = T5Tokenizer.from_pretrained('t5-small')
  model = T5ForConditionalGeneration.from_pretrained('t5-small')

  input_ids = tokenizer.encode('translate English to German: The house is wonderful. </s>', return_tensors='pt')
  labels = tokenizer.encode('Das Haus ist wunderbar. </s>', return_tensors='pt')
  # the forward function automatically creates the correct decoder_input_ids
  model(input_ids=input_ids, labels=labels)
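
A single fine-tuning step can then be sketched as follows (a minimal sketch that continues the supervised example
above and assumes the default tuple return, where the loss is the first element when :obj:`labels` are passed):

.. code-block:: python

  from transformers import AdamW

  optimizer = AdamW(model.parameters(), lr=1e-4)  # hypothetical learning rate

  loss = model(input_ids=input_ids, labels=labels)[0]
  loss.backward()
  optimizer.step()
  optimizer.zero_grad()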


T5Config
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.T5Config
    :members:


T5Tokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.T5Tokenizer
    :members: build_inputs_with_special_tokens, get_special_tokens_mask,
        create_token_type_ids_from_sequences, prepare_seq2seq_batch, save_vocabulary


T5Model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.T5Model
    :members: forward


T5ForConditionalGeneration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.T5ForConditionalGeneration
    :members: forward


TFT5Model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFT5Model
    :members: call


TFT5ForConditionalGeneration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFT5ForConditionalGeneration
    :members: call