Longformer
-----------------------------------------------------------------------------------------------------------------------

**DISCLAIMER:** This model is still a work in progress. If you see something strange, file a `Github Issue
<https://github.com/huggingface/transformers/issues/new?assignees=&labels=&template=bug-report.md&title>`__.

Overview
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The Longformer model was presented in `Longformer: The Long-Document Transformer
<https://arxiv.org/pdf/2004.05150.pdf>`__ by Iz Beltagy, Matthew E. Peters, and Arman Cohan.

The abstract from the paper is the following:

*Transformer-based models are unable to process long sequences due to their self-attention operation, which scales
quadratically with the sequence length. To address this limitation, we introduce the Longformer with an attention
mechanism that scales linearly with sequence length, making it easy to process documents of thousands of tokens or
longer. Longformer's attention mechanism is a drop-in replacement for the standard self-attention and combines a local
windowed attention with a task motivated global attention. Following prior work on long-sequence transformers, we
evaluate Longformer on character-level language modeling and achieve state-of-the-art results on text8 and enwik8. In
contrast to most prior work, we also pretrain Longformer and finetune it on a variety of downstream tasks. Our
pretrained Longformer consistently outperforms RoBERTa on long document tasks and sets new state-of-the-art results on
WikiHop and TriviaQA.*

The authors' code can be found `here <https://github.com/allenai/longformer>`__.

Longformer Self Attention
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Longformer self attention employs self attention on both a "local" context and a "global" context. Most tokens only
attend "locally" to each other, meaning that each token attends to its :math:`\frac{1}{2} w` previous tokens and
:math:`\frac{1}{2} w` succeeding tokens, with :math:`w` being the window length as defined in
:obj:`config.attention_window`. Note that :obj:`config.attention_window` can be of type :obj:`List` to define a
different :math:`w` for each layer. A selected few tokens attend "globally" to all other tokens, as it is
conventionally done for all tokens in :obj:`BertSelfAttention`.
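
The window size can therefore be set once for the whole model or once per layer when the configuration is created.
A minimal sketch (the layer count and window sizes below are arbitrary example values):

.. code-block:: python

    from transformers import LongformerConfig

    # A single window size shared by every layer ...
    config = LongformerConfig(attention_window=512)

    # ... or one window size per hidden layer (list length must match num_hidden_layers).
    config = LongformerConfig(attention_window=[64, 64, 128, 128, 256, 256], num_hidden_layers=6)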

Note that "locally" and "globally" attending tokens are projected by different query, key and value matrices. Also note
that every "locally" attending token not only attends to tokens within its window :math:`w`, but also to all "globally"
attending tokens so that global attention is *symmetric*.

The user can define which tokens attend "locally" and which tokens attend "globally" by setting the tensor
:obj:`global_attention_mask` appropriately at run time. All Longformer models employ the following logic for
:obj:`global_attention_mask`:

- 0: the token attends "locally",
- 1: the token attends "globally".
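
For example, to give global attention only to the first token (as is typically done for classification-style tasks),
the mask can be built as follows. A minimal sketch, using the :obj:`allenai/longformer-base-4096` checkpoint released
by the authors:

.. code-block:: python

    import torch
    from transformers import LongformerModel, LongformerTokenizer

    tokenizer = LongformerTokenizer.from_pretrained('allenai/longformer-base-4096')
    model = LongformerModel.from_pretrained('allenai/longformer-base-4096')

    input_ids = tokenizer.encode('This is a rather long document.', return_tensors='pt')

    # 0 -> local attention, 1 -> global attention; here only the first token attends globally.
    global_attention_mask = torch.zeros_like(input_ids)
    global_attention_mask[:, 0] = 1

    outputs = model(input_ids, global_attention_mask=global_attention_mask)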

For more information, please also refer to the :meth:`~transformers.LongformerModel.forward` method.

Using Longformer self attention, the memory and time complexity of the query-key matmul operation, which usually
represents the memory and time bottleneck, can be reduced from :math:`\mathcal{O}(n_s \times n_s)` to
:math:`\mathcal{O}(n_s \times w)`, with :math:`n_s` being the sequence length and :math:`w` being the average window
size. It is assumed that the number of "globally" attending tokens is insignificant as compared to the number of
"locally" attending tokens.

For more information, please refer to the official `paper <https://arxiv.org/pdf/2004.05150.pdf>`__.


Training
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

:class:`~transformers.LongformerForMaskedLM` is trained in the exact same way as
:class:`~transformers.RobertaForMaskedLM` and should be used as follows:

.. code-block:: python

    from transformers import LongformerForMaskedLM, LongformerTokenizer

    tokenizer = LongformerTokenizer.from_pretrained('allenai/longformer-base-4096')
    model = LongformerForMaskedLM.from_pretrained('allenai/longformer-base-4096')

    # The Longformer tokenizer is RoBERTa-style, so its mask token is <mask>.
    input_ids = tokenizer.encode('This is a sentence from <mask> training data', return_tensors='pt')
    mlm_labels = tokenizer.encode('This is a sentence from the training data', return_tensors='pt')

    loss = model(input_ids, labels=mlm_labels)[0]


LongformerConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerConfig
    :members:


LongformerTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerTokenizer
    :members: 


LongformerTokenizerFast
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerTokenizerFast
    :members: 

Longformer specific outputs
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerBaseModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerBaseModelOutputWithPooling
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerMaskedLMOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerQuestionAnsweringModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerSequenceClassifierOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerMultipleChoiceModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_longformer.LongformerTokenClassifierOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerBaseModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerBaseModelOutputWithPooling
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerMaskedLMOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerQuestionAnsweringModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerSequenceClassifierOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerMultipleChoiceModelOutput
    :members: 

.. autoclass:: transformers.models.longformer.modeling_tf_longformer.TFLongformerTokenClassifierOutput
    :members: 

LongformerModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerModel
    :members: forward


LongformerForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerForMaskedLM
    :members: forward


LongformerForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerForSequenceClassification
    :members: forward


LongformerForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerForMultipleChoice
    :members: forward


LongformerForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerForTokenClassification
    :members: forward


LongformerForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.LongformerForQuestionAnswering
    :members: forward


TFLongformerModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerModel
    :members: call


TFLongformerForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerForMaskedLM
    :members: call


TFLongformerForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerForQuestionAnswering
    :members: call


TFLongformerForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerForSequenceClassification
    :members: call


TFLongformerForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerForTokenClassification
    :members: call


TFLongformerForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFLongformerForMultipleChoice
    :members: call