DistilBERT
----------------------------------------------------

Overview
~~~~~~~~~~~~~~~~~~~~~

The DistilBERT model was proposed in the blog post
`Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT <https://medium.com/huggingface/distilbert-8cf3380435b5>`__,
and the paper `DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter <https://arxiv.org/abs/1910.01108>`__.
DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer
parameters than `bert-base-uncased` and runs 60% faster, while preserving over 95% of BERT's performance as
measured on the GLUE language understanding benchmark.

The abstract from the paper is the following:

*As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP),
operating these large models in on-the-edge and/or under constrained computational training or inference budgets
remains challenging. In this work, we propose a method to pre-train a smaller general-purpose language representation
model, called DistilBERT, which can then be fine-tuned with good performances on a wide range of tasks like its larger
counterparts. While most prior work investigated the use of distillation for building task-specific models, we
leverage knowledge distillation during the pre-training phase and show that it is possible to reduce the size of a
BERT model by 40%, while retaining 97% of its language understanding capabilities and being 60% faster. To leverage
the inductive biases learned by larger models during pre-training, we introduce a triple loss combining language
modeling, distillation and cosine-distance losses. Our smaller, faster and lighter model is cheaper to pre-train
and we demonstrate its capabilities for on-device computations in a proof-of-concept experiment and a comparative
on-device study.*
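
The triple loss mentioned in the abstract combines a soft-target distillation term, a standard masked language
modeling term, and a cosine-distance term aligning the student's and teacher's hidden states. Below is an
illustrative PyTorch sketch of how such a combined loss can be written; the `temperature` and `alpha_*` weights are
placeholder assumptions for this example, not the values from the authors' training scripts (those live in the
distillation example folder linked below).

.. code-block:: python

    import torch.nn.functional as F


    def triple_loss(student_logits, teacher_logits, labels,
                    student_hidden, teacher_hidden,
                    temperature=2.0, alpha_ce=1.0, alpha_mlm=1.0, alpha_cos=1.0):
        """Schematic DistilBERT-style triple loss (illustrative only)."""
        # Distillation term: match the student's softened distribution to the
        # teacher's; the temperature**2 factor keeps gradient magnitudes stable.
        loss_ce = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2

        # Masked language modeling term: cross-entropy against the true token
        # ids (positions labeled -100 are ignored, as is conventional).
        loss_mlm = F.cross_entropy(
            student_logits.view(-1, student_logits.size(-1)),
            labels.view(-1),
            ignore_index=-100,
        )

        # Cosine-distance term: push the student's last hidden states toward
        # the teacher's (a target of 1 means "maximize cosine similarity").
        flat_student = student_hidden.view(-1, student_hidden.size(-1))
        flat_teacher = teacher_hidden.view(-1, teacher_hidden.size(-1))
        target = flat_student.new_ones(flat_student.size(0))
        loss_cos = F.cosine_embedding_loss(flat_student, flat_teacher, target)

        return alpha_ce * loss_ce + alpha_mlm * loss_mlm + alpha_cos * loss_cos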

Tips:

- DistilBERT doesn't have `token_type_ids`, so you don't need to indicate which token belongs to which segment. Just separate your segments with the separation token `tokenizer.sep_token` (or `[SEP]`), as in the sketch below.
- DistilBERT doesn't have options to select the input positions (`position_ids` input). This could be added if necessary though; just let us know if you need this option.
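
A minimal sketch of the first tip, assuming the `distilbert-base-uncased` checkpoint (any DistilBERT checkpoint
behaves the same way):

.. code-block:: python

    from transformers import DistilBertTokenizer

    tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")

    # Encoding a pair of segments: the tokenizer joins them with the
    # separation token and produces no token_type_ids entry.
    encoding = tokenizer.encode_plus("How old are you?", "I'm six years old.")
    print("token_type_ids" in encoding)  # False
    print(tokenizer.decode(encoding["input_ids"]))
    # [CLS] how old are you? [SEP] i'm six years old. [SEP]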

The original code can be found `here <https://github.com/huggingface/transformers/tree/master/examples/distillation>`_.


DistilBertConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertConfig
    :members:


DistilBertTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertTokenizer
    :members:


DistilBertTokenizerFast
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertTokenizerFast
    :members:


DistilBertModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertModel
    :members:
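
A minimal usage sketch, assuming the `distilbert-base-uncased` checkpoint:

.. code-block:: python

    from transformers import DistilBertModel, DistilBertTokenizer

    tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
    model = DistilBertModel.from_pretrained("distilbert-base-uncased")

    # No token_type_ids are built or passed, per the tips above.
    inputs = tokenizer.encode_plus("Hello, my dog is cute", return_tensors="pt")
    outputs = model(**inputs)

    # The first element is the last hidden state, with shape
    # (batch_size, sequence_length, 768) for the distilbert-base checkpoints.
    last_hidden_state = outputs[0]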


DistilBertForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForMaskedLM
    :members:


DistilBertForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForSequenceClassification
    :members:


DistilBertForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForMultipleChoice
    :members:


DistilBertForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForTokenClassification
    :members:


DistilBertForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForQuestionAnswering
    :members:


TFDistilBertModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertModel
    :members:


TFDistilBertForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForMaskedLM
    :members:


TFDistilBertForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForSequenceClassification
    :members:


TFDistilBertForMultipleChoice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForMultipleChoice
    :members:



TFDistilBertForTokenClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForTokenClassification
    :members:


TFDistilBertForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForQuestionAnswering
    :members: