DistilBERT
----------------------------------------------------

The DistilBERT model was proposed in the blog post
`Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT <https://medium.com/huggingface/distilbert-8cf3380435b5>`__,
and the paper `DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter <https://arxiv.org/abs/1910.01108>`__.
DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer
parameters than `bert-base-uncased` and runs 60% faster, while preserving over 95% of BERT's performance as measured
on the GLUE language understanding benchmark.
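
The size reduction is easy to verify by counting parameters. Below is a minimal sketch, assuming the
`bert-base-uncased` and `distilbert-base-uncased` checkpoints can be downloaded:

.. code-block:: python

    from transformers import BertModel, DistilBertModel

    # Load the teacher (BERT base) and the distilled student.
    bert = BertModel.from_pretrained('bert-base-uncased')
    distilbert = DistilBertModel.from_pretrained('distilbert-base-uncased')

    # Compare total parameter counts: ~110M for BERT base vs. ~66M for
    # DistilBERT, i.e. roughly 40% fewer parameters.
    bert_params = sum(p.numel() for p in bert.parameters())
    distilbert_params = sum(p.numel() for p in distilbert.parameters())
    print(f"BERT:       {bert_params:,} parameters")
    print(f"DistilBERT: {distilbert_params:,} parameters")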

The abstract from the paper is the following:

*As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP),
operating these large models in on-the-edge and/or under constrained computational training or inference budgets
remains challenging. In this work, we propose a method to pre-train a smaller general-purpose language representation
model, called DistilBERT, which can then be fine-tuned with good performances on a wide range of tasks like its larger
counterparts. While most prior work investigated the use of distillation for building task-specific models, we
leverage knowledge distillation during the pre-training phase and show that it is possible to reduce the size of a
BERT model by 40%, while retaining 97% of its language understanding capabilities and being 60% faster. To leverage
the inductive biases learned by larger models during pre-training, we introduce a triple loss combining language
modeling, distillation and cosine-distance losses. Our smaller, faster and lighter model is cheaper to pre-train
and we demonstrate its capabilities for on-device computations in a proof-of-concept experiment and a comparative
on-device study.*

Tips:

- DistilBERT doesn't have `token_type_ids`; you don't need to indicate which token belongs to which segment. Just separate your segments with the separation token `tokenizer.sep_token` (or `[SEP]`), as shown in the sketch after this list.
- DistilBERT doesn't have options to select the input positions (no `position_ids` input). This could be added if necessary; just let us know if you need this option.
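
A minimal sketch of encoding a segment pair this way, assuming the `distilbert-base-uncased` checkpoint:

.. code-block:: python

    from transformers import DistilBertTokenizer

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')

    # No token_type_ids: just join the two segments with the separation token.
    text = "Who was Jim Henson? " + tokenizer.sep_token + " Jim Henson was a puppeteer."
    input_ids = tokenizer.encode(text)
    print(tokenizer.decode(input_ids))
    # [CLS] who was jim henson? [SEP] jim henson was a puppeteer. [SEP]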

The original code can be found `here <https://github.com/huggingface/transformers/tree/master/examples/distillation>`_.

DistilBertConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertConfig
    :members:


DistilBertTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertTokenizer
    :members:


DistilBertModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertModel
    :members:
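
A minimal usage sketch for extracting the final hidden states, assuming the `distilbert-base-uncased` checkpoint:

.. code-block:: python

    import torch
    from transformers import DistilBertTokenizer, DistilBertModel

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertModel.from_pretrained('distilbert-base-uncased')

    input_ids = torch.tensor([tokenizer.encode("Hello, my dog is cute")])
    outputs = model(input_ids)
    # The first element is the sequence of hidden states at the output of the
    # last layer, with shape (batch_size, sequence_length, hidden_size).
    last_hidden_states = outputs[0]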


DistilBertForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForMaskedLM
    :members:
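
A minimal sketch of predicting a masked token with the pre-trained language-modeling head, assuming the
`distilbert-base-uncased` checkpoint:

.. code-block:: python

    import torch
    from transformers import DistilBertTokenizer, DistilBertForMaskedLM

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertForMaskedLM.from_pretrained('distilbert-base-uncased')

    text = "The capital of France is " + tokenizer.mask_token + "."
    input_ids = torch.tensor([tokenizer.encode(text)])
    # Without labels, the first output is the prediction scores over the
    # vocabulary, with shape (batch_size, sequence_length, vocab_size).
    prediction_scores = model(input_ids)[0]

    # Pick the highest-scoring token at the masked position.
    mask_index = input_ids[0].tolist().index(tokenizer.mask_token_id)
    predicted_id = prediction_scores[0, mask_index].argmax().item()
    print(tokenizer.decode([predicted_id]))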


DistilBertForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForSequenceClassification
    :members:
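
A minimal sketch, assuming the `distilbert-base-uncased` checkpoint. Note that the classification head on top is
newly initialized, so the model needs to be fine-tuned on a labelled dataset before its predictions are meaningful:

.. code-block:: python

    import torch
    from transformers import DistilBertTokenizer, DistilBertForSequenceClassification

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertForSequenceClassification.from_pretrained('distilbert-base-uncased')

    input_ids = torch.tensor([tokenizer.encode("Hello, my dog is cute")])
    labels = torch.tensor([1])
    # With labels provided, the first two outputs are the classification loss
    # and the logits.
    loss, logits = model(input_ids, labels=labels)[:2]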


DistilBertForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForQuestionAnswering
    :members:
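
A minimal sketch of extracting an answer span, assuming the `distilbert-base-uncased` checkpoint. As with sequence
classification, the span-prediction head is newly initialized and needs fine-tuning (e.g. on SQuAD) before the
predicted span is meaningful:

.. code-block:: python

    import torch
    from transformers import DistilBertTokenizer, DistilBertForQuestionAnswering

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertForQuestionAnswering.from_pretrained('distilbert-base-uncased')

    question, context = "Who was Jim Henson?", "Jim Henson was a puppeteer."
    input_ids = torch.tensor([tokenizer.encode(question, context)])
    # Without labels, the first two outputs are the start and end scores for
    # each token.
    start_scores, end_scores = model(input_ids)[:2]

    # The predicted answer is the span from argmax(start) to argmax(end), inclusive.
    answer_ids = input_ids[0, start_scores.argmax():end_scores.argmax() + 1]
    print(tokenizer.decode(answer_ids.tolist()))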

TFDistilBertModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertModel
    :members:
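
The TensorFlow classes mirror their PyTorch counterparts. A minimal sketch, assuming the `distilbert-base-uncased`
checkpoint:

.. code-block:: python

    import tensorflow as tf
    from transformers import DistilBertTokenizer, TFDistilBertModel

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = TFDistilBertModel.from_pretrained('distilbert-base-uncased')

    input_ids = tf.constant([tokenizer.encode("Hello, my dog is cute")])
    outputs = model(input_ids)
    # The first element is the last layer's hidden states, with shape
    # (batch_size, sequence_length, hidden_size).
    last_hidden_states = outputs[0]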


TFDistilBertForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForMaskedLM
    :members:


TFDistilBertForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForSequenceClassification
    :members:


TFDistilBertForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFDistilBertForQuestionAnswering
    :members: