OpenAI GPT2
----------------------------------------------------

Overview
~~~~~~~~~~~~~~~~~~~~~

The OpenAI GPT-2 model was proposed in
`Language Models are Unsupervised Multitask Learners <https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf>`_
by Alec Radford*, Jeffrey Wu*, Rewon Child, David Luan, Dario Amodei** and Ilya Sutskever**.
It's a causal (unidirectional) transformer pre-trained using language modeling on a very large
corpus of ~40 GB of text data.

The abstract from the paper is the following:

*GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset
of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous
words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring
demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X
the parameters and trained on more than 10X the amount of data.*

Tips:

- GPT-2 is a model with absolute position embeddings, so it's usually advised to pad the inputs on
  the right rather than the left.
- GPT-2 was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next
  token in a sequence. Leveraging this feature allows GPT-2 to generate syntactically coherent text, as
  can be observed in the ``run_generation.py`` example script.
- The PyTorch models can take ``past`` as input: the previously computed key/value attention pairs. Passing
  this ``past`` value prevents the model from re-computing values it has already computed in the context of text
  generation; a minimal sketch is given after this list. See
  `reusing the past in generative models <../quickstart.html#using-the-past>`_ for more information on the usage
  of this argument.

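The snippet below is a minimal sketch of the last two tips: greedy generation with ``GPT2LMHeadModel``, feeding
``past`` back into the model so that only the newly generated token has to be processed at each step. It assumes the
publicly available ``gpt2`` checkpoint and the tuple outputs returned by the PyTorch models; the prompt and the
generation length are arbitrary.

.. code-block:: python

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    generated = tokenizer.encode("The Manhattan bridge")  # token ids of the prompt
    context = torch.tensor([generated])                   # shape (1, prompt_length)
    past = None

    with torch.no_grad():
        for _ in range(20):
            # When `past` is provided, only the tokens that are not yet cached
            # (here: the single last token) have to be fed to the model.
            logits, past = model(context, past=past)
            token = torch.argmax(logits[..., -1, :])
            generated.append(token.item())
            context = token.view(1, 1)                     # next step only sees the new token

    print(tokenizer.decode(generated))

Without ``past``, the full sequence would have to be re-encoded at every step, which is noticeably slower for long
generations.
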
`Write With Transformer <https://transformer.huggingface.co/doc/gpt2-large>`__ is a webapp created and hosted by
Hugging Face showcasing the generative capabilities of several models. GPT-2 is one of them and is available in five
different sizes: small, medium, large, xl and a distilled version of the small checkpoint: distilgpt-2.

The original code can be found `here <https://openai.com/blog/better-language-models/>`_.


GPT2Config
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2Config
    :members:
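
For illustration, a configuration object can be used to build a randomly initialised model; the hyper-parameter
values below are arbitrary and much smaller than the released checkpoints:

.. code-block:: python

    from transformers import GPT2Config, GPT2Model

    # Arbitrary, smaller-than-default hyper-parameters (illustration only)
    config = GPT2Config(n_embd=256, n_layer=6, n_head=8)
    model = GPT2Model(config)  # randomly initialised weights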


GPT2Tokenizer
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2Tokenizer
    :members: save_vocabulary
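
A short usage sketch of the byte-level BPE tokenizer, assuming the ``gpt2`` vocabulary:

.. code-block:: python

    from transformers import GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

    tokens = tokenizer.tokenize("Hello, world!")                 # byte-level BPE tokens
    ids = tokenizer.convert_tokens_to_ids(tokens)                # vocabulary indices
    text = tokenizer.decode(tokenizer.encode("Hello, world!"))   # round-trip back to text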


GPT2TokenizerFast
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2TokenizerFast
    :members:


GPT2Model
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2Model
    :members:
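
A minimal sketch of extracting the hidden states of the bare model, assuming the ``gpt2`` checkpoint and the tuple
outputs of the PyTorch models:

.. code-block:: python

    import torch
    from transformers import GPT2Tokenizer, GPT2Model

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2Model.from_pretrained("gpt2")

    input_ids = torch.tensor([tokenizer.encode("Hello, my dog is cute")])  # batch size 1
    outputs = model(input_ids)
    last_hidden_states = outputs[0]  # (batch_size, sequence_length, hidden_size)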


GPT2LMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2LMHeadModel
    :members:
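
A minimal sketch of computing the language modeling loss by passing the inputs as ``labels`` (the model shifts them
internally), assuming the ``gpt2`` checkpoint and tuple outputs:

.. code-block:: python

    import torch
    from transformers import GPT2Tokenizer, GPT2LMHeadModel

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    input_ids = torch.tensor([tokenizer.encode("Hello, my dog is cute")])  # batch size 1
    outputs = model(input_ids, labels=input_ids)  # labels are shifted inside the model
    loss, logits = outputs[:2]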


GPT2DoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.GPT2DoubleHeadsModel
    :members:
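
A minimal sketch of the multiple-choice head, assuming the ``gpt2`` checkpoint; the added ``[CLS]`` token is new, so
its embedding is randomly initialised and would need to be trained before the scores are meaningful:

.. code-block:: python

    import torch
    from transformers import GPT2Tokenizer, GPT2DoubleHeadsModel

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2DoubleHeadsModel.from_pretrained("gpt2")

    # Add a [CLS] token whose hidden state feeds the multiple-choice head
    tokenizer.add_special_tokens({"cls_token": "[CLS]"})
    model.resize_token_embeddings(len(tokenizer))

    choices = ["Hello, my dog is cute [CLS]", "Hello, my cat is cute [CLS]"]
    encoded_choices = [tokenizer.encode(choice) for choice in choices]
    cls_positions = [tokens.index(tokenizer.cls_token_id) for tokens in encoded_choices]

    input_ids = torch.tensor(encoded_choices).unsqueeze(0)  # (batch size 1, 2 choices, seq length)
    mc_token_ids = torch.tensor([cls_positions])            # position of [CLS] in each choice

    outputs = model(input_ids, mc_token_ids=mc_token_ids)
    lm_logits, mc_logits = outputs[:2]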


TFGPT2Model
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFGPT2Model
    :members:
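
The TensorFlow version can be used in much the same way as the PyTorch one; a minimal sketch, assuming the ``gpt2``
checkpoint:

.. code-block:: python

    import tensorflow as tf
    from transformers import GPT2Tokenizer, TFGPT2Model

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = TFGPT2Model.from_pretrained("gpt2")

    input_ids = tf.constant([tokenizer.encode("Hello, my dog is cute")])  # batch size 1
    outputs = model(input_ids)
    last_hidden_states = outputs[0]  # (batch_size, sequence_length, hidden_size)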


TFGPT2LMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFGPT2LMHeadModel
    :members:
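
A minimal sketch of obtaining next-token logits from the TensorFlow language modeling head, assuming the ``gpt2``
checkpoint:

.. code-block:: python

    import tensorflow as tf
    from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = TFGPT2LMHeadModel.from_pretrained("gpt2")

    input_ids = tf.constant([tokenizer.encode("Hello, my dog is cute")])  # batch size 1
    outputs = model(input_ids)
    logits = outputs[0]  # (batch_size, sequence_length, vocab_size)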


TFGPT2DoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFGPT2DoubleHeadsModel
    :members: