OpenAI GPT
----------------------------------------------------

Overview
~~~~~~~~~~~~~~~~~~~~~

The OpenAI GPT model was proposed in `Improving Language Understanding by Generative Pre-Training <https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf>`__
by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever. It's a causal (unidirectional)
transformer pre-trained using language modeling on a large corpus with long range dependencies, the Toronto Book Corpus.

The abstract from the paper is the following:

*Natural language understanding comprises a wide range of diverse tasks such
as textual entailment, question answering, semantic similarity assessment, and
document classification. Although large unlabeled text corpora are abundant,
labeled data for learning these specific tasks is scarce, making it challenging for
discriminatively trained models to perform adequately. We demonstrate that large
gains on these tasks can be realized by generative pre-training of a language model
on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each
specific task. In contrast to previous approaches, we make use of task-aware input
transformations during fine-tuning to achieve effective transfer while requiring
minimal changes to the model architecture. We demonstrate the effectiveness of
our approach on a wide range of benchmarks for natural language understanding.
Our general task-agnostic model outperforms discriminatively trained models that
use architectures specifically crafted for each task, significantly improving upon the
state of the art in 9 out of the 12 tasks studied.*

Tips:

- GPT is a model with absolute position embeddings, so it's usually advised to pad the inputs on
  the right rather than the left.
- GPT was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next
  token in a sequence. Leveraging this feature allows GPT to generate syntactically coherent text, as
  can be observed in the ``run_generation.py`` example script and in the short sketch after this list.
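
A minimal sketch of that generation capability (the ``openai-gpt`` checkpoint name, the prompt and the
sampling settings below are illustrative choices, not the only valid ones)::

    from transformers import OpenAIGPTLMHeadModel, OpenAIGPTTokenizer

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
    model = OpenAIGPTLMHeadModel.from_pretrained("openai-gpt")

    # encode a prompt and let the causal LM head sample a short continuation
    input_ids = tokenizer.encode("The weather today is", return_tensors="pt")
    output_ids = model.generate(input_ids, max_length=20, do_sample=True, top_k=50)
    print(tokenizer.decode(output_ids[0]))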

`Write With Transformer <https://transformer.huggingface.co/doc/gpt>`__ is a webapp created and hosted by
Hugging Face showcasing the generative capabilities of several models. GPT is one of them.

The original code can be found `here <https://github.com/openai/finetune-transformer-lm>`_.

Note:

If you want to reproduce the original tokenization process of the `OpenAI GPT` paper, you will need to install 
``ftfy`` and ``SpaCy``::

    pip install spacy ftfy==4.4.3
    python -m spacy download en

If you don't install ``ftfy`` and ``SpaCy``, the :class:`transformers.OpenAIGPTTokenizer` will default to tokenizing
using BERT's :obj:`BasicTokenizer` followed by Byte-Pair Encoding (which should be fine for most usage, don't
worry).
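
In either case the tokenizer is used the same way; a small sketch (the example sentence is arbitrary)::

    from transformers import OpenAIGPTTokenizer

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")

    # lowercasing + BPE, with ftfy/SpaCy text cleaning when available,
    # otherwise the BasicTokenizer fallback described above
    tokens = tokenizer.tokenize("Hello, how are you?")
    input_ids = tokenizer.convert_tokens_to_ids(tokens)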

OpenAIGPTConfig
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTConfig
    :members:
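
A minimal sketch of building a model from a freshly created configuration (default hyper-parameters,
randomly initialised weights)::

    from transformers import OpenAIGPTConfig, OpenAIGPTModel

    # configuration with the default openai-gpt hyper-parameters
    configuration = OpenAIGPTConfig()

    # model instantiated from that configuration (random weights, no pre-training)
    model = OpenAIGPTModel(configuration)

    # the configuration is accessible again from the model
    configuration = model.config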


OpenAIGPTTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTTokenizer
    :members: save_vocabulary


OpenAIGPTTokenizerFast
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTTokenizerFast
    :members:


OpenAI specific outputs
~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.modeling_openai.OpenAIGPTDoubleHeadsModelOutput
    :members:

.. autoclass:: transformers.modeling_tf_openai.TFOpenAIGPTDoubleHeadsModelOutput
    :members:


OpenAIGPTModel
~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTModel
    :members:
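
A minimal usage sketch that extracts the final hidden states for a short input (the example sentence
is arbitrary)::

    from transformers import OpenAIGPTTokenizer, OpenAIGPTModel

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
    model = OpenAIGPTModel.from_pretrained("openai-gpt")

    inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
    outputs = model(**inputs, return_dict=True)

    # (batch_size, sequence_length, hidden_size)
    last_hidden_states = outputs.last_hidden_state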


OpenAIGPTLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTLMHeadModel
    :members:
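
A minimal sketch of computing the causal language modeling loss by passing the inputs as labels (the
example sentence is arbitrary)::

    from transformers import OpenAIGPTTokenizer, OpenAIGPTLMHeadModel

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
    model = OpenAIGPTLMHeadModel.from_pretrained("openai-gpt")

    inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
    # labels identical to input_ids: the model shifts them internally for next-token prediction
    outputs = model(**inputs, labels=inputs["input_ids"], return_dict=True)

    loss = outputs.loss
    logits = outputs.logits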


OpenAIGPTDoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTDoubleHeadsModel
    :members:
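
A rough sketch of the multiple-choice setup, where a classification token is appended to each choice
and its position is passed through ``mc_token_ids`` (the choices and the ``[CLS]`` token are illustrative)::

    import torch
    from transformers import OpenAIGPTTokenizer, OpenAIGPTDoubleHeadsModel

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
    model = OpenAIGPTDoubleHeadsModel.from_pretrained("openai-gpt")

    # add a token used by the multiple-choice head and resize the embeddings accordingly
    tokenizer.add_special_tokens({"cls_token": "[CLS]"})
    model.resize_token_embeddings(len(tokenizer))

    choices = ["Hello, my dog is cute [CLS]", "Hello, my cat is cute [CLS]"]
    input_ids = torch.tensor([tokenizer.encode(c) for c in choices]).unsqueeze(0)  # batch size 1
    mc_token_ids = torch.tensor([input_ids.size(-1) - 1] * 2).unsqueeze(0)  # position of [CLS] in each choice

    outputs = model(input_ids, mc_token_ids=mc_token_ids, return_dict=True)
    lm_logits = outputs.logits
    mc_logits = outputs.mc_logits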


TFOpenAIGPTModel
~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTModel
    :members:


TFOpenAIGPTLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTLMHeadModel
    :members:
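
A minimal TensorFlow sketch mirroring the PyTorch usage above (the example sentence is arbitrary)::

    from transformers import OpenAIGPTTokenizer, TFOpenAIGPTLMHeadModel

    tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
    model = TFOpenAIGPTLMHeadModel.from_pretrained("openai-gpt")

    inputs = tokenizer("Hello, my dog is cute", return_tensors="tf")
    outputs = model(inputs)

    # next-token prediction scores for each position
    logits = outputs[0]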


TFOpenAIGPTDoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTDoubleHeadsModel
    :members: