OpenAI GPT
-----------------------------------------------------------------------------------------------------------------------

Overview
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The OpenAI GPT model was proposed in `Improving Language Understanding by Generative Pre-Training
<https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf>`__
by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever. It's a causal (unidirectional)
transformer pre-trained using language modeling on a large corpus with long range dependencies, the Toronto Book
Corpus.

The abstract from the paper is the following:

*Natural language understanding comprises a wide range of diverse tasks such
as textual entailment, question answering, semantic similarity assessment, and
document classification. Although large unlabeled text corpora are abundant,
labeled data for learning these specific tasks is scarce, making it challenging for
discriminatively trained models to perform adequately. We demonstrate that large
gains on these tasks can be realized by generative pre-training of a language model
on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each
specific task. In contrast to previous approaches, we make use of task-aware input
transformations during fine-tuning to achieve effective transfer while requiring
minimal changes to the model architecture. We demonstrate the effectiveness of
our approach on a wide range of benchmarks for natural language understanding.
Our general task-agnostic model outperforms discriminatively trained models that
use architectures specifically crafted for each task, significantly improving upon the
state of the art in 9 out of the 12 tasks studied.*

Tips:

- GPT is a model with absolute position embeddings so it's usually advised to pad the inputs on
  the right rather than the left.
- GPT was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next
  token in a sequence. Leveraging this feature allows GPT to generate syntactically coherent text, as
  can be observed in the ``run_generation.py`` example script.
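To make the causal pattern concrete, here is a minimal sketch in plain Python (illustrative only, not the
transformers implementation) of the unidirectional attention mask that the CLM objective implies: position ``i``
may attend to itself and to earlier positions, never to later ones.

```python
# Illustrative sketch, not the transformers implementation: the causal
# (unidirectional) attention mask used by GPT-style models. Row i marks
# which positions token i may attend to: itself and earlier tokens only.
def causal_mask(seq_len):
    return [[1 if j <= i else 0 for j in range(seq_len)] for i in range(seq_len)]

for row in causal_mask(4):
    print(row)
# [1, 0, 0, 0]
# [1, 1, 0, 0]
# [1, 1, 1, 0]
# [1, 1, 1, 1]
```

At training time the model predicts each token from the unmasked positions on its left, which is why it can be
used directly for left-to-right generation.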

`Write With Transformer <https://transformer.huggingface.co/doc/gpt>`__ is a webapp created and hosted by
Hugging Face showcasing the generative capabilities of several models. GPT is one of them.

The original code can be found `here <https://github.com/openai/finetune-transformer-lm>`__.

Note:

If you want to reproduce the original tokenization process of the `OpenAI GPT` paper, you will need to install
``ftfy`` and ``SpaCy``:

.. code-block:: bash

    pip install spacy ftfy==4.4.3
    python -m spacy download en

If you don't install ``ftfy`` and ``SpaCy``, the :class:`~transformers.OpenAIGPTTokenizer` will default to tokenizing
using BERT's :obj:`BasicTokenizer` followed by Byte-Pair Encoding (which should be fine for most usage, don't
worry).
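For intuition about the Byte-Pair Encoding step, here is a minimal sketch of a single BPE merge in plain Python.
It is illustrative only — ``most_frequent_pair`` and ``merge_pair`` are hypothetical helper names for this sketch,
not part of the transformers API.

```python
# Illustrative sketch of one Byte-Pair Encoding merge step (the scheme the
# GPT tokenizer applies after basic tokenization); not the transformers code.
from collections import Counter

def most_frequent_pair(symbols):
    # Count adjacent symbol pairs and return the most frequent one.
    pairs = Counter(zip(symbols, symbols[1:]))
    return max(pairs, key=pairs.get)

def merge_pair(symbols, pair):
    # Merge every occurrence of `pair` into a single symbol.
    merged, i = [], 0
    while i < len(symbols):
        if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
            merged.append(symbols[i] + symbols[i + 1])
            i += 2
        else:
            merged.append(symbols[i])
            i += 1
    return merged

symbols = list("lowlowlower")
best = most_frequent_pair(symbols)
symbols = merge_pair(symbols, best)
```

Repeating this merge step with a learned, ordered list of merges is, at a high level, how BPE builds its subword
vocabulary.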

OpenAIGPTConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTConfig
    :members:


OpenAIGPTTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTTokenizer
    :members: save_vocabulary


OpenAIGPTTokenizerFast
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTTokenizerFast
    :members:


OpenAI specific outputs
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.modeling_openai.OpenAIGPTDoubleHeadsModelOutput
    :members:

.. autoclass:: transformers.modeling_tf_openai.TFOpenAIGPTDoubleHeadsModelOutput
    :members:


OpenAIGPTModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTModel
    :members: forward


OpenAIGPTLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTLMHeadModel
    :members: forward


OpenAIGPTDoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTDoubleHeadsModel
    :members: forward


OpenAIGPTForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.OpenAIGPTForSequenceClassification
    :members: forward


TFOpenAIGPTModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTModel
    :members: call


TFOpenAIGPTLMHeadModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTLMHeadModel
    :members: call


TFOpenAIGPTDoubleHeadsModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.TFOpenAIGPTDoubleHeadsModel
    :members: call