chenpangpang / transformers · Commits · 387217bd
Commit 387217bd, authored Jan 13, 2020 by Lysandre; committed by Lysandre Debut, Jan 14, 2020

Added example usage
parent 7d1bb7f2

Showing 10 changed files with 157 additions and 4 deletions (+157 -4)
docs/source/model_doc/xlnet.rst (+2 -2)
src/transformers/configuration_bert.py (+17 -0)
src/transformers/configuration_camembert.py (+17 -0)
src/transformers/configuration_ctrl.py (+17 -0)
src/transformers/configuration_distilbert.py (+17 -0)
src/transformers/configuration_gpt2.py (+17 -0)
src/transformers/configuration_openai.py (+17 -0)
src/transformers/configuration_transfo_xl.py (+17 -0)
src/transformers/configuration_xlm.py (+18 -1)
src/transformers/configuration_xlnet.py (+18 -1)
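As a quick sanity check, the per-file counts listed above should sum to the diffstat totals (+157 -4). A short script, with the file list transcribed from this page, confirms:

```python
# Per-file (additions, deletions) transcribed from the commit's file list above.
changes = {
    "docs/source/model_doc/xlnet.rst": (2, 2),
    "src/transformers/configuration_bert.py": (17, 0),
    "src/transformers/configuration_camembert.py": (17, 0),
    "src/transformers/configuration_ctrl.py": (17, 0),
    "src/transformers/configuration_distilbert.py": (17, 0),
    "src/transformers/configuration_gpt2.py": (17, 0),
    "src/transformers/configuration_openai.py": (17, 0),
    "src/transformers/configuration_transfo_xl.py": (17, 0),
    "src/transformers/configuration_xlm.py": (18, 1),
    "src/transformers/configuration_xlnet.py": (18, 1),
}

# Totals across all files in the commit.
added = sum(a for a, _ in changes.values())
removed = sum(d for _, d in changes.values())
print(len(changes), added, removed)  # 10 157 4
```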
docs/source/model_doc/xlnet.rst
@@ -39,14 +39,14 @@ XLNet
 ``XLNetForTokenClassification``
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-.. autoclass:: transformers.XLNetForTokenClassification`
+.. autoclass:: transformers.XLNetForTokenClassification
     :members:
 ``XLNetForMultipleChoice``
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-.. autoclass:: transformers.XLNetForMultipleChoice`
+.. autoclass:: transformers.XLNetForMultipleChoice
     :members:
src/transformers/configuration_bert.py
@@ -88,6 +88,23 @@ class BertConfig(PretrainedConfig):
             The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
         layer_norm_eps (:obj:`float`, optional, defaults to 1e-12):
             The epsilon used by the layer normalization layers.
+
+    Example::
+
+        from transformers import BertModel, BertConfig
+
+        # Initializing a BERT bert-base-uncased style configuration
+        configuration = BertConfig()
+
+        # Initializing a model from the bert-base-uncased style configuration
+        model = BertModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = BERT_PRETRAINED_CONFIG_ARCHIVE_MAP
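The example added to each docstring follows the same three-step pattern: build a config object, pass it to the model constructor, and read it back from `model.config`. A minimal sketch with hypothetical toy classes (not the real transformers API) illustrates that contract without requiring the library:

```python
# Toy stand-ins modeling only the config/model pattern from the added
# docstring examples, not the real transformers implementation.
class ToyConfig:
    def __init__(self, hidden_size=768, num_hidden_layers=12):
        # Defaults chosen to mirror the bert-base-uncased style configuration.
        self.hidden_size = hidden_size
        self.num_hidden_layers = num_hidden_layers

class ToyModel:
    def __init__(self, config):
        # The model keeps a reference to its configuration object.
        self.config = config

# Initializing a configuration, then a model from that configuration
configuration = ToyConfig()
model = ToyModel(configuration)

# Accessing the model configuration yields the same object back
assert model.config is configuration
```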
src/transformers/configuration_camembert.py
@@ -41,5 +41,22 @@ class CamembertConfig(RobertaConfig):
     The :class:`~transformers.CamembertConfig` class directly inherits :class:`~transformers.BertConfig`.
     It reuses the same defaults. Please check the parent class for more information.
+
+    Example::
+
+        from transformers import CamembertModel, CamembertConfig
+
+        # Initializing a CamemBERT configuration
+        configuration = CamembertConfig()
+
+        # Initializing a model from the configuration
+        model = CamembertModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = CAMEMBERT_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_ctrl.py
@@ -63,6 +63,23 @@ class CTRLConfig(PretrainedConfig):
             The epsilon to use in the layer normalization layers
         initializer_range (:obj:`float`, optional, defaults to 0.02):
             The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
+
+    Example::
+
+        from transformers import CTRLModel, CTRLConfig
+
+        # Initializing a CTRL configuration
+        configuration = CTRLConfig()
+
+        # Initializing a model from the configuration
+        model = CTRLModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = CTRL_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_distilbert.py
@@ -74,6 +74,23 @@ class DistilBertConfig(PretrainedConfig):
         seq_classif_dropout (:obj:`float`, optional, defaults to 0.2):
             The dropout probabilities used in the sequence classification model
             :class:`~tranformers.DistilBertForSequenceClassification`.
+
+    Example::
+
+        from transformers import DistilBertModel, DistilBertConfig
+
+        # Initializing a DistilBERT configuration
+        configuration = DistilBertConfig()
+
+        # Initializing a model from the configuration
+        model = DistilBertModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = DISTILBERT_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_gpt2.py
@@ -94,6 +94,23 @@ class GPT2Config(PretrainedConfig):
             Argument used when doing sequence summary. Used in for the multiple choice head in
             :class:`~transformers.GPT2DoubleHeadsModel`.
             Add a dropout before the projection and activation
+
+    Example::
+
+        from transformers import GPT2Model, GPT2Config
+
+        # Initializing a GPT2 configuration
+        configuration = GPT2Config()
+
+        # Initializing a model from the configuration
+        model = GPT2Model(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = GPT2_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_openai.py
@@ -94,6 +94,23 @@ class OpenAIGPTConfig(PretrainedConfig):
             Argument used when doing sequence summary. Used in for the multiple choice head in
             :class:`~transformers.OpenAIGPTDoubleHeadsModel`.
             Add a dropout before the projection and activation
+
+    Example::
+
+        from transformers import OpenAIGPTConfig, OpenAIGPTModel
+
+        # Initializing a GPT configuration
+        configuration = OpenAIGPTConfig()
+
+        # Initializing a model from the configuration
+        model = OpenAIGPTModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = OPENAI_GPT_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_transfo_xl.py
@@ -97,6 +97,23 @@ class TransfoXLConfig(PretrainedConfig):
             Parameters initialized by N(0, init_std)
         layer_norm_epsilon (:obj:`float`, optional, defaults to 1e-5):
             The epsilon to use in the layer normalization layers
+
+    Example::
+
+        from transformers import TransfoXLConfig, TransfoXLModel
+
+        # Initializing a Transformer XL configuration
+        configuration = TransfoXLConfig()
+
+        # Initializing a model from the configuration
+        model = TransfoXLModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = TRANSFO_XL_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_xlm.py
@@ -129,7 +129,7 @@ class XLMConfig(PretrainedConfig):
             :class:`~transformers.XLMForSequenceClassification`.
             Add a dropout before the projection and activation
         start_n_top (:obj:`int`, optional, defaults to 5):
-            Used in the SQuAD evaluation script for XLM and XLNet
+            Used in the SQuAD evaluation script for XLM and XLNet.
         end_n_top (:obj:`int`, optional, defaults to 5):
             Used in the SQuAD evaluation script for XLM and XLNet.
         mask_token_id (:obj:`int`, optional, defaults to 0):
@@ -137,6 +137,23 @@ class XLMConfig(PretrainedConfig):
         lang_id (:obj:`int`, optional, defaults to 1):
             The ID of the language used by the model. This parameter is used when generating
             text in a given language.
+
+    Example::
+
+        from transformers import XLMConfig, XLMModel
+
+        # Initializing a XLM configuration
+        configuration = XLMConfig()
+
+        # Initializing a model from the configuration
+        model = XLMModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = XLM_PRETRAINED_CONFIG_ARCHIVE_MAP
src/transformers/configuration_xlnet.py
@@ -106,9 +106,26 @@ class XLNetConfig(PretrainedConfig):
             :class:`~transformers.XLNetForSequenceClassification` and :class:`~transformers.XLNetForMultipleChoice`.
             Add a dropout after the projection and activation
         start_n_top (:obj:`int`, optional, defaults to 5):
-            Used in the SQuAD evaluation script for XLM and XLNet
+            Used in the SQuAD evaluation script for XLM and XLNet.
         end_n_top (:obj:`int`, optional, defaults to 5):
             Used in the SQuAD evaluation script for XLM and XLNet.
+
+    Example::
+
+        from transformers import XLNetConfig, XLNetModel
+
+        # Initializing a XLNet configuration
+        configuration = XLNetConfig()
+
+        # Initializing a model from the configuration
+        model = XLNetModel(configuration)
+
+        # Accessing the model configuration
+        configuration = model.config
+
+    Attributes:
+        pretrained_config_archive_map (Dict[str, str]):
+            A dictionary containing all the available pre-trained checkpoints.
     """
     pretrained_config_archive_map = XLNET_PRETRAINED_CONFIG_ARCHIVE_MAP
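Every file in this commit also documents the `pretrained_config_archive_map` attribute as a `Dict[str, str]` of available pre-trained checkpoints. A hedged sketch of that shape (the keys are real checkpoint shortcut names, but the URLs here are placeholders, not the library's actual hosting locations):

```python
# Illustrative shape of a *_PRETRAINED_CONFIG_ARCHIVE_MAP constant:
# checkpoint shortcut name -> URL of that checkpoint's config file.
# URLs below are placeholders for illustration only.
BERT_PRETRAINED_CONFIG_ARCHIVE_MAP = {
    "bert-base-uncased": "https://example.com/bert-base-uncased-config.json",
    "bert-base-cased": "https://example.com/bert-base-cased-config.json",
}

# Consumers can enumerate every checkpoint known for this architecture.
available = sorted(BERT_PRETRAINED_CONFIG_ARCHIVE_MAP)
print(available)  # ['bert-base-cased', 'bert-base-uncased']
```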