chenpangpang / transformers / Commits / 9a0a8c1c

Unverified commit 9a0a8c1c, authored Apr 28, 2020 by Patrick von Platen, committed by GitHub on Apr 28, 2020
add examples to doc (#4045)
parent fa49b9af

Showing 2 changed files with 50 additions and 12 deletions (+50 -12)
src/transformers/configuration_encoder_decoder.py   +29 -12
src/transformers/modeling_encoder_decoder.py         +21 -0
src/transformers/configuration_encoder_decoder.py  (view file @ 9a0a8c1c)
...
@@ -27,18 +27,35 @@ class EncoderDecoderConfig(PretrainedConfig):
    r"""
    :class:`~transformers.EncoderDecoderConfig` is the configuration class to store the configuration of an `EncoderDecoderModel`.
    It is used to instantiate an Encoder Decoder model according to the specified arguments, defining the encoder and decoder configs.
    Configuration objects inherit from :class:`~transformers.PretrainedConfig`
    and can be used to control the model outputs.
    See the documentation for :class:`~transformers.PretrainedConfig` for more information.

    Args:
        kwargs (`optional`):
            Remaining dictionary of keyword arguments. Notably:
                encoder (:class:`PretrainedConfig`, optional, defaults to `None`):
                    An instance of a configuration object that defines the encoder config.
                decoder (:class:`PretrainedConfig`, optional, defaults to `None`):
                    An instance of a configuration object that defines the decoder config.

    Example::

        from transformers import BertConfig, EncoderDecoderConfig, EncoderDecoderModel

        # Initializing a BERT bert-base-uncased style configuration
        config_encoder = BertConfig()
        config_decoder = BertConfig()

        config = EncoderDecoderConfig.from_encoder_decoder_configs(config_encoder, config_decoder)

        # Initializing a Bert2Bert model from the bert-base-uncased style configurations
        model = EncoderDecoderModel(config=config)

        # Accessing the model configuration
        config_encoder = model.config.encoder
        config_decoder = model.config.decoder
    """
    model_type = "encoder_decoder"
...
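An editorial sketch (not part of this commit) building on the `Args` description above: the nested encoder and decoder configs may carry different settings and remain reachable as `config.encoder` / `config.decoder`; the specific `num_hidden_layers` values below are illustrative only::

    from transformers import BertConfig, EncoderDecoderConfig

    config_encoder = BertConfig(num_hidden_layers=12)
    config_decoder = BertConfig(num_hidden_layers=6)  # a smaller decoder, purely for illustration
    config = EncoderDecoderConfig.from_encoder_decoder_configs(config_encoder, config_decoder)

    # the nested configs keep their own settings
    assert config.encoder.num_hidden_layers == 12
    assert config.decoder.num_hidden_layers == 6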
src/transformers/modeling_encoder_decoder.py  (view file @ 9a0a8c1c)
...
@@ -125,6 +125,8 @@ class EncoderDecoderModel(PreTrainedModel):
    Examples::

        from transformers import EncoderDecoderModel

        model = EncoderDecoderModel.from_encoder_decoder_pretrained('bert-base-uncased', 'bert-base-uncased')  # initialize Bert2Bert
    """
...
@@ -230,6 +232,25 @@ class EncoderDecoderModel(PreTrainedModel):
            kwargs: (`optional`) Remaining dictionary of keyword arguments. Keyword arguments come in two flavors:

                - Without a prefix which will be input as `**encoder_kwargs` for the encoder forward function.
                - With a `decoder_` prefix which will be input as `**decoder_kwargs` for the decoder forward function.
        Examples::

            from transformers import EncoderDecoderModel, BertTokenizer
            import torch

            tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
            model = EncoderDecoderModel.from_encoder_decoder_pretrained('bert-base-uncased', 'bert-base-uncased')  # initialize Bert2Bert

            # forward
            input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True)).unsqueeze(0)  # Batch size 1
            outputs = model(input_ids=input_ids, decoder_input_ids=input_ids)

            # training
            loss, outputs = model(input_ids=input_ids, decoder_input_ids=input_ids, lm_labels=input_ids)[:2]

            # generation
            generated = model.generate(input_ids, decoder_start_token_id=model.config.decoder.pad_token_id)
"""
"""
        kwargs_encoder = {argument: value for argument, value in kwargs.items() if not argument.startswith("decoder_")}
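The comprehension above keeps only the un-prefixed keyword arguments for the encoder. A minimal companion sketch (an assumption, not part of the shown hunk): the decoder-side split would mirror it by selecting the `decoder_`-prefixed arguments and stripping the prefix::

    # Editorial sketch of the decoder-side counterpart to kwargs_encoder.
    kwargs_decoder = {
        argument[len("decoder_"):]: value for argument, value in kwargs.items() if argument.startswith("decoder_")
    }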
...