chenpangpang / transformers · Commits

Commit 296252c4 (unverified)
Authored Mar 30, 2020 by Patrick von Platen; committed via GitHub, Mar 30, 2020

    fix lm lables in docstring (#3529)

Parent: 75ec6c9e
Changes: 2 changed files, with 5 additions and 8 deletions (+5 -8)

    src/transformers/modeling_t5.py     +3 -1
    src/transformers/modeling_tf_t5.py  +2 -7
src/transformers/modeling_t5.py

@@ -900,8 +900,10 @@ class T5ForConditionalGeneration(T5PreTrainedModel):
         r"""
         lm_labels (:obj:`torch.LongTensor` of shape :obj:`(batch_size,)`, `optional`, defaults to :obj:`None`):
             Labels for computing the sequence classification/regression loss.
-            Indices should be in :obj:`[0, ..., config.vocab_size - 1]`.
+            Indices should be in :obj:`[-100, 0, ..., config.vocab_size - 1]`.
             If :obj:`config.num_labels > 1` a classification loss is computed (Cross-Entropy).
+            All labels set to ``-100`` are ignored (masked), the loss is only
+            computed for labels in ``[0, ..., config.vocab_size]``

     Returns:
         :obj:`tuple(torch.FloatTensor)` comprising various elements depending on the configuration (:class:`~transformers.T5Config`) and inputs.
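The behavior the corrected docstring describes, labels set to ``-100`` being masked out of the loss, can be sketched without the library. This is a minimal toy cross-entropy, not the transformers implementation; `masked_cross_entropy` and `IGNORE_INDEX` are hypothetical names for illustration only.

```python
import math

IGNORE_INDEX = -100  # positions whose label equals this value are masked


def masked_cross_entropy(logits, labels):
    """Mean cross-entropy over positions whose label is not -100.

    `logits` is a list of per-position score lists; `labels` holds target
    indices in [-100, 0, ..., vocab_size - 1].
    """
    total, count = 0.0, 0
    for scores, label in zip(logits, labels):
        if label == IGNORE_INDEX:
            continue  # masked position: contributes nothing to the loss
        log_z = math.log(sum(math.exp(s) for s in scores))
        total += log_z - scores[label]  # -log softmax(scores)[label]
        count += 1
    return total / count if count else 0.0


# Two positions; the second is masked with -100 and is ignored.
logits = [[2.0, 0.5, 0.1], [0.3, 1.2, 0.7]]
labels = [0, -100]
loss = masked_cross_entropy(logits, labels)
```

The masked position neither adds to the numerator nor to the count, which matches the docstring's "ignored (masked)" wording.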
src/transformers/modeling_tf_t5.py

@@ -799,11 +799,6 @@ class TFT5ForConditionalGeneration(TFT5PreTrainedModel):
     @add_start_docstrings_to_callable(T5_INPUTS_DOCSTRING)
     def call(self, decoder_input_ids, **kwargs):
         r"""
-        lm_labels (:obj:`tf.Tensor` of shape :obj:`(batch_size,)`, `optional`, defaults to :obj:`None`):
-            Labels for computing the sequence classification/regression loss.
-            Indices should be in :obj:`[0, ..., config.vocab_size - 1]`.
-            If :obj:`config.num_labels > 1` a classification loss is computed (Cross-Entropy).

     Return:
         :obj:`tuple(tf.Tensor)` comprising various elements depending on the configuration (:class:`~transformers.T5Config`) and inputs.
         loss (:obj:`tf.Tensor` of shape :obj:`(1,)`, `optional`, returned when :obj:`lm_label` is provided):

@@ -828,8 +823,8 @@ class TFT5ForConditionalGeneration(TFT5PreTrainedModel):
         tokenizer = T5Tokenizer.from_pretrained('t5-small')
         model = TFT5ForConditionalGeneration.from_pretrained('t5-small')
         input_ids = tokenizer.encode("Hello, my dog is cute", return_tensors="tf")  # Batch size 1
-        outputs = model(input_ids, input_ids=input_ids, lm_labels=input_ids)
-        prediction_scores = outputs[:1]
+        outputs = model(input_ids, input_ids=input_ids)
+        prediction_scores = outputs[0]

         tokenizer = T5Tokenizer.from_pretrained('t5-small')
         model = TFT5ForConditionalGeneration.from_pretrained('t5-small')
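The second change in the TF example swaps `outputs[:1]` for `outputs[0]`: the model's outputs form a tuple, so slicing yields a one-element tuple while indexing yields the element itself. A minimal stand-in with plain Python tuples (the string contents are placeholders, not real model outputs) makes the difference concrete:

```python
# Stand-in for a model's output tuple, e.g. (prediction_scores, past).
outputs = ("prediction_scores", "past_key_values")

sliced = outputs[:1]  # a one-element tuple, not the scores themselves
first = outputs[0]    # the first element itself, as the fixed example uses
```

Downstream code that expects a tensor (e.g. to call `.shape` on it) would fail on the tuple returned by `outputs[:1]`, which is why the docstring example was corrected.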