chenpangpang / transformers / Commits

Unverified commit e58b3ec5
add imports to examples (#3160)

Authored Mar 06, 2020 by Sam Shleifer; committed by GitHub on Mar 06, 2020
Parent: 6ffe03a0
Showing 1 changed file with 2 additions and 3 deletions:

    src/transformers/modeling_bart.py  (+2, -3)
src/transformers/modeling_bart.py

@@ -913,7 +913,7 @@ class BartForConditionalGeneration(PretrainedBartModel):
             # Mask filling only works for bart-large
             from transformers import BartTokenizer, BartForConditionalGeneration
-            tokenizer = AutoTokenizer.from_pretrained('bart-large')
+            tokenizer = BartTokenizer.from_pretrained('bart-large')
             TXT = "My friends are <mask> but they eat too many carbs."
             model = BartForConditionalGeneration.from_pretrained('bart-large')
             input_ids = tokenizer.batch_encode_plus([TXT], return_tensors='pt')['input_ids']
@@ -1031,8 +1031,7 @@ class BartForConditionalGeneration(PretrainedBartModel):
         Examples::
             from transformers import BartTokenizer, BartForConditionalGeneration, BartConfig
             # see ``examples/summarization/bart/evaluate_cnn.py`` for a longer example
-            config = BartConfig(vocab_size=50264, output_past=True) # no mask_token_id
-            model = BartForConditionalGeneration.from_pretrained('bart-large-cnn', config=config)
+            model = BartForConditionalGeneration.from_pretrained('bart-large-cnn')
             tokenizer = BartTokenizer.from_pretrained('bart-large-cnn')
             ARTICLE_TO_SUMMARIZE = "My friends are cool but they eat too many carbs."
             inputs = tokenizer.batch_encode_plus([ARTICLE_TO_SUMMARIZE], max_length=1024, return_tensors='pt')
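For context, the mask-filling snippet touched by the first hunk is only a fragment of the docstring example. Below is a minimal end-to-end sketch, assuming a transformers release from around this commit (early 2020), where the short checkpoint id 'bart-large' and tokenizer.batch_encode_plus(...) were still accepted (current releases use 'facebook/bart-large' and call the tokenizer directly). The mask-locating and top-k decoding steps are illustrative additions, not lines from the commit.

    from transformers import BartTokenizer, BartForConditionalGeneration

    # Era-specific assumption: short model id 'bart-large' (today: 'facebook/bart-large').
    tokenizer = BartTokenizer.from_pretrained('bart-large')
    model = BartForConditionalGeneration.from_pretrained('bart-large')

    TXT = "My friends are <mask> but they eat too many carbs."
    # batch_encode_plus was the batched-encoding API of this era.
    input_ids = tokenizer.batch_encode_plus([TXT], return_tensors='pt')['input_ids']

    # Forward pass; in this era the model returns a tuple whose first element is the LM logits.
    logits = model(input_ids)[0]

    # Illustrative decoding (not part of the commit): find the <mask> position and
    # print the top-5 candidate fillers.
    masked_index = (input_ids[0] == tokenizer.mask_token_id).nonzero().item()
    probs = logits[0, masked_index].softmax(dim=0)
    values, predictions = probs.topk(5)
    print(tokenizer.decode(predictions.tolist()).split())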
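Similarly, the summarization example changed in the second hunk stops after encoding the article. The sketch below appends a generate-and-decode step under the same era-specific assumptions; it further assumes that model.generate() already supported BART beam search at this point in the library's history, and the argument values (num_beams, max_length) are illustrative choices rather than values taken from the commit.

    from transformers import BartTokenizer, BartForConditionalGeneration

    # Era-specific assumption: short model id 'bart-large-cnn' (today: 'facebook/bart-large-cnn').
    model = BartForConditionalGeneration.from_pretrained('bart-large-cnn')
    tokenizer = BartTokenizer.from_pretrained('bart-large-cnn')

    # see ``examples/summarization/bart/evaluate_cnn.py`` for a longer example
    ARTICLE_TO_SUMMARIZE = "My friends are cool but they eat too many carbs."
    inputs = tokenizer.batch_encode_plus([ARTICLE_TO_SUMMARIZE], max_length=1024, return_tensors='pt')

    # Illustrative generation step (not part of the commit): beam-search summary,
    # then decode the generated ids back to text.
    summary_ids = model.generate(inputs['input_ids'], num_beams=4, max_length=40)
    print([tokenizer.decode(s.tolist(), skip_special_tokens=True) for s in summary_ids])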