chenpangpang / transformers

Commit c198ff5f authored Jun 01, 2019 by VictorSanh

fix typos/bugs

parent 592d1e3a
Showing 1 changed file with 3 additions and 3 deletions.

hubconfs/gpt2_hubconf.py  +3 -3
hubconfs/gpt2_hubconf.py (view file @ c198ff5f)

@@ -130,7 +130,7 @@ def gpt2LMHeadModel(*args, **kwargs):
         >>> predicted_token = tokenizer.decode([predicted_index])
         >>> assert predicted_token == ' who'
     """
-    model = OpenAIGPTLMHeadModel.from_pretrained(*args, **kwargs)
+    model = GPT2LMHeadModel.from_pretrained(*args, **kwargs)
     return model

@@ -148,9 +148,9 @@ def gpt2DoubleHeadsModel(*args, **kwargs):
         # Prepare tokenized input
         >>> text = "Who was Jim Henson ?"
-        >>> indexed_tokens = tokenizer.encode(tokenized_text)
+        >>> indexed_tokens = tokenizer.encode(text)
         >>> tokens_tensor = torch.tensor([indexed_tokens])
-        >>> mc_token_ids = torch.LongTensor([ [len(tokenized_text)] ])
+        >>> mc_token_ids = torch.LongTensor([ [len(indexed_tokens)] ])
         # Load gpt2DoubleHeadsModel
         >>> model = torch.hub.load('huggingface/pytorch-pretrained-BERT', 'gpt2DoubleHeadsModel', 'gpt2')
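Taken together, the two hunks make the GPT-2 hub entrypoints self-consistent: gpt2LMHeadModel now instantiates GPT2LMHeadModel rather than OpenAIGPTLMHeadModel, and the gpt2DoubleHeadsModel doctest encodes the raw text string and derives mc_token_ids from the encoded ids. Below is a minimal sketch of the corrected usage; it is an illustration rather than part of the commit, and it assumes the gpt2Tokenizer entrypoint defined in the same hubconf and that huggingface/pytorch-pretrained-BERT is reachable through torch.hub at this revision.

    # Sketch only: assumes the 'gpt2Tokenizer', 'gpt2LMHeadModel', and
    # 'gpt2DoubleHeadsModel' entrypoints from hubconfs/gpt2_hubconf.py.
    import torch

    tokenizer = torch.hub.load('huggingface/pytorch-pretrained-BERT', 'gpt2Tokenizer', 'gpt2')

    # First hunk: this entrypoint now builds a GPT2LMHeadModel instead of an
    # OpenAIGPTLMHeadModel.
    lm_model = torch.hub.load('huggingface/pytorch-pretrained-BERT', 'gpt2LMHeadModel', 'gpt2')
    lm_model.eval()

    # Second hunk: encode the raw string `text` (there is no `tokenized_text`
    # variable in this example) and size mc_token_ids from the encoded ids.
    text = "Who was Jim Henson ?"
    indexed_tokens = tokenizer.encode(text)
    tokens_tensor = torch.tensor([indexed_tokens])
    mc_token_ids = torch.LongTensor([[len(indexed_tokens)]])

    dh_model = torch.hub.load('huggingface/pytorch-pretrained-BERT', 'gpt2DoubleHeadsModel', 'gpt2')
    dh_model.eval()
    # tokens_tensor and mc_token_ids are then passed to dh_model, as in the
    # remainder of the docstring (not shown in this diff).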