chenpangpang / transformers · Commits

Commit 693ba2cc (unverified), authored Sep 14, 2022 by lewtun, committed by GitHub on Sep 14, 2022

Fix GPT-NeoX doc examples (#19033)
parent 4eb36f29

Changes: 1 changed file with 5 additions and 5 deletions (+5 −5)

src/transformers/models/gpt_neox/modeling_gpt_neox.py
````diff
@@ -354,7 +354,7 @@ GPT_NEOX_INPUTS_DOCSTRING = r"""
         input_ids (`torch.LongTensor` of shape `({0})`):
             Indices of input sequence tokens in the vocabulary.

-            Indices can be obtained using [`GPTNeoXTokenizer`]. See [`PreTrainedTokenizer.encode`] and
+            Indices can be obtained using [`GPTNeoXTokenizerFast`]. See [`PreTrainedTokenizer.encode`] and
             [`PreTrainedTokenizer.__call__`] for details.

             [What are input IDs?](../glossary#input-ids)
````
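The docstring above describes `input_ids` as indices of input sequence tokens in the vocabulary. A toy stdlib sketch of that mapping, using a hypothetical whitespace vocabulary rather than the real GPT-NeoX BPE tokenizer:

```python
# Toy illustration of "indices of input sequence tokens in the vocabulary".
# The vocabulary and whitespace tokenization here are hypothetical stand-ins
# for the real GPT-NeoX (BPE) tokenizer.
vocab = {"hello": 0, "my": 1, "dog": 2, "is": 3, "cute": 4, "<unk>": 5}

def encode(text: str) -> list[int]:
    """Map each whitespace-separated token to its vocabulary index."""
    return [vocab.get(tok, vocab["<unk>"]) for tok in text.lower().split()]

print(encode("hello my dog is cute"))  # -> [0, 1, 2, 3, 4]
```

The real tokenizer returns these indices (plus an attention mask) as tensors when called with `return_tensors="pt"`.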
````diff
@@ -601,13 +601,13 @@ class GPTNeoXForCausalLM(GPTNeoXPreTrainedModel):
         Example:

         ```python
-        >>> from transformers import GPTNeoXTokenizer, GPTNeoXForCausalLM, GPTNeoXConfig
+        >>> from transformers import GPTNeoXTokenizerFast, GPTNeoXForCausalLM, GPTNeoXConfig
         >>> import torch

-        >>> tokenizer = GPTNeoXTokenizer.from_pretrained("gpt-neox-20b")
+        >>> tokenizer = GPTNeoXTokenizerFast.from_pretrained("EleutherAI/gpt-neox-20b")
-        >>> config = GPTNeoXConfig.from_pretrained("gpt-neox-20b")
+        >>> config = GPTNeoXConfig.from_pretrained("EleutherAI/gpt-neox-20b")
         >>> config.is_decoder = True
-        >>> model = GPTNeoXForCausalLM.from_pretrained("gpt-neox-20b", config=config)
+        >>> model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/gpt-neox-20b", config=config)
         >>> inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
         >>> outputs = model(**inputs)
         ```
````
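Half of this fix is the checkpoint name: Hugging Face Hub repo ids are namespaced as `owner/name`, so the bare `gpt-neox-20b` does not resolve, while `EleutherAI/gpt-neox-20b` does. A minimal stdlib sketch distinguishing the two forms (the regex is an illustrative approximation, not the Hub's actual validation rules):

```python
import re

# Illustrative approximation of a namespaced Hub repo id ("owner/name").
# The Hub's real validation rules are more involved than this pattern.
REPO_ID_RE = re.compile(r"[\w.-]+/[\w.-]+")

def is_namespaced_repo_id(repo_id: str) -> bool:
    """Return True if repo_id carries an owner/organization prefix."""
    return REPO_ID_RE.fullmatch(repo_id) is not None

print(is_namespaced_repo_id("gpt-neox-20b"))             # bare name -> False
print(is_namespaced_repo_id("EleutherAI/gpt-neox-20b"))  # namespaced -> True
```

The other half renames `GPTNeoXTokenizer` to `GPTNeoXTokenizerFast`, the tokenizer class transformers actually ships for GPT-NeoX.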