Unverified commit 693ba2cc authored by lewtun, committed by GitHub

Fix GPT-NeoX doc examples (#19033)

parent 4eb36f29
@@ -354,7 +354,7 @@ GPT_NEOX_INPUTS_DOCSTRING = r"""
         input_ids (`torch.LongTensor` of shape `({0})`):
            Indices of input sequence tokens in the vocabulary.

-            Indices can be obtained using [`GPTNeoXTokenizer`]. See [`PreTrainedTokenizer.encode`] and
+            Indices can be obtained using [`GPTNeoXTokenizerFast`]. See [`PreTrainedTokenizer.encode`] and
            [`PreTrainedTokenizer.__call__`] for details.

            [What are input IDs?](../glossary#input-ids)
@@ -601,13 +601,13 @@ class GPTNeoXForCausalLM(GPTNeoXPreTrainedModel):
        Example:

        ```python
-        >>> from transformers import GPTNeoXTokenizer, GPTNeoXForCausalLM, GPTNeoXConfig
+        >>> from transformers import GPTNeoXTokenizerFast, GPTNeoXForCausalLM, GPTNeoXConfig
        >>> import torch

-        >>> tokenizer = GPTNeoXTokenizer.from_pretrained("gpt-neox-20b")
-        >>> config = GPTNeoXConfig.from_pretrained("gpt-neox-20b")
+        >>> tokenizer = GPTNeoXTokenizerFast.from_pretrained("EleutherAI/gpt-neox-20b")
+        >>> config = GPTNeoXConfig.from_pretrained("EleutherAI/gpt-neox-20b")
        >>> config.is_decoder = True
-        >>> model = GPTNeoXForCausalLM.from_pretrained("gpt-neox-20b", config=config)
+        >>> model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/gpt-neox-20b", config=config)

        >>> inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
        >>> outputs = model(**inputs)
...