Unverified commit 54ac39c6 authored by Francisco Kurucz, committed by GitHub

Fix code example to load bigcode starcoder2 7b (#32474)

parent 01645603
@@ -25,7 +25,7 @@ class Starcoder2Config(PretrainedConfig):
     r"""
     This is the configuration class to store the configuration of a [`Starcoder2Model`]. It is used to instantiate a
     Starcoder2 model according to the specified arguments, defining the model architecture. Instantiating a configuration
-    with the defaults will yield a similar configuration to that of the [bigcode/starcoder2-7b_16k](https://huggingface.co/bigcode/starcoder2-7b_16k) model.
+    with the defaults will yield a similar configuration to that of the [bigcode/starcoder2-7b](https://huggingface.co/bigcode/starcoder2-7b) model.
     Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
...
@@ -1058,8 +1058,8 @@ class Starcoder2ForCausalLM(Starcoder2PreTrainedModel):
     ```python
     >>> from transformers import AutoTokenizer, Starcoder2ForCausalLM
-    >>> model = Starcoder2ForCausalLM.from_pretrained("bigcode/starcoder2-7b_16k")
-    >>> tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoder2-7b_16k")
+    >>> model = Starcoder2ForCausalLM.from_pretrained("bigcode/starcoder2-7b")
+    >>> tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoder2-7b")
     >>> prompt = "Hey, are you conscious? Can you talk to me?"
     >>> inputs = tokenizer(prompt, return_tensors="pt")
...