OpenDAS / ollama · Commits · df56f1ee

Unverified commit df56f1ee, authored Feb 19, 2024 by Jeffrey Morgan, committed by GitHub on Feb 19, 2024
Update faq.md
parent 0b6c6c90
Showing 1 changed file with 22 additions and 0 deletions

docs/faq.md  +22 −0
docs/faq.md (view file @ df56f1ee)

@@ -14,6 +14,28 @@ curl -fsSL https://ollama.com/install.sh | sh
Review the [Troubleshooting](./troubleshooting.md) docs for more about using logs.
## How can I specify the context window size?

By default, Ollama uses a context window size of 2048 tokens.

To change this when using `ollama run`, use `/set parameter`:
```
/set parameter num_ctx 4096
```
When using the API, specify the `num_ctx` parameter:
```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 4096
  }
}'
```
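The same request can be built from any language that can POST JSON. As a minimal sketch (the helper name `build_generate_payload` is illustrative, not part of Ollama; only the endpoint, `model`, `prompt`, and `options.num_ctx` fields come from the FAQ above):

```python
import json

def build_generate_payload(model, prompt, num_ctx):
    # Assemble the JSON body for Ollama's /api/generate endpoint.
    # Context window size is passed under the "options" object.
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    })

payload = build_generate_payload("llama2", "Why is the sky blue?", 4096)

# To actually send it (requires a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=payload.encode(), method="POST")
#   urllib.request.urlopen(req)
```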
## How do I configure Ollama server?

Ollama server can be configured with environment variables.
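As one hedged illustration: `OLLAMA_HOST` is the variable Ollama's documentation describes for the server's bind address; the value shown is only an example, not a recommendation.

```shell
# Bind the server to all interfaces instead of the default loopback address.
# OLLAMA_HOST is read by `ollama serve` at startup (example value only).
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```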
...