OpenDAS / ollama — commit `ee92ca3e` ("docs: add docs for Ollama Turbo (#11687)"), authored Aug 05, 2025 by Jeffrey Morgan, committed by GitHub. Parent: `8253ad4d`. 1 changed file: `docs/turbo.md` (+107, −0).
# Turbo

> ⚠️ Turbo is in preview.

Ollama’s [Turbo](https://ollama.com/turbo) is a new way to run open-source models with acceleration from datacenter-grade hardware.

Currently, the following models are available in Turbo:

- `gpt-oss:20b`
- `gpt-oss:120b`
## Get started

### Ollama for macOS & Windows

- Download Ollama
- Select a model such as `gpt-oss:20b` or `gpt-oss:120b`
- Click on **Turbo**. You’ll be prompted to create an account or sign in
### Ollama’s CLI

- [Sign up](https://ollama.com/signup) for an Ollama account
- Add your Ollama key [to ollama.com](https://ollama.com/settings/keys).

  On macOS and Linux:

  ```shell
  cat ~/.ollama/id_ed25519.pub
  ```

  On Windows:

  ```shell
  type "%USERPROFILE%\.ollama\id_ed25519.pub"
  ```

- Then run a model, setting `OLLAMA_HOST` to `ollama.com`:

  ```shell
  OLLAMA_HOST=ollama.com ollama run gpt-oss:120b
  ```
### Ollama’s Python library

- Download Ollama's [Python library](https://github.com/ollama/ollama-python)
- [Sign up](https://ollama.com/signup) for an Ollama account
- Create an API key by visiting https://ollama.com/settings/keys

```python
from ollama import Client

client = Client(
    host="https://ollama.com",
    headers={'Authorization': '<api key>'}
)

messages = [
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
]

for part in client.chat('gpt-oss:120b', messages=messages, stream=True):
    print(part['message']['content'], end='', flush=True)
```
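Under the hood, the client above posts to Ollama's `/api/chat` endpoint with the key in the `Authorization` header. A stdlib-only sketch of building that same request, for environments without the `ollama` package (whether ollama.com expects the bare key or a `Bearer` prefix is an assumption to check against your account; the `build_chat_request` helper is ours):

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    # Same payload shape the ollama client sends to /api/chat.
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        "https://ollama.com/api/chat",
        data=body,
        headers={
            "Authorization": api_key,  # assumed: bare key, as in the Python example
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("<api key>", "gpt-oss:20b", "Why is the sky blue?")
# urllib.request.urlopen(req) would send it; omitted here to stay offline.
```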
### Ollama’s JavaScript library

- Download Ollama's [JavaScript library](https://github.com/ollama/ollama-js)
- [Sign up](https://ollama.com/signup) for an Ollama account
- Create an API key by visiting https://ollama.com/settings/keys

```typescript
import { Ollama } from 'ollama';

const ollama = new Ollama({
  host: 'https://ollama.com',
  headers: { Authorization: "Bearer <api key>" }
});

const response = await ollama.chat({
  model: 'gpt-oss:120b',
  messages: [{ role: 'user', content: 'Explain quantum computing' }],
  stream: true
});

for await (const part of response) {
  process.stdout.write(part.message.content);
}
```
### Community integrations

Turbo mode is also compatible with several community integrations.

#### Open WebUI

- Go to **Settings** → **Admin settings** → **Connections**
- Under **Ollama API**, click **+**
- For the **URL**, enter `https://ollama.com`
- For the **API key**, create an API key at https://ollama.com/settings/keys and add it
- Click **Save**

Now, if you navigate to the model selector, Turbo models should be available under **External**.
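To sanity-check a connection like the one configured above without opening the UI, you can ask the endpoint which models it serves. A hedged stdlib sketch (it assumes ollama.com exposes the standard Ollama `/api/tags` listing endpoint to authenticated clients, and the `build_tags_request` helper is ours):

```python
import urllib.request

def build_tags_request(host: str, api_key: str) -> urllib.request.Request:
    # GET /api/tags returns the models the server offers -- the same list
    # Open WebUI shows under "External".
    return urllib.request.Request(
        host.rstrip("/") + "/api/tags",
        headers={"Authorization": api_key},
    )

req = build_tags_request("https://ollama.com", "<api key>")
# json.load(urllib.request.urlopen(req))["models"] would list them; kept offline here.
```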