Commit 934dd9e1 by Parth Sareen: docs: add reference to docs.ollama.com (#12800)
---
title: VS Code
---
## Install
Install [VS Code](https://code.visualstudio.com/download).
## Usage with Ollama
1. Open the Copilot sidebar from the top-right of the window
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/vscode-sidebar.png"
alt="VSCode chat Sidebar"
width="75%"
/>
</div>
2. Select the model dropdown > **Manage models**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/vscode-models.png"
alt="VSCode model picker"
width="75%"
/>
</div>
3. Enter **Ollama** in the **Provider** dropdown and select the desired models (e.g. `qwen3`, `qwen3-coder:480b-cloud`)
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/vscode-model-options.png"
alt="VSCode model options dropdown"
width="75%"
/>
</div>
---
title: Xcode
---
## Install
Install [Xcode](https://developer.apple.com/xcode/).
## Usage with Ollama
<Note> Ensure Apple Intelligence is set up and that you are running Xcode 26.0 or later. </Note>
1. Click **Xcode** in the top-left corner > **Settings**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/xcode-intelligence-window.png"
alt="Xcode Intelligence window"
width="50%"
/>
</div>
2. Select **Locally Hosted**, enter port **11434** and click **Add**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/xcode-locally-hosted.png"
alt="Xcode settings"
width="50%"
/>
</div>
3. Select the **star icon** in the top-left corner and click the **dropdown**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/xcode-chat-icon.png"
alt="Xcode chat icon"
width="50%"
/>
</div>
4. Click **My Account** and select your desired model
## Connecting to ollama.com directly
1. Create an [API key](https://ollama.com/settings/keys) from ollama.com
2. Select **Internet Hosted** and enter URL as `https://ollama.com`
3. Enter your **Ollama API Key** and click **Add**
---
title: Zed
---
## Install
Install [Zed](https://zed.dev/download).
## Usage with Ollama
1. In Zed, click the **star icon** in the bottom-right corner, then select **Configure**.
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/zed-settings.png"
alt="Zed star icon in bottom right corner"
width="50%"
/>
</div>
2. Under **LLM Providers**, choose **Ollama**
3. Confirm the **Host URL** is `http://localhost:11434`, then click **Connect**
4. Once connected, select a model under **Ollama**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/zed-ollama-dropdown.png"
alt="Zed Ollama model dropdown"
width="50%"
/>
</div>
## Connecting to ollama.com
1. Create an [API key](https://ollama.com/settings/keys) on **ollama.com**
2. In Zed, open the **star icon** → **Configure**
3. Under **LLM Providers**, select **Ollama**
4. Set the **API URL** to `https://ollama.com`
---
title: Linux
---
## Install
To install Ollama, run the following command:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
## Manual install
<Note>
If you are upgrading from a prior version, you should remove the old libraries
with `sudo rm -rf /usr/lib/ollama` first.
</Note>
Download and extract the package:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz \
| sudo tar zx -C /usr
```
Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
### AMD GPU install
If you have an AMD GPU, also download and extract the additional ROCm package:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tgz \
| sudo tar zx -C /usr
```
### ARM64 install
Download and extract the ARM64-specific package:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-arm64.tgz \
| sudo tar zx -C /usr
```
### Adding Ollama as a startup service (recommended)
Create a user and group for Ollama:
```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
Create a service file in `/etc/systemd/system/ollama.service`:
```ini
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
[Install]
WantedBy=multi-user.target
```
Then start the service:
```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
### Install CUDA drivers (optional)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```shell
nvidia-smi
```
### Install AMD ROCm drivers (optional)
[Download and Install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.
### Start Ollama
Start Ollama and verify it is running:
```shell
sudo systemctl start ollama
sudo systemctl status ollama
```
<Note>
While AMD has contributed the `amdgpu` driver upstream to the official Linux
kernel source, the version is older and may not support all ROCm features. We
recommend you install the latest driver from
https://www.amd.com/en/support/linux-drivers for best support of your Radeon
GPU.
</Note>
## Customizing
To customize the installation of Ollama, you can edit the systemd service file or the environment variables by running:
```shell
sudo systemctl edit ollama
```
Alternatively, create an override file manually in `/etc/systemd/system/ollama.service.d/override.conf`:
```ini
[Service]
Environment="OLLAMA_DEBUG=1"
```
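Several other settings can be applied the same way. As an illustrative sketch (the variable names are Ollama's documented environment variables; the values shown are examples, not defaults):

```ini
[Service]
# Listen on all interfaces instead of only localhost
Environment="OLLAMA_HOST=0.0.0.0"
# Store downloaded models in a custom directory (example path)
Environment="OLLAMA_MODELS=/data/ollama/models"
```

After editing, apply the change with `sudo systemctl daemon-reload` followed by `sudo systemctl restart ollama`.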
## Updating
Update Ollama by running the install script again:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading Ollama:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz \
| sudo tar zx -C /usr
```
## Installing specific versions
Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find version numbers on the [releases page](https://github.com/ollama/ollama/releases).
For example:
```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
```
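This works because environment variables set before `sh` in a pipeline are visible to the piped script. A stand-in demonstration, using a trivial `echo` script in place of the real installer:

```shell
# A script piped into sh sees OLLAMA_VERSION, just like the installer does
echo 'echo "installing version ${OLLAMA_VERSION}"' | OLLAMA_VERSION=0.5.7 sh
# prints: installing version 0.5.7
```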
## Viewing logs
To view logs of Ollama running as a startup service, run:
```shell
journalctl -e -u ollama
```
## Uninstall
Remove the ollama service:
```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove ollama libraries from your lib directory (either `/usr/local/lib`, `/usr/lib`, or `/lib`):
```shell
sudo rm -r $(which ollama | tr 'bin' 'lib')
```
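Note that `tr 'bin' 'lib'` is a character-by-character translation (`b` to `l`, `i` to `i`, `n` to `b`), which happens to turn a `.../bin/ollama` path into the matching `.../lib/ollama` path:

```shell
# tr translates characters positionally: b->l, i->i, n->b
echo "/usr/local/bin/ollama" | tr 'bin' 'lib'
# prints: /usr/local/lib/ollama
```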
Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```shell
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```shell
sudo userdel ollama
sudo groupdel ollama
sudo rm -r /usr/share/ollama
```
---
title: Modelfile Reference
---
A Modelfile is the blueprint to create and share customized models using Ollama.
## Table of Contents
- [Format](#format)
- [Examples](#examples)
- [Instructions](#instructions)
- [FROM (Required)](#from-required)
- [Build from existing model](#build-from-existing-model)
- [Build from a Safetensors model](#build-from-a-safetensors-model)
- [Build from a GGUF file](#build-from-a-gguf-file)
- [PARAMETER](#parameter)
- [Valid Parameters and Values](#valid-parameters-and-values)
- [TEMPLATE](#template)
- [Template Variables](#template-variables)
- [SYSTEM](#system)
- [ADAPTER](#adapter)
- [LICENSE](#license)
- [MESSAGE](#message)
- [Notes](#notes)
## Format
The format of the `Modelfile`:
```
# comment
INSTRUCTION arguments
```
| Instruction | Description |
| ----------------------------------- | -------------------------------------------------------------- |
| [`FROM`](#from-required) (required) | Defines the base model to use. |
| [`PARAMETER`](#parameter) | Sets the parameters for how Ollama will run the model. |
| [`TEMPLATE`](#template) | The full prompt template to be sent to the model. |
| [`SYSTEM`](#system) | Specifies the system message that will be set in the template. |
| [`ADAPTER`](#adapter) | Defines the (Q)LoRA adapters to apply to the model. |
| [`LICENSE`](#license) | Specifies the legal license. |
| [`MESSAGE`](#message) | Specify message history. |
## Examples
### Basic `Modelfile`
An example of a `Modelfile` creating a Mario blueprint:
```
FROM llama3.2
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets the context window size to 4096, this controls how many tokens the LLM can use as context to generate the next token
PARAMETER num_ctx 4096
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Mario from Super Mario Bros, acting as an assistant.
```
To use this:
1. Save it as a file (e.g. `Modelfile`)
2. `ollama create choose-a-model-name -f <location of the file e.g. ./Modelfile>`
3. `ollama run choose-a-model-name`
4. Start using the model!
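The steps above can be sketched in a shell session. Writing the example Modelfile with a heredoc (building and running it requires a local Ollama install, so only the file creation is shown here):

```shell
# Save the example Modelfile to disk; afterwards,
#   ollama create mario -f ./Modelfile
#   ollama run mario
# would build and start the customized model.
cat > Modelfile <<'EOF'
FROM llama3.2
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from Super Mario Bros, acting as an assistant.
EOF
```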
To view the Modelfile of a given model, use the `ollama show --modelfile` command.
```shell
ollama show --modelfile llama3.2
```
```
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this one, replace the FROM line with:
# FROM llama3.2:latest
FROM /Users/pdevine/.ollama/models/blobs/sha256-00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29
TEMPLATE """{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>"""
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|reserved_special_token"
```
## Instructions
### FROM (Required)
The `FROM` instruction defines the base model to use when creating a model.
```
FROM <model name>:<tag>
```
#### Build from existing model
```
FROM llama3.2
```
<Card title="Base Models" href="https://github.com/ollama/ollama#model-library">
A list of available base models
</Card>
<Card title="Model Library" href="https://ollama.com/library">
Additional models can be found in the Ollama library
</Card>
#### Build from a Safetensors model
```
FROM <model directory>
```
The model directory should contain the Safetensors weights for a supported architecture.
Currently supported model architectures:
- Llama (including Llama 2, Llama 3, Llama 3.1, and Llama 3.2)
- Mistral (including Mistral 1, Mistral 2, and Mixtral)
- Gemma (including Gemma 1 and Gemma 2)
- Phi3
#### Build from a GGUF file
```
FROM ./ollama-model.gguf
```
The GGUF file location should be specified as an absolute path or relative to the `Modelfile` location.
### PARAMETER
The `PARAMETER` instruction defines a parameter that can be set when the model is run.
```
PARAMETER <parameter> <parametervalue>
```
#### Valid Parameters and Values
| Parameter | Description | Value Type | Example Usage |
| -------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------- | -------------------- |
| mirostat | Enable Mirostat sampling for controlling perplexity. (default: 0, 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0) | int | mirostat 0 |
| mirostat_eta | Influences how quickly the algorithm responds to feedback from the generated text. A lower learning rate will result in slower adjustments, while a higher learning rate will make the algorithm more responsive. (Default: 0.1) | float | mirostat_eta 0.1 |
| mirostat_tau | Controls the balance between coherence and diversity of the output. A lower value will result in more focused and coherent text. (Default: 5.0) | float | mirostat_tau 5.0 |
| num_ctx | Sets the size of the context window used to generate the next token. (Default: 2048) | int | num_ctx 4096 |
| repeat_last_n | Sets how far back the model looks to prevent repetition. (Default: 64, 0 = disabled, -1 = num_ctx) | int | repeat_last_n 64 |
| repeat_penalty | Sets how strongly to penalize repetitions. A higher value (e.g., 1.5) will penalize repetitions more strongly, while a lower value (e.g., 0.9) will be more lenient. (Default: 1.1) | float | repeat_penalty 1.1 |
| temperature | The temperature of the model. Increasing the temperature will make the model answer more creatively. (Default: 0.8) | float | temperature 0.7 |
| seed | Sets the random number seed to use for generation. Setting this to a specific number will make the model generate the same text for the same prompt. (Default: 0) | int | seed 42 |
| stop | Sets the stop sequences to use. When this pattern is encountered the LLM will stop generating text and return. Multiple stop patterns may be set by specifying multiple separate `stop` parameters in a modelfile. | string | stop "AI assistant:" |
| num_predict | Maximum number of tokens to predict when generating text. (Default: -1, infinite generation) | int | num_predict 42 |
| top_k | Reduces the probability of generating nonsense. A higher value (e.g. 100) will give more diverse answers, while a lower value (e.g. 10) will be more conservative. (Default: 40) | int | top_k 40 |
| top_p | Works together with top-k. A higher value (e.g., 0.95) will lead to more diverse text, while a lower value (e.g., 0.5) will generate more focused and conservative text. (Default: 0.9) | float | top_p 0.9 |
| min_p | Alternative to top_p, and aims to ensure a balance of quality and variety. The parameter *p* represents the minimum probability for a token to be considered, relative to the probability of the most likely token. For example, with *p*=0.05 and the most likely token having a probability of 0.9, logits with a value less than 0.045 are filtered out. (Default: 0.0) | float | min_p 0.05 |
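The `min_p` filtering threshold in the row above can be checked with quick arithmetic: the cutoff is `min_p` times the probability of the most likely token.

```shell
# threshold = min_p * P(top token); tokens below it are filtered out
python3 -c 'min_p = 0.05; p_top = 0.9; print(round(min_p * p_top, 3))'
# prints: 0.045
```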
### TEMPLATE
`TEMPLATE` specifies the full prompt template to be passed into the model. It may optionally include a system message, a user's message, and the response from the model. Note: syntax may be model specific. Templates use Go [template syntax](https://pkg.go.dev/text/template).
#### Template Variables
| Variable | Description |
| ----------------- | --------------------------------------------------------------------------------------------- |
| `{{ .System }}` | The system message used to specify custom behavior. |
| `{{ .Prompt }}` | The user prompt message. |
| `{{ .Response }}` | The response from the model. When generating a response, text after this variable is omitted. |
```
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
```
### SYSTEM
The `SYSTEM` instruction specifies the system message to be used in the template, if applicable.
```
SYSTEM """<system message>"""
```
### ADAPTER
The `ADAPTER` instruction specifies a fine-tuned LoRA adapter that should apply to the base model. The value of the adapter should be an absolute path or a path relative to the Modelfile. The base model should be specified with a `FROM` instruction. If the base model is not the same as the base model that the adapter was tuned from, the behavior will be erratic.
#### Safetensor adapter
```
ADAPTER <path to safetensor adapter>
```
Currently supported Safetensor adapters:
- Llama (including Llama 2, Llama 3, and Llama 3.1)
- Mistral (including Mistral 1, Mistral 2, and Mixtral)
- Gemma (including Gemma 1 and Gemma 2)
#### GGUF adapter
```
ADAPTER ./ollama-lora.gguf
```
### LICENSE
The `LICENSE` instruction allows you to specify the legal license under which the model used with this Modelfile is shared or distributed.
```
LICENSE """
<license text>
"""
```
### MESSAGE
The `MESSAGE` instruction allows you to specify a message history for the model to use when responding. Use multiple iterations of the MESSAGE command to build up a conversation which will guide the model to answer in a similar way.
```
MESSAGE <role> <message>
```
#### Valid roles
| Role | Description |
| --------- | ------------------------------------------------------------ |
| system | Alternate way of providing the SYSTEM message for the model. |
| user | An example message of what the user could have asked. |
| assistant | An example message of how the model should respond. |
#### Example conversation
```
MESSAGE user Is Toronto in Canada?
MESSAGE assistant yes
MESSAGE user Is Sacramento in Canada?
MESSAGE assistant no
MESSAGE user Is Ontario in Canada?
MESSAGE assistant yes
```
## Notes
- The **`Modelfile` is not case sensitive**. In the examples, uppercase instructions are used to make it easier to distinguish instructions from arguments.
- Instructions can be in any order. In the examples, the `FROM` instruction is first to keep it easily readable.
---
title: n8n
---
## Install
Install [n8n](https://docs.n8n.io/choose-n8n/).
## Using Ollama Locally
1. In the top right corner, click the dropdown and select **Create Credential**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/n8n-credential-creation.png"
alt="Create a n8n Credential"
width="75%"
/>
</div>
2. Under **Add new credential** select **Ollama**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/n8n-ollama-form.png"
alt="Select Ollama under Credential"
width="75%"
/>
</div>
3. Confirm the **Base URL** is set to `http://localhost:11434` and click **Save**
<Note> If connecting to `http://localhost:11434` fails, use `http://127.0.0.1:11434`</Note>
4. When creating a new workflow, select **Add a first step**, then choose an **Ollama node**
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/n8n-chat-node.png"
alt="Add a first step with Ollama node"
width="75%"
/>
</div>
5. Select your model of choice (e.g. `qwen3-coder`)
<div style={{ display: 'flex', justifyContent: 'center' }}>
<img
src="/images/n8n-models.png"
alt="Set up Ollama credentials"
width="75%"
/>
</div>
## Connecting to ollama.com
1. Create an [API key](https://ollama.com/settings/keys) on **ollama.com**.
2. In n8n, click **Create Credential** and select **Ollama**
3. Set the **API URL** to `https://ollama.com`
4. Enter your **API Key** and click **Save**
---
title: Quickstart
---
This quickstart will walk you through running your first model with Ollama. To get started, download Ollama for macOS, Windows, or Linux.
<a
href="https://ollama.com/download"
target="_blank"
className="inline-block px-6 py-2 bg-black rounded-full dark:bg-neutral-700 text-white font-normal border-none"
>
Download Ollama
</a>
## Run a model
<Tabs>
<Tab title="CLI">
Open a terminal and run the command:
```
ollama run gemma3
```
</Tab>
<Tab title="cURL">
Start by downloading a model:
```
ollama pull gemma3
```
Then chat with the model:
```shell
curl http://localhost:11434/api/chat -d '{
"model": "gemma3",
"messages": [{
"role": "user",
"content": "Hello there!"
}],
"stream": false
}'
```
</Tab>
<Tab title="Python">
Start by downloading a model:
```
ollama pull gemma3
```
Then install Ollama's Python library:
```
pip install ollama
```
Lastly, chat with the model:
```python
from ollama import chat
from ollama import ChatResponse
response: ChatResponse = chat(model='gemma3', messages=[
{
'role': 'user',
'content': 'Why is the sky blue?',
},
])
print(response['message']['content'])
# or access fields directly from the response object
print(response.message.content)
```
</Tab>
<Tab title="JavaScript">
Start by downloading a model:
```
ollama pull gemma3
```
Then install the Ollama JavaScript library:
```
npm i ollama
```
Lastly, chat with the model:
```javascript
import ollama from 'ollama'
const response = await ollama.chat({
model: 'gemma3',
messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
```
</Tab>
</Tabs>
See a full list of available models [here](https://ollama.com/models).
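With `"stream": false`, the `/api/chat` endpoint returns a single JSON object whose reply text lives at `message.content`. A sketch of extracting it from the shell (the sample response below is illustrative, not a real model reply):

```shell
# Parse a sample non-streaming /api/chat response and print the assistant's reply
response='{"model":"gemma3","message":{"role":"assistant","content":"Hello there!"},"done":true}'
echo "$response" | python3 -c 'import json, sys; print(json.load(sys.stdin)["message"]["content"])'
# prints: Hello there!
```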
---
title: Text generation
---
---
title: Vision
description: Provide images to models
---