# FAQ

## How can I upgrade Ollama?

To upgrade Ollama, run the installation process again. On the Mac, click the Ollama icon in the menubar and choose the restart option if an update is available.

## How can I view the logs?

Review the [Troubleshooting](./troubleshooting.md) docs for more about using logs.

## How do I use Ollama server environment variables on Mac?

On macOS, Ollama runs in the background and is managed by the menubar app. To set environment variables, you will need to run Ollama manually instead:

1. Click the menubar icon for Ollama and choose **Quit Ollama**.
2. Open a new terminal window and run the following command (this example uses `OLLAMA_HOST` with an IP address of `123.1.1.1`):

   ```bash
   OLLAMA_HOST=123.1.1.1 ollama serve
   ```

## How do I use Ollama server environment variables on Linux?

If Ollama was installed with the install script, a `systemd` service was created, running as the `ollama` user. To add an environment variable, such as `OLLAMA_HOST`, follow these steps:

1. Create a `systemd` drop-in directory and add a config file. This is only needed once.

   ```bash
   mkdir -p /etc/systemd/system/ollama.service.d
   echo '[Service]' >>/etc/systemd/system/ollama.service.d/environment.conf
   ```

2. For each environment variable, add it to the config file:

   ```bash
   echo 'Environment="OLLAMA_HOST=0.0.0.0:11434"' >>/etc/systemd/system/ollama.service.d/environment.conf
   ```

3. Reload `systemd` and restart Ollama:

   ```bash
   systemctl daemon-reload
   systemctl restart ollama
   ```

## How can I expose Ollama on my network?

Ollama binds to `127.0.0.1` port `11434` by default. Change the bind address with the `OLLAMA_HOST` environment variable. Refer to the section above for how to use environment variables on your platform.
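For example, to make Ollama listen on all interfaces (adjust the address to suit your network):

```shell
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

Clients on the network can then reach the API at `http://<your-machine-ip>:11434`.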

## How can I allow additional web origins to access Ollama?

Ollama allows cross-origin requests from `127.0.0.1` and `0.0.0.0` by default. Add additional origins with the `OLLAMA_ORIGINS` environment variable. For example, to add all ports on `192.168.1.1` and `https://example.com`, use:

```shell
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com
```

Refer to the section above for how to use environment variables on your platform.
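Putting the pieces together, a one-off run from a terminal might look like this (the origins shown are examples):

```shell
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com ollama serve
```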

## Where are models stored?

- macOS: `~/.ollama/models`
- Linux: `/usr/share/ollama/.ollama/models`

See [the CLI Documentation](./cli.md) for more on this.

## How do I set models to a different location?

If a different directory needs to be used, set the environment variable `OLLAMA_MODELS` to the chosen directory. Refer to the section above for how to use environment variables on your platform.
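For example, assuming a hypothetical `/data/ollama/models` directory:

```shell
OLLAMA_MODELS=/data/ollama/models ollama serve
```

On Linux, make sure the user the service runs as has read and write access to the chosen directory.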

## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?

No, Ollama runs entirely locally, and conversation data will never leave your machine.

## How can I use Ollama in Visual Studio Code?

There is already a large collection of plugins available for VSCode as well as other editors that leverage Ollama. See the list of [extensions & plugins](https://github.com/jmorganca/ollama#extensions--plugins) at the bottom of the main repository readme.

## How do I use Ollama behind a proxy?

Ollama is compatible with proxy servers if `HTTP_PROXY` or `HTTPS_PROXY` is configured. When using either variable, ensure it is set where `ollama serve` can access the value. When using `HTTPS_PROXY`, ensure the proxy certificate is installed as a system certificate. Refer to the section above for how to use environment variables on your platform.
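As a sketch, assuming a proxy at `https://proxy.example.com`:

```shell
HTTPS_PROXY=https://proxy.example.com ollama serve
```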

### How do I use Ollama behind a proxy in Docker?

The Ollama Docker container image can be configured to use a proxy by passing `-e HTTPS_PROXY=https://proxy.example.com` when starting the container.
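For example (the proxy address is illustrative):

```shell
docker run -d -e HTTPS_PROXY=https://proxy.example.com -p 11434:11434 ollama/ollama
```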

Alternatively, the Docker daemon can be configured to use a proxy. Instructions are available for Docker Desktop on [macOS](https://docs.docker.com/desktop/settings/mac/#proxies), [Windows](https://docs.docker.com/desktop/settings/windows/#proxies), and [Linux](https://docs.docker.com/desktop/settings/linux/#proxies), and Docker [daemon with systemd](https://docs.docker.com/config/daemon/systemd/#httphttps-proxy).

Ensure the certificate is installed as a system certificate when using HTTPS. This may require building a new Docker image when using a self-signed certificate.

```dockerfile
FROM ollama/ollama
COPY my-ca.pem /usr/local/share/ca-certificates/my-ca.crt
RUN update-ca-certificates
```

Build and run this image:

```shell
docker build -t ollama-with-ca .
docker run -d -e HTTPS_PROXY=https://my.proxy.example.com -p 11434:11434 ollama-with-ca
```

## How do I use Ollama with GPU acceleration in Docker?

The Ollama Docker container can be configured with GPU acceleration in Linux or Windows (with WSL2). This requires the [nvidia-container-toolkit](https://github.com/NVIDIA/nvidia-container-toolkit). See [ollama/ollama](https://hub.docker.com/r/ollama/ollama) for more details.
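Once the toolkit is installed, the container can be started with GPU access, for example:

```shell
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```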

GPU acceleration is not available for Docker Desktop on macOS due to the lack of GPU passthrough and emulation.

## Why is networking slow in WSL2 on Windows 10?

This can impact both installing Ollama and downloading models.

Open `Control Panel > Networking and Internet > View network status and tasks` and click `Change adapter settings` on the left panel. Find the `vEthernet (WSL)` adapter, right-click it and select `Properties`.
Click `Configure` and open the `Advanced` tab. Search through each of the properties until you find `Large Send Offload Version 2 (IPv4)` and `Large Send Offload Version 2 (IPv6)`. *Disable* both of these
properties.
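The same change can also be applied from an elevated PowerShell prompt; the cmdlet below assumes the adapter is named `vEthernet (WSL)` as described above:

```powershell
# Disable Large Send Offload v2 for IPv4 and IPv6 on the WSL virtual adapter
Set-NetAdapterLso -Name "vEthernet (WSL)" -IPv4Enabled $false -IPv6Enabled $false
```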