"sgl-kernel/vscode:/vscode.git/clone" did not exist on "311de47bb7a8baf646f58a472ea1b6712bd51ff6"
faq.md 3.26 KB
Newer Older
1
2
# FAQ

## How can I view the logs?

On macOS:

```
cat ~/.ollama/logs/server.log
```

On Linux:

```
journalctl -u ollama
```

If you're running `ollama serve` directly, the logs will be printed to the console.
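
If you want to keep a copy of those console logs, one option is to redirect them to a file with standard shell redirection (a generic sketch; `ollama.log` is an arbitrary file name):

```bash
# Print server output to the console and also write it to ollama.log
ollama serve 2>&1 | tee ollama.log
```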

## How can I expose Ollama on my network?

Ollama binds to `127.0.0.1` on port `11434` by default. Change the bind address with the `OLLAMA_HOST` environment variable.

On macOS:

```bash
OLLAMA_HOST=0.0.0.0:11435 ollama serve
```

On Linux:

Create a `systemd` drop-in directory and set `Environment=OLLAMA_HOST`:

```bash
mkdir -p /etc/systemd/system/ollama.service.d
echo "[Service]" >>/etc/systemd/system/ollama.service.d/environment.conf
echo "Environment=OLLAMA_HOST=0.0.0.0:11434" >>/etc/systemd/system/ollama.service.d/environment.conf
```

Reload `systemd` and restart Ollama:

```bash
systemctl daemon-reload
systemctl restart ollama
```
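
To verify the server is reachable over the network, you can query it from another machine (a quick sanity check; replace `<server-ip>` with your host's address, and use whatever port you configured above; `/api/tags` lists the models available locally):

```bash
curl http://<server-ip>:11434/api/tags
```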

## How can I allow additional web origins to access Ollama?

Ollama allows cross-origin requests from `127.0.0.1` and `0.0.0.0` by default. Add additional origins with the `OLLAMA_ORIGINS` environment variable:

On macOS:

```bash
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com ollama serve
```

On Linux:

```bash
echo "Environment=OLLAMA_ORIGINS=http://129.168.1.1:*,https://example.com" >>/etc/systemd/system/ollama.service.d/environment.conf
```

Reload `systemd` and restart Ollama:

```bash
systemctl daemon-reload
systemctl restart ollama
```
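
As a rough check that the new origin is allowed, you can send a request with an `Origin` header and inspect the response headers (the exact CORS headers returned may vary by version):

```bash
curl -i -H "Origin: https://example.com" http://localhost:11434/api/tags
```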

## Where are models stored?

- macOS: Raw model data is stored under `~/.ollama/models`.
- Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`.

Below the models directory you will find a structure similar to the following:

```shell
.
├── blobs
└── manifests
   └── registry.ollama.ai
      ├── f0rodo
      ├── library
      ├── mattw
      └── saikatkumardey
```

There is a `manifests/registry.ollama.ai/namespace` path. In the example above, the user has downloaded models from the official `library` namespace as well as from the `f0rodo`, `mattw`, and `saikatkumardey` user namespaces. Within each of those directories is a directory for each downloaded model, and within each model directory is a file named for each tag. Each tag file is the manifest for the model.

The manifest lists all the layers used in this model. You will see a `media type` for each layer, along with a digest. That digest corresponds to a file in the `models/blobs` directory.
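
For example, you can pretty-print a manifest to see its layers (a sketch assuming `jq` is installed and a hypothetical `library/llama2` model with the `latest` tag has been downloaded; adjust the path to match your models):

```bash
# macOS path shown; on Linux, start from /usr/share/ollama/.ollama/models
cat ~/.ollama/models/manifests/registry.ollama.ai/library/llama2/latest | jq .
```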

### How can I change where Ollama stores models?

To modify where models are stored, you can use the `OLLAMA_MODELS` environment variable. Note that on Linux this means defining `OLLAMA_MODELS` in a drop-in file under `/etc/systemd/system/ollama.service.d`, reloading `systemd`, and restarting the Ollama service, as sketched below.
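
A minimal sketch on Linux, following the same drop-in pattern shown above (the model path is a placeholder):

```bash
echo "Environment=OLLAMA_MODELS=/path/to/models" >>/etc/systemd/system/ollama.service.d/environment.conf
systemctl daemon-reload
systemctl restart ollama
```

On macOS, the variable can be set when starting the server directly:

```bash
OLLAMA_MODELS=/path/to/models ollama serve
```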

## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?

No. Anything you do with Ollama, such as generating a response from the model, stays with you. We don't collect any data about how you use the model. You are always in control of your own data.

## How can I use Ollama in Visual Studio Code?

There is already a large collection of plugins that leverage Ollama, for VSCode as well as other editors. You can see the list of [extensions & plugins](https://github.com/jmorganca/ollama#extensions--plugins) at the bottom of the main repository README.