# Installing Ollama on Linux

> Note: A one-line installer for Ollama is available by running:
>
> ```
> curl https://ollama.ai/install.sh | sh
> ```

## Download the `ollama` binary

Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:

```
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
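As a quick sanity check, you can confirm the binary landed in a directory on your PATH and is executable. A minimal sketch using POSIX `command -v`:

```
# Sanity check: confirm the ollama binary is on PATH and executable
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found at $(command -v ollama)"
else
  echo "ollama not found on PATH" >&2
fi
```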

## Start Ollama

Start Ollama by running `ollama serve`:

```
ollama serve
```
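By default the server listens on `127.0.0.1:11434`. A quick way to check that it is responding, sketched with `curl` and assuming the default port:

```
# Probe the default Ollama port; prints a status line either way
if curl -fsS --max-time 2 http://127.0.0.1:11434/ >/dev/null 2>&1; then
  echo "Ollama server is reachable"
else
  echo "Ollama server is not reachable"
fi
```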

Once Ollama is running, run a model in another terminal session:

```
ollama run llama2
```
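Besides the interactive CLI, the running server exposes a REST API, so you can request a completion directly. A sketch against the `/api/generate` endpoint, assuming `ollama serve` is running locally and the `llama2` model has already been pulled:

```
# Request body for the generate endpoint
payload='{"model": "llama2", "prompt": "Why is the sky blue?"}'
# Send it to the local server; the fallback message keeps this safe to run
# even when the server is down
curl --max-time 5 http://127.0.0.1:11434/api/generate -d "$payload" \
  || echo "request failed; is 'ollama serve' running?"
```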

## Install CUDA drivers (optional – for Nvidia GPUs)

[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

```
nvidia-smi
```

## Adding Ollama as a startup service (optional)

Create a user for Ollama:

```
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```

Create a service file in `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="HOME=/usr/share/ollama"

[Install]
WantedBy=default.target
```
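If you later need to adjust the service, a systemd drop-in keeps your changes separate from the unit file above. For example, to make the server listen on all interfaces (this assumes your Ollama build reads the `OLLAMA_HOST` environment variable for its bind address), create `/etc/systemd/system/ollama.service.d/override.conf`:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

Run `sudo systemctl daemon-reload` and restart the service for the drop-in to take effect.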

Then reload systemd, and enable and start the service (`enable` alone only registers it for boot):

```
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```

### Viewing logs

To view logs of Ollama running as a startup service, run:

```
journalctl -u ollama
```

To follow the log as new lines arrive, add the `-f` flag: `journalctl -u ollama -f`.