# Installing Ollama on Linux

> Note: A one-line installer for Ollama is available by running:
>
> ```bash
> curl https://ollama.ai/install.sh | sh
> ```

## Download the `ollama` binary

Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:

```bash
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
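
To confirm the download step put the binary somewhere on your `PATH`, a quick lookup with `command -v` works. This is only a sanity check, not part of the install:

```shell
# Sanity check: report where (or whether) the ollama binary is on PATH.
if command -v ollama >/dev/null 2>&1; then
    echo "ollama found at $(command -v ollama)"
else
    echo "ollama not found in PATH"
fi
```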

## Start Ollama

Start Ollama by running `ollama serve`:

```bash
ollama serve
```

Once Ollama is running, run a model in another terminal session:

```bash
ollama run llama2
```
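
The server also answers HTTP on port 11434 by default, so you can talk to it without the CLI. A minimal sketch of a generate request — the model name and prompt are only examples, and the reachability guard is our addition:

```shell
# Query the local Ollama HTTP API (default port 11434); fall back to a
# message when the server is not reachable.
if curl -s --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
    curl -s http://localhost:11434/api/generate \
        -d '{"model": "llama2", "prompt": "Why is the sky blue?"}'
else
    echo "Ollama server not reachable on localhost:11434"
fi
```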

## Install CUDA drivers (optional – for Nvidia GPUs)

[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

```bash
nvidia-smi
```

## Adding Ollama as a startup service (optional)

Create a user for Ollama:

```bash
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```
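
To verify the account was created, `getent` (standard on glibc systems) can look up its passwd entry. Again, purely a sanity check:

```shell
# Sanity check: look up the ollama service account in the passwd database.
if getent passwd ollama >/dev/null 2>&1; then
    echo "ollama user exists"
else
    echo "ollama user does not exist"
fi
```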

Create a service file in `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="HOME=/usr/share/ollama"

[Install]
WantedBy=default.target
```

Then start the service:

```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```

### Viewing logs

To view logs of Ollama running as a startup service, run:

```bash
journalctl -u ollama
```

## Uninstall

Remove the ollama service:
```bash
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```bash
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user:
```bash
sudo rm -r /usr/share/ollama
sudo userdel ollama
```