Commit 92119de9 authored by Michael Yang

update linux.md

parent 53b0ba8d

# Ollama on Linux

## Install

Install Ollama by running this one-liner:

```bash
curl https://ollama.ai/install.sh | sh
```
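
With the server running (the install script typically sets Ollama up as a service, and it can also be started manually with `ollama serve`), a quick way to confirm the install worked is to run a model. `llama2` is used purely as an example model name here:

```bash
# Start an interactive session with a model to verify the install.
# "llama2" is just an example; any model from the Ollama library works the same way.
ollama run llama2
```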
## Manual install

### Download the `ollama` binary

Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:

```bash
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
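
Before moving on, it can be worth confirming that the binary is actually on your PATH and executable; this is a generic shell check rather than a step from the guide itself:

```bash
# Confirm the binary is on PATH and has the executable bit set
which ollama
ls -l /usr/bin/ollama
```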
### Adding Ollama as a startup service (recommended)

Create a user for Ollama, and create a service file for it in `/etc/systemd/system/ollama.service` that runs the server as that user and restarts it automatically (a fuller sketch of the unit file follows this section):

```ini
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```

Then reload systemd and enable the service so it starts on boot:

```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
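
Only a few lines of the service file are visible above. As a rough sketch, assuming the binary lives at `/usr/bin/ollama` and is started with `ollama serve` (both used elsewhere in this guide), the user-creation step and a complete unit file could look roughly like this; the `useradd` flags, the `[Unit]` section, and the `Environment` line are illustrative assumptions, while the remaining `[Service]` and `[Install]` values are the ones shown above:

```bash
# Illustrative only: create a dedicated system user whose home directory
# will hold downloaded models (the uninstall step below removes /usr/share/ollama).
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```

```ini
# /etc/systemd/system/ollama.service -- a sketch, not the verbatim file from this guide.
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=default.target
```

After writing the file, the `daemon-reload` and `enable` commands above register it.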
### Install CUDA drivers (optional – for Nvidia GPUs)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```bash
nvidia-smi
```
### Start Ollama
Start Ollama using `systemd`:
```bash
sudo systemctl start ollama
```
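
Standard systemd tooling can be used to confirm the service came up; this check is a convenience addition rather than part of the guide:

```bash
# Show whether the ollama service is active, plus its most recent log lines
sudo systemctl status ollama
```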
## Update
Update ollama by running the install script again:
```bash
curl https://ollama.ai/install.sh | sh
```
Or by downloading the ollama binary:
```bash
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
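
If Ollama is installed as a systemd service (as set up above), restart it after replacing the binary so the new version is actually the one serving requests:

```bash
# Restart the service so it picks up the freshly downloaded binary
sudo systemctl restart ollama
```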
## Viewing logs

To view logs of Ollama running as a startup service, run:

```bash
journalctl -u ollama
```
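
journalctl's usual flags apply here as well; for example, its standard `--follow` option streams new log lines as they are written, which is handy while debugging a model load:

```bash
# Stream the service's logs live (equivalent to: journalctl -u ollama --follow)
journalctl -u ollama -f
```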
## Uninstall

Remove the ollama service:

```bash
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):

```bash
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user:

```bash
sudo rm -r /usr/share/ollama
sudo userdel ollama
```
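
Depending on the distribution, `useradd` may also have created a matching `ollama` group that `userdel` leaves behind; removing it is an optional, assumed extra step rather than part of the steps above:

```bash
# Optional: remove the dedicated group if it still exists after deleting the user
sudo groupdel ollama
```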