OpenDAS / ollama · commit d2a784e3
Authored Sep 24, 2023 by Jeffrey Morgan
add `docs/linux.md`
parent 413a2e4f
1 changed file (docs/linux.md, new file mode 100644): +65, −0

docs/linux.md:
# Installing Ollama on Linux
> Note: A one-line installer for Ollama is available by running:
>
> ```
> curl https://ollama.ai/install.sh | sh
> ```
## Download the `ollama` binary
Ollama is distributed as a self-contained binary. Download it to a directory in your `PATH` and mark it executable (curl does not set the execute bit):
```
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
## Install CUDA drivers (optional for Nvidia GPUs)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```
nvidia-smi
```
## Add Ollama as a startup service
Create a user for Ollama:
```
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```
Create a service file in `/etc/systemd/system/ollama.service`:
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="HOME=/usr/share/ollama"

[Install]
WantedBy=default.target
```
Then reload systemd and enable and start the service (`enable` alone only registers it for the next boot; `--now` also starts it immediately):
```
sudo systemctl daemon-reload
sudo systemctl enable --now ollama
```
## Run a model
```
ollama run llama2
```
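Besides the interactive CLI, the server also listens for HTTP requests, by default on localhost port 11434. One way to exercise the API directly (a sketch, assuming the service from the previous section is running and the `llama2` model has been pulled):

```shell
# Send a generate request to the local Ollama server (default port 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```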