Commit 7bdef561 authored by Timothy J. Baek

fix: docker container volume mount location

parent 21c7f507
@@ -112,14 +112,14 @@ After installing Ollama, verify that Ollama is running by accessing the following
If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:
```bash
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
Alternatively, if you prefer to build the container yourself, use the following command:
```bash
docker build -t ollama-webui .
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend --name ollama-webui --restart always ollama-webui
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
```
Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over LAN (or Network). Enjoy! 😄
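Once the container from either command above is running, a quick sanity check can confirm that the named volume backing `/app/backend/data` was created and that the UI answers on the published port. A minimal sketch, assuming the default names and ports from the commands above:

```bash
# Show where Docker stores the named volume that now backs /app/backend/data
docker volume inspect ollama-webui

# Confirm the web UI responds on the published port
curl -I http://localhost:3000
```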
@@ -129,14 +129,14 @@ Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000)
Change `OLLAMA_API_BASE_URL` environment variable to match the external Ollama Server url:
```bash
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
Alternatively, if you prefer to build the container yourself, use the following command:
```bash
docker build -t ollama-webui .
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend --name ollama-webui --restart always ollama-webui
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
```
## How to Install Without Docker
......
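When pointing the container at an external Ollama server as in the `OLLAMA_API_BASE_URL` command above, it can be worth confirming the URL is reachable before starting the UI. A minimal sketch, assuming the placeholder `https://example.com/api` stands in for your real base URL and that the server exposes the standard Ollama `/api/tags` endpoint:

```bash
# Should return a JSON list of the models available on the external server
curl https://example.com/api/tags
```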
from peewee import *
-DB = SqliteDatabase("./ollama.db")
+DB = SqliteDatabase("./data/ollama.db")
DB.connect()
+dir for backend files (db, documents, etc.)
\ No newline at end of file
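Because the SQLite path above is now relative (`./data/ollama.db`), running the backend outside Docker requires that directory to exist in the backend's working directory before the database is opened. A minimal sketch, assuming you start the backend from its own directory (the project's actual start command is not shown here):

```bash
# Create the directory that will hold the SQLite database and other backend files
mkdir -p data
```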
@@ -19,7 +19,7 @@ services:
    image: ollama-webui:latest
    container_name: ollama-webui
    volumes:
-      - ollama-webui:/app/backend
+      - ollama-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
......
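If you run the stack through Docker Compose instead of the plain `docker run` commands, recreating the services picks up the new mount point. A minimal sketch, assuming the compose file also declares `ollama-webui` as a top-level named volume (that part is not shown in this hunk):

```bash
# Rebuild the image and recreate the containers with the updated volume mapping
docker compose up -d --build

# Print the effective merged configuration if the mount does not look right
docker compose config
```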
docker stop ollama-webui || true
docker rm ollama-webui || true
docker build -t ollama-webui .
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend --name ollama-webui --restart always ollama-webui
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
docker image prune -f
\ No newline at end of file
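After the script above rebuilds and restarts the container, the logs are the quickest way to confirm the backend came up with the relocated data directory. A minimal check; the exact log output depends on the application:

```bash
# Follow the container logs (Ctrl+C stops following, not the container)
docker logs -f ollama-webui

# Confirm the container is still running under its restart policy
docker ps --filter name=ollama-webui --format "{{.Names}}: {{.Status}}"
```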