"...git@developer.sourcefind.cn:chenpangpang/open-webui.git" did not exist on "e08bf88e724b6c431bdb8d660cd9a963d2bb1678"
Commit 9ddde1f8 authored by Timothy J. Baek

doc: setup instructions updated

parent eff48d7e
@@ -57,13 +57,9 @@ ChatGPT-Style Web Interface for Ollama 🦙
## How to Install 🚀
### Installing Both Ollama and Ollama Web UI Using Docker Compose

If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
```bash
docker compose up --build
```
@@ -71,13 +67,19 @@ docker compose up --build
This command will install both Ollama and Ollama Web UI on your system. Be sure to modify the `compose.yaml` file to enable GPU support if needed.
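The exact GPU changes depend on your setup, but for reference, the official Ollama image can also be started on its own with Docker's `--gpus` flag. A minimal sketch, assuming an NVIDIA GPU with the NVIDIA Container Toolkit installed on the host; this command is not part of this repository's Compose setup:

```bash
# Sketch (assumption, not from the original doc): run the official Ollama
# image with all NVIDIA GPUs exposed. Requires the NVIDIA Container Toolkit.
#   -v ollama:/root/.ollama   persists downloaded models in a named volume
#   -p 11434:11434            publishes Ollama's default API port
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```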
### Installing Ollama Web UI Only
#### Prerequisites
Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).
##### Checking Ollama
After installing Ollama, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
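Alternatively, you can check from a terminal with `curl`. A minimal sketch; on a default install, Ollama's root endpoint replies with a short status message:

```bash
# Quick check from a terminal; expects a short status message
# ("Ollama is running") if the server is up on the default port.
curl http://127.0.0.1:11434/
```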
#### Using Docker 🐳
If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:
```bash
# -d: run detached; -p 3000:8080: expose the UI on host port 3000;
# --add-host maps host.docker.internal to the host gateway so the
# container can reach Ollama running on your local machine.
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
@@ -92,7 +94,7 @@ docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name o
Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over LAN (or Network). Enjoy! 😄
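As a quick sanity check from a terminal (a sketch, not part of the original instructions):

```bash
# Print the HTTP status code; expect 200 once the container has started.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000
```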
#### Accessing External Ollama on a Different Server
Change the `OLLAMA_API_BASE_URL` environment variable to match the external Ollama Server URL:
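For example, a sketch of the same `docker run` command pointed at a remote server; `https://example.com/api` is a placeholder, so adjust it to wherever your Ollama API is actually exposed:

```bash
# Placeholder URL (assumption for illustration): replace
# https://example.com/api with your external Ollama server's API base URL.
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```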
...