This command will install both Ollama and Ollama Web UI on your system.
After installing, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
#### Accessing Ollama Web Interface over LAN

If you want to access the Ollama web interface over LAN, for example from your phone, run Ollama using the following command:

```bash
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```

If you encounter errors when running this command, make sure to stop any existing Ollama service that might be running in the background before retrying.
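Once `ollama serve` is bound to `0.0.0.0`, you can verify reachability from another device on the network. A minimal sketch — the IP address below is a placeholder for your machine's LAN address, so replace it with your own (find it with `ip addr` on Linux or `ipconfig` on Windows):

```shell
# Placeholder LAN address of the machine running `ollama serve` -- replace with yours.
OLLAMA_LAN_HOST="192.168.1.10"
OLLAMA_URL="http://${OLLAMA_LAN_HOST}:11434"

# Run this from the phone or laptop that should reach the server.
# A reachable server answers the root endpoint with "Ollama is running".
curl --silent --max-time 5 "${OLLAMA_URL}/" \
  || echo "Cannot reach ${OLLAMA_URL} -- check that your firewall allows port 11434"
```

If the check fails from a remote device but works locally, the usual culprit is a firewall rule blocking inbound connections on port 11434.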
If you're running Ollama via Docker:

```bash
docker run -d -v ollama:/root/.ollama -p 11434:11434 -e OLLAMA_ORIGINS="*" --name ollama ollama/ollama
```

### Using Docker 🐳

If Ollama is hosted on your local machine, run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following commands:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ollama-webui
```
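Either way, a quick sanity check that the container came up — a sketch assuming the container name `ollama-webui` from the `--name` flag above, and guarded so it does nothing harmful if Docker isn't installed on this machine:

```shell
WEBUI_URL="http://localhost:3000"

# List the container if it is running (no output means it is not up yet).
if command -v docker >/dev/null 2>&1; then
  docker ps --filter "name=ollama-webui" --format "{{.Names}}: {{.Status}}"
fi

# Then confirm something answers on the published port.
curl --silent --max-time 5 -o /dev/null "${WEBUI_URL}" \
  || echo "Nothing answering at ${WEBUI_URL} yet -- give the container a few seconds"
```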
Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000). Enjoy! 😄

### Accessing Ollama on a Different Server Hosted over LAN (or Network)

#### Prerequisites

If you want to access an external Ollama server hosted over LAN (or network), for example from your cloud server, run Ollama using the following command:

```bash
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```

If you encounter errors when running this command, make sure to stop any existing Ollama service that might be running in the background before retrying.
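Before pointing the Web UI at it, it's worth confirming the external Ollama server is reachable from the machine that will run the UI. A minimal sketch — the address below is a placeholder for your Ollama machine, not a real endpoint:

```shell
# Placeholder address of the external Ollama server -- replace with yours.
EXTERNAL_OLLAMA="http://192.168.1.10:11434"

# A reachable server answers the root endpoint with "Ollama is running".
curl --silent --max-time 5 "${EXTERNAL_OLLAMA}/" \
  || echo "Cannot reach ${EXTERNAL_OLLAMA} -- check OLLAMA_HOST and your firewall"
```

If this fails, recheck that `ollama serve` was started with `OLLAMA_HOST=0.0.0.0` and that port 11434 is open between the two machines.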