Commit fb77db30 authored by Timothy J. Baek

doc: external ollama server usage updated

parent 882f1732
After installing, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
### Using Docker 🐳

If Ollama is hosted on your local machine, run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following commands:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ollama-webui
```

Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
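A note on the `--add-host=host.docker.internal:host-gateway` flag used above: it maps the name `host.docker.internal` to the host's gateway inside the container, which is how the containerized Web UI can reach an Ollama instance running on the host. A minimal sketch of the base URL this yields (the port is Ollama's default; the `/api` path matches the API URLs used elsewhere in this guide):

```shell
# The alias below is resolvable inside the container only because of --add-host;
# combined with Ollama's default port it gives the API base URL the UI talks to.
OLLAMA_HOST_ALIAS="host.docker.internal"
echo "http://${OLLAMA_HOST_ALIAS}:11434/api"
# → http://host.docker.internal:11434/api
```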
### Accessing Ollama on a Different Server hosted over LAN (or Network)

#### Prerequisites

If you want to access an external Ollama Server hosted over LAN (or Network), for example, from your cloud server, run Ollama using the following command:

```bash
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```

If you encounter errors running this command, make sure to stop any existing Ollama service that may be running in the background before retrying.

If you're running Ollama via Docker:

```bash
docker run -d -v ollama:/root/.ollama -p 11434:11434 -e OLLAMA_ORIGINS="*" --name ollama ollama/ollama
```
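If `ollama serve` fails because the port is already in use, an Ollama instance is most likely still running in the background. A hedged sketch of how you might stop it before retrying — the right method depends on how Ollama was installed:

```shell
# Stop a manually started instance, if one exists (harmless no-op otherwise):
pkill -f "ollama serve" || true

# On a Linux install managed by systemd you would instead run (assumption):
# sudo systemctl stop ollama
```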
#### Installing Ollama Web UI

Change `OLLAMA_API_BASE_URL` to match the external Ollama Server URL:

```bash
docker build --build-arg OLLAMA_API_BASE_URL='https://example.com/api' -t ollama-webui .
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```
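For instance, if the external Ollama server prepared in the prerequisites step is reachable at a LAN address (the address below is purely an assumption for illustration), the value passed to the build argument is that address plus the `/api` path:

```shell
# Assumed LAN address of the machine running "ollama serve":
OLLAMA_SERVER="http://192.168.1.50:11434"
OLLAMA_API_BASE_URL="${OLLAMA_SERVER}/api"
echo "$OLLAMA_API_BASE_URL"
# → http://192.168.1.50:11434/api

# That value then goes into the build:
# docker build --build-arg OLLAMA_API_BASE_URL="$OLLAMA_API_BASE_URL" -t ollama-webui .
```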
## How to Build for Static Deployment
1. Clone & Enter the project
```sh
git clone https://github.com/ollama-webui/ollama-webui.git
pushd ./ollama-webui/
```
2. Create and edit `.env`
```sh
cp -RPp example.env .env
```
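As a hedged illustration of what the copied file might then contain — `PUBLIC_API_BASE_URL` is the variable referenced in the build step of this guide; the value shown is an assumption for a local setup, and the real `example.env` may hold additional settings:

```sh
# Illustrative .env contents (value is an assumption, not a shipped default):
PUBLIC_API_BASE_URL='http://localhost:11434/api'
```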
3. Install node dependencies
```sh
npm i
```
4. Run in dev mode, or build the site for deployment
- Test in Dev mode:
```sh
npm run dev
```
- Build for Deploy:
```sh
# `PUBLIC_API_BASE_URL` will overwrite the value in `.env`
PUBLIC_API_BASE_URL='https://example.com/api' npm run build
```
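The overriding behaviour noted in the comment above can be sketched in isolation: a variable prefixed to a single command takes precedence over one exported earlier (the exported value stands in for what `.env` would provide):

```shell
# Simulate a value that .env would have provided:
export PUBLIC_API_BASE_URL='http://from-dotenv/api'

# A per-command value wins for that invocation, as in the build command above:
PUBLIC_API_BASE_URL='https://example.com/api' sh -c 'echo "$PUBLIC_API_BASE_URL"'
# → https://example.com/api
```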
5. Test the build with `caddy` (or the server of your choice) 5. Test the build with `caddy` (or the server of your choice)