Unverified Commit 25e85c0e authored by Yasushiko, committed by GitHub

Update TROUBLESHOOTING.md

parent 54555ce8
...@@ -46,7 +46,7 @@ docker run --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://
```
## Running ollama-webui as a container on WSL Ubuntu
If you're running ollama-webui via docker on WSL Ubuntu and have chosen to install webui and ollama separately, you might encounter connection issues. This is often due to the docker container being unable to reach the Ollama server at 127.0.0.1:11434. To resolve this, you can use the `--network=host` flag in the docker command. With `--network=host`, the port changes from 3000 to 8080, so the WebUI will be available at http://localhost:8080.
Here's an example of the command you should run:
...
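The command itself is collapsed in the diff above. As a rough sketch, assuming Ollama is listening on its default port 11434 on the same WSL host and that the `ghcr.io/ollama-webui/ollama-webui:main` image and `ollama-webui` volume names match the project's install instructions, it would look something like this:

```bash
# Host networking lets the container reach Ollama at 127.0.0.1:11434.
# Note there is no -p 3000:8080 mapping here, so the WebUI listens directly on port 8080.
# Image and volume names below are assumed from the project's install docs.
docker run -d --network=host \
  -v ollama-webui:/app/backend/data \
  -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api \
  --name ollama-webui \
  --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
```

After the container starts, open http://localhost:8080 rather than http://localhost:3000, since host networking bypasses the usual port mapping.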