# Ollama Web UI: A User-Friendly Web Interface for Chat Interactions 👋

![GitHub stars](https://img.shields.io/github/stars/ollama-webui/ollama-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/ollama-webui/ollama-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/ollama-webui/ollama-webui?style=social)
![GitHub repo size](https://img.shields.io/github/repo-size/ollama-webui/ollama-webui)
![GitHub language count](https://img.shields.io/github/languages/count/ollama-webui/ollama-webui)
![GitHub top language](https://img.shields.io/github/languages/top/ollama-webui/ollama-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/ollama-webui/ollama-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-wbui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Ollama_Web_UI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)

ChatGPT-Style Web Interface for Ollama 🦙

**Disclaimer:** _ollama-webui is a community-driven project and is not affiliated with the Ollama team in any way. This initiative is independent, and any inquiries or feedback should be directed to [our community on Discord](https://discord.gg/5rJgQTnV4s). We kindly request users to refrain from contacting or harassing the Ollama team regarding this project._

![Ollama Web UI Demo](./demo.gif)

Also check out our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles for Ollama! 🦙🔍

## Features ⭐

- 🖥️ **Intuitive Interface**: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.

- 📱 **Responsive Design**: Enjoy a seamless experience on both desktop and mobile devices.

- ⚡ **Swift Responsiveness**: Enjoy fast and responsive performance.

- 🚀 **Effortless Setup**: Install seamlessly using Docker or Kubernetes (kubectl, kustomize or helm) for a hassle-free experience.

- 💻 **Code Syntax Highlighting**: Enjoy enhanced code readability with our syntax highlighting feature.

- ✒️🔢 **Full Markdown and LaTeX Support**: Elevate your LLM experience with comprehensive Markdown and LaTeX capabilities for enriched interaction.

- 📚 **Local RAG Integration**: Dive into the future of chat interactions with groundbreaking Retrieval Augmented Generation (RAG) support. This feature seamlessly integrates document interactions into your chat experience. You can load documents directly into the chat or add files to your document library, effortlessly accessing them using the '#' command in the prompt. This feature is in its alpha phase, so occasional issues may arise as we actively refine and enhance it to ensure optimal performance and reliability.

- 📜 **Prompt Preset Support**: Instantly access preset prompts using the '/' command in the chat input. Load predefined conversation starters effortlessly and expedite your interactions. Effortlessly import prompts through [OllamaHub](https://ollamahub.com/) integration.

- 👍👎 **RLHF Annotation**: Empower your messages by rating them with thumbs up and thumbs down, facilitating the creation of datasets for Reinforcement Learning from Human Feedback (RLHF). Utilize your messages to train or fine-tune models, all while ensuring the confidentiality of locally saved data.

- 📥🗑️ **Download/Delete Models**: Easily download or remove models directly from the web UI.

- ⬆️ **GGUF File Model Creation**: Effortlessly create Ollama models by uploading GGUF files directly from the web UI. Streamlined process with options to upload from your machine or download GGUF files from Hugging Face.

- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.

- 🔄 **Multi-Modal Support**: Seamlessly engage with models that support multimodal interactions, including images (e.g., LLaVA).

- 🧩 **Modelfile Builder**: Easily create Ollama modelfiles via the web UI. Create and add characters/agents, customize chat elements, and import modelfiles effortlessly through [OllamaHub](https://ollamahub.com/) integration.

- ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.

- 🤝 **OpenAI API Integration**: Effortlessly integrate OpenAI-compatible APIs for versatile conversations alongside Ollama models. Customize the API Base URL to link with **LMStudio, Mistral, OpenRouter, and more**.

- 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.

- 📜 **Chat History**: Effortlessly access and manage your conversation history.

- 📤📥 **Import/Export Chat History**: Seamlessly move your chat data in and out of the platform.

- 🗣️ **Voice Input Support**: Engage with your model through voice interactions; enjoy the convenience of talking to your model directly. Additionally, explore the option for sending voice input automatically after 3 seconds of silence for a streamlined experience.

- ⚙️ **Fine-Tuned Control with Advanced Parameters**: Gain a deeper level of control by adjusting parameters such as temperature and defining your system prompts to tailor the conversation to your specific preferences and needs.

- 🔗 **External Ollama Server Connection**: Seamlessly link to an external Ollama server hosted on a different address by configuring the `OLLAMA_API_BASE_URL` environment variable.

- 🔐 **Role-Based Access Control (RBAC)**: Ensure secure access with restricted permissions; only authorized individuals can access your Ollama, and exclusive model creation/pulling rights are reserved for administrators.

- 🔒 **Backend Reverse Proxy Support**: Bolster security through direct communication between the Ollama Web UI backend and Ollama. This key feature eliminates the need to expose Ollama over your LAN. Requests made to the '/ollama/api' route from the web UI are seamlessly redirected to Ollama by the backend, enhancing overall system security.

- 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.

## 🔗 Also Check Out OllamaHub!

Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles. OllamaHub offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! 🚀

## How to Install 🚀

🌟 **Important Note on User Roles and Privacy:**

- **Admin Creation:** The very first account to sign up on the Ollama Web UI will be granted **Administrator privileges**. This account will have comprehensive control over the platform, including user management and system settings.

- **User Registrations:** All subsequent users signing up will initially have their accounts set to **Pending** status by default. These accounts will require approval from the Administrator to gain access to the platform functionalities.

- **Privacy and Data Security:** We prioritize your privacy and data security above all. Please be reassured that all data entered into the Ollama Web UI is stored locally on your device. Our system is designed to be privacy-first, ensuring that no external requests are made, and your data does not leave your local environment. We are committed to maintaining the highest standards of data privacy and security, ensuring that your information remains confidential and under your control.

### Installing Ollama Web UI Only

#### Prerequisites

Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).

##### Checking Ollama

After installing Ollama, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
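
If you prefer the terminal, the same check can be done with `curl`:

```bash
# A healthy server answers at the root endpoint,
# typically with a short "Ollama is running" message.
curl http://127.0.0.1:11434/
```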

#### Using Docker 🐳

**Important:** When using Docker to install Ollama Web UI, make sure to include the `-v ollama-webui:/app/backend/data` flag in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.

If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
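
To confirm the container started correctly, you can inspect it with standard Docker commands (the `ollama-webui` container name comes from the `--name` flag above):

```bash
# Check that the container is running and port 3000 is mapped
docker ps --filter name=ollama-webui

# Follow the startup logs (Ctrl+C stops following, not the container)
docker logs -f ollama-webui
```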

Alternatively, if you prefer to build the container yourself, use the following command:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
```

Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over your LAN (or network). Enjoy! 😄

#### Accessing External Ollama on a Different Server

Change the `OLLAMA_API_BASE_URL` environment variable to match the external Ollama server's URL:

```bash
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
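
Before starting the web UI, you may want to verify that the external server is reachable from your machine. Ollama exposes an `/api/tags` endpoint that lists available models; substitute your real server for the `https://example.com/api` placeholder:

```bash
# Should return a JSON object listing the models on the external server
curl https://example.com/api/tags
```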

Alternatively, if you prefer to build the container yourself, use the following command:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
```

### Installing Both Ollama and Ollama Web UI

#### Using Docker Compose

If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:

```bash
docker compose up -d --build
```

This command will install both Ollama and Ollama Web UI on your system.
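
A quick way to confirm that both services came up is to check the stack's status (exact service names depend on the compose file):

```bash
# List the services in the stack and their current state
docker compose ps

# Tail the logs for all services in the stack
docker compose logs -f
```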

##### Enable GPU

Use the additional Docker Compose file designed to enable GPU support by running the following command:

```bash
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d --build
```
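
To verify that the GPU is actually visible inside the container, you can run `nvidia-smi` in the Ollama service; `ollama` below is an assumed service name, so adjust it to match the compose file:

```bash
# Should print the usual NVIDIA GPU table if passthrough works
docker compose exec ollama nvidia-smi
```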

##### Expose Ollama API outside the container stack

Deploy the service with an additional Docker Compose file designed for API exposure:

```bash
docker compose -f docker-compose.yaml -f docker-compose.api.yaml up -d --build
```
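
Once exposed, you can talk to Ollama directly from the host. Assuming the compose file publishes Ollama's default port 11434, a quick smoke test looks like this:

```bash
# List the models available through the exposed Ollama API
curl http://localhost:11434/api/tags
```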

#### Using Provided `run-compose.sh` Script (Linux)

The script also works on Windows under any Docker-enabled WSL2 Linux distribution (you have to enable WSL2 integration from Docker Desktop).

Simply run the following command to grant execute permission to the script:

```bash
chmod +x run-compose.sh
```

##### For a CPU-only container

```bash
./run-compose.sh
```

##### Enable GPU

To run a GPU-enabled container, you must have a GPU driver set up for Docker; this mostly works with NVIDIA GPUs, so refer to the official install guide: [nvidia-container-toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html).

**Warning:** GPU-enabled installation has only been tested on Linux with an NVIDIA GPU; full functionality is not guaranteed on Windows, macOS, or with a different GPU.

```bash
./run-compose.sh --enable-gpu
```

Note that both of the above commands use the latest production Docker image from the repository. To build the latest local version instead, append the `--build` flag, for example:

```bash
./run-compose.sh --enable-gpu --build
```

#### Using Alternative Methods (Kustomize or Helm)

See [INSTALLATION.md](/INSTALLATION.md) for information on installing with Kustomize or Helm, or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) for help.

## How to Install Without Docker

While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own.

### Project Components

The Ollama Web UI consists of two primary components: the frontend and the backend (which serves as a reverse proxy, handles static frontend files, and provides additional features). Both need to be running concurrently in the development environment.

> [!IMPORTANT]
> The backend is required for proper functionality.

### Requirements 📦

- 🐰 [Bun](https://bun.sh) >= 1.0.21 or 🐢 [Node.js](https://nodejs.org/en) >= 20.10
- 🐍 [Python](https://python.org) >= 3.11
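
You can quickly confirm that your toolchain meets these minimums before building:

```bash
# Check installed versions against the requirements above
node --version     # expect v20.10 or newer (or: bun --version, 1.0.21+)
python3 --version  # expect 3.11 or newer
```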

### Build and Install 🛠️

Run the following commands to install:

```sh
git clone https://github.com/ollama-webui/ollama-webui.git
cd ollama-webui/

# Copying required .env file
cp -RPp example.env .env

# Building Frontend Using Node
npm i
npm run build

# or Building Frontend Using Bun
# bun install
# bun run build

# Serving Frontend with the Backend
cd ./backend
pip install -r requirements.txt -U
sh start.sh
```

You should have the Ollama Web UI up and running at http://localhost:8080/. Enjoy! 😄

## Troubleshooting

See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for information on troubleshooting common issues, or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) for help.

## What's Next? 🚀

### Roadmap 📝

Here are some exciting tasks on our roadmap:

- 🌐 **Web Browsing Capability**: Experience the convenience of seamlessly integrating web content directly into your chat. Easily browse and share information without leaving the conversation.
- 🔄 **Function Calling**: Empower your interactions by running code directly within the chat. Execute functions and commands effortlessly, enhancing the functionality of your conversations.
- ⚙️ **Custom Python Backend Actions**: Empower your Ollama Web UI by creating or downloading custom Python backend actions. Unleash the full potential of your web interface with tailored actions that suit your specific needs, enhancing functionality and versatility.
- 🧠 **Long-Term Memory**: Witness the power of persistent memory in our agents. Enjoy conversations that feel continuous as agents remember and reference past interactions, creating a more cohesive and personalized user experience.
- 🧪 **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research.
- 📈 **User Study Tools**: Providing specialized tools, like heat maps and behavior tracking modules, to empower researchers in capturing and analyzing user behavior patterns with precision and accuracy.
- 📚 **Enhanced Documentation**: Elevate your setup and customization experience with improved, comprehensive documentation.

Feel free to contribute and help us make Ollama Web UI even better! 🙌

## Supporters ✨

A big shoutout to our amazing supporters who are helping to make this project possible! 🙏

### Platinum Sponsors 🤍

- We're looking for Sponsors!

### Acknowledgments

Special thanks to [Prof. Lawrence Kim @ SFU](https://www.lhkim.com/) and [Prof. Nick Vincent @ SFU](https://www.nickmvincent.com/) for their invaluable support and guidance in shaping this project into a research endeavor. Grateful for your mentorship throughout the journey! 🙌

## License 📜

This project is licensed under the [MIT License](LICENSE) - see the [LICENSE](LICENSE) file for details. 📄

## Support 💬

If you have any questions, suggestions, or need assistance, please open an issue or join our
[Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) or
[Ollama Discord community](https://discord.gg/ollama) to connect with us! 🤝

---

Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Ollama Web UI even more amazing together! 💪