# Ollama Web UI: A User-Friendly Web Interface for Chat Interactions 👋

![GitHub stars](https://img.shields.io/github/stars/ollama-webui/ollama-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/ollama-webui/ollama-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/ollama-webui/ollama-webui?style=social)
![GitHub repo size](https://img.shields.io/github/repo-size/ollama-webui/ollama-webui)
![GitHub language count](https://img.shields.io/github/languages/count/ollama-webui/ollama-webui)
![GitHub top language](https://img.shields.io/github/languages/top/ollama-webui/ollama-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/ollama-webui/ollama-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-wbui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Ollama_Web_UI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)

ChatGPT-Style Web Interface for Ollama 🦙

![Ollama Web UI Demo](./demo.gif)

## Features ⭐

- 🖥️ **Intuitive Interface**: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.

- 📱 **Responsive Design**: Enjoy a seamless experience on both desktop and mobile devices.

- ⚡ **Swift Responsiveness**: Enjoy fast and responsive performance.

- 🚀 **Effortless Setup**: Install seamlessly using Docker for a hassle-free experience.

- 💻 **Code Syntax Highlighting**: Enjoy enhanced code readability with our syntax highlighting feature.

- ✒️🔢 **Full Markdown and LaTeX Support**: Elevate your LLM experience with comprehensive Markdown and LaTeX capabilities for enriched interaction.

- 📥🗑️ **Download/Delete Models**: Easily download or remove models directly from the web UI.

- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.

- ⚙️ **Multi-Model Conversations**: Effortlessly engage with multiple models simultaneously, harnessing their unique strengths for optimal responses.

- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.

- 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.

- 📜 **Chat History**: Effortlessly access and manage your conversation history.

- 📤📥 **Import/Export Chat History**: Seamlessly move your chat data in and out of the platform.

- 🗣️ **Voice Input Support**: Engage with your model through voice interactions and talk to it directly. You can also have voice input submitted automatically after 3 seconds of silence for a streamlined experience.

- ⚙️ **Fine-Tuned Control with Advanced Parameters**: Gain a deeper level of control by adjusting parameters such as temperature and defining your system prompts to tailor the conversation to your specific preferences and needs.

- 🔐 **Auth Header Support**: Effortlessly enhance security by adding Authorization headers to Ollama requests directly from the web UI settings, ensuring access to secured Ollama servers.

- 🔗 **External Ollama Server Connection**: Seamlessly link to an external Ollama server hosted on a different address by configuring the environment variable during the Docker build phase. You can also set the external server connection URL from the web UI after the build.

- 🔒 **Backend Reverse Proxy Support**: Strengthen security by enabling direct communication between the Ollama Web UI backend and Ollama, eliminating the need to expose Ollama over the LAN.

- 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.

## How to Install 🚀

### Installing Both Ollama and Ollama Web UI Using Docker Compose

If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:

```bash
docker compose up --build
```

This command will install both Ollama and Ollama Web UI on your system. Be sure to modify the `compose.yaml` file to enable GPU support if needed.
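
After the stack is up, you still need at least one model before you can start chatting. A minimal sketch of pulling one from the command line, assuming the Compose service that runs Ollama is named `ollama` (check your `compose.yaml`) and using `llama2` purely as an example model:

```bash
# Confirm both services are running
docker compose ps

# Pull an example model inside the Ollama service
# (the service name "ollama" is an assumption; adjust it to match your compose.yaml)
docker compose exec ollama ollama pull llama2
```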

### Installing Ollama Web UI Only

#### Prerequisites

Make sure you have the latest version of Ollama installed before proceeding. You can download it from [https://ollama.ai/](https://ollama.ai/).

##### Checking Ollama

After installing Ollama, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
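
You can also verify it from the command line; a quick sketch, assuming the default port `11434`:

```bash
# A running Ollama instance typically responds with "Ollama is running"
curl http://127.0.0.1:11434/

# List the models currently available to Ollama
curl http://127.0.0.1:11434/api/tags
```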

#### Using Docker 🐳

If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following command:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ollama-webui
```

Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over your local network. Enjoy! 😄
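
If the page does not load, two standard Docker commands can help confirm that the container started correctly:

```bash
# Check that the container is up and that port 3000 is mapped
docker ps --filter name=ollama-webui

# Follow the container logs for startup errors
docker logs -f ollama-webui
```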

#### Accessing External Ollama on a Different Server

Change the `OLLAMA_API_BASE_URL` environment variable to match the external Ollama server URL:

```bash
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following command:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ollama-webui
```
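
Before starting the container, you may want to confirm that the external Ollama server is reachable from your machine. A quick check, using the placeholder URL from the commands above:

```bash
# Should return a JSON list of the models available on the external Ollama server
curl https://example.com/api/tags
```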

## How to Build for Static Deployment

1. Clone & Enter the project

   ```sh
   git clone https://github.com/ollama-webui/ollama-webui.git
   pushd ./ollama-webui/
   ```

2. Create and edit `.env` (a sample of its contents is sketched after these steps)

   ```sh
   cp -RPp example.env .env
   ```

3. Install node dependencies

   ```sh
   npm i
   ```

4. Run in dev mode, or build the site for deployment

   - Test in Dev mode:

     ```sh
     npm run dev
     ```

   - Build for Deploy:

     ```sh
     #`PUBLIC_API_BASE_URL` will overwrite the value in `.env`
     PUBLIC_API_BASE_URL='https://example.com/api' npm run build
     ```

5. Test the build with `caddy` (or the server of your choice)

   ```sh
   curl https://webi.sh/caddy | sh

   PUBLIC_API_BASE_URL='https://localhost/api' npm run build
   caddy run --envfile .env --config ./Caddyfile.localhost
   ```
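
For reference, the `.env` file created in step 2 mainly needs to point the UI at your Ollama API. A minimal sketch of what it might contain (the exact contents of `example.env` may differ; the value below assumes Ollama running locally on its default port):

```sh
# Overridden at build time by PUBLIC_API_BASE_URL on the command line, as shown in step 4
PUBLIC_API_BASE_URL='http://localhost:11434/api'
```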

## Troubleshooting

See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for troubleshooting information, or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) for help.

## What's Next? 🚀

### To-Do List 📝

Here are some exciting tasks on our to-do list:

- 🔐 **Access Control**: Securely manage requests to Ollama by utilizing the backend as a reverse proxy gateway, ensuring only authenticated users can send specific requests.
- 🧪 **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research.
- 📈 **User Study Tools**: Provide specialized tools, such as heat maps and behavior-tracking modules, to help researchers capture and analyze user behavior patterns with precision and accuracy.
- 📚 **Enhanced Documentation**: Elevate your setup and customization experience with improved, comprehensive documentation.

Feel free to contribute and help us make Ollama Web UI even better! 🙌

## Supporters ✨

A big shoutout to our amazing supporters who are helping to make this project possible! 🙏

### Platinum Sponsors 🤍

- [Prof. Lawrence Kim @ SFU](https://www.lhkim.com/)

## License 📜

This project is licensed under the [MIT License](LICENSE) - see the [LICENSE](LICENSE) file for details. 📄

## Support 💬

If you have any questions or suggestions, or if you need assistance, please open an issue or join our
[Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) or
[Ollama Discord community](https://discord.gg/ollama) to connect with us! 🤝

---

Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Ollama Web UI even more amazing together! 💪