"tools/git@developer.sourcefind.cn:OpenDAS/nni.git" did not exist on "48f8b0526fca88ae9d0148b02ad918a0ba9bce9a"
Unverified commit eadbfeb2, authored by Timothy Jaeryang Baek, committed by GitHub

Merge branch 'main' into dev

parents ac34a797 295ebb4f
README.md
@@ -45,7 +45,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 - ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
-- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
+- 🤝 **OpenAI API Integration**: Effortlessly integrate OpenAI-compatible API for versatile conversations alongside Ollama models. Customize the API Base URL to link with **LMStudio, Mistral, OpenRouter, and more**.
 - 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.
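For context on the feature change above: an OpenAI-compatible backend is typically wired up through environment variables at container start. A minimal sketch, assuming the variable names `OPENAI_API_BASE_URL` / `OPENAI_API_KEY` and the published image tag, none of which are confirmed by this diff:

```bash
# Hypothetical sketch: point the web UI at an OpenAI-compatible endpoint
# (OpenRouter here) alongside a local Ollama instance.
# Variable and image names are assumptions, not taken from this commit.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=https://openrouter.ai/api/v1 \
  -e OPENAI_API_KEY=your_api_key \
  --name ollama-webui \
  ghcr.io/ollama-webui/ollama-webui:main
```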
@@ -79,7 +79,19 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
+
+```bash
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
+```
 ### Installing Ollama Web UI Only
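Two quick checks for the sections added in the hunk above. GPU passthrough additionally requires the NVIDIA Container Toolkit on the host; once the GPU variant is up, you can confirm the device is visible from inside the container (the container name `ollama` comes from the compose files in this commit):

```bash
# Confirm the GPU is reachable from inside the running Ollama container
docker exec -it ollama nvidia-smi
```

And once the API port is exposed, Ollama's model-listing endpoint makes a convenient smoke test from the host:

```bash
# List the models served by the exposed Ollama API
curl http://localhost:11434/api/tags
```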
docker-compose.api.yml (new file)

version: '3.6'
services:
  ollama:
    # Expose Ollama API outside the container stack
    ports:
      - 11434:11434
\ No newline at end of file
docker-compose.gpu.yml (new file)

version: '3.6'
services:
  ollama:
    # GPU support
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
docker-compose.yml
@@ -2,20 +2,8 @@ version: '3.6'
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
...
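The commented-out blocks removed here now live in the standalone override files above. Docker Compose merges every file passed with `-f` in order, later files layering onto earlier ones; to inspect the effective merged configuration without starting anything:

```bash
# Render the merged configuration the GPU variant would run with
docker compose -f docker-compose.yml -f docker-compose.gpu.yml config
```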