@@ -398,7 +398,7 @@ See the [API documentation](./docs/api.md) for all endpoints.
- [aidful-ollama-model-delete](https://github.com/AidfulAI/aidful-ollama-model-delete) (User interface for simplified model cleanup)
- [Perplexica](https://github.com/ItzCrazyKns/Perplexica) (An AI-powered search engine & an open-source alternative to Perplexity AI)
- [Ollama Chat WebUI for Docker](https://github.com/oslook/ollama-webui) (Lightweight Ollama web UI with support for local Docker deployment)
- [AI Toolkit for Visual Studio Code](https://aka.ms/ai-tooklit/ollama-docs) (Microsoft-official VSCode extension to chat with, test, and evaluate models with Ollama support, and use them in your AI applications)
- [MinimalNextOllamaChat](https://github.com/anilkay/MinimalNextOllamaChat) (Minimal web UI for chat and model control)
- [Chipper](https://github.com/TilmanGriesel/chipper) (AI interface for tinkerers: Ollama, Haystack RAG, Python)
- [ChibiChat](https://github.com/CosmicEventHorizon/ChibiChat) (Kotlin-based Android app to chat with Ollama and Koboldcpp API endpoints)
@@ -223,7 +223,7 @@ Refer to the section [above](#how-do-i-configure-ollama-server) for how to set e
## How can I use Ollama in Visual Studio Code?

There is already a large collection of plugins available for VSCode as well as other editors that leverage Ollama. See the list of [extensions & plugins](https://github.com/ollama/ollama#extensions--plugins) at the bottom of the main repository readme.
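
Most of these plugins ultimately talk to the local Ollama server over its HTTP API, which listens on `http://localhost:11434` by default. As a minimal sketch of what such an extension does behind the scenes (assuming the server is running and that a model such as `llama3.2` has already been pulled; both are assumptions here, not requirements of any specific plugin):

```shell
# Ask the local Ollama server for a one-off completion (non-streaming)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Explain what this function does: def add(a, b): return a + b",
  "stream": false
}'
```

The generated text comes back in the `response` field of the JSON reply, which an editor plugin can surface inline or in a chat panel.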
## How do I use Ollama with GPU acceleration in Docker?
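
A minimal sketch of the usual setup, assuming a Linux host with Docker and the NVIDIA Container Toolkit already installed (see the main repository readme for the authoritative instructions; the model name below is only an example):

```shell
# Start the container with access to all GPUs, persisting models in a named volume
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Run a model inside the container to verify that GPU acceleration is picked up
docker exec -it ollama ollama run llama3.2
```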