Unverified Commit e3ce4cda authored by Timothy Jaeryang Baek's avatar Timothy Jaeryang Baek Committed by GitHub

Merge branch 'main' into main

parents 9bbae0e2 6ea9f6e1
@@ -13,6 +13,8 @@

ChatGPT-Style Web Interface for Ollama 🦙
**Disclaimer:** *ollama-webui is a community-driven project and is not affiliated with the Ollama team in any way. This initiative is independent, and any inquiries or feedback should be directed to [our community on Discord](https://discord.gg/5rJgQTnV4s). We kindly request users to refrain from contacting or harassing the Ollama team regarding this project.*
![Ollama Web UI Demo](./demo.gif)

Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles for Ollama! 🦙🔍
@@ -130,6 +132,10 @@ docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name

While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own.
**Warning: Backend Dependency for Proper Functionality**
To function correctly, the application requires both the backend and the frontend to run together. Serving the frontend on its own is not supported and may render the application inoperable; issues opened for frontend-only setups will not be addressed, as they fall outside the intended usage. Please follow the steps in this documentation exactly: use the frontend only to build the static files, then run the complete application with the provided backend commands. Configurations that deviate from these instructions are unsupported, and we may not be able to assist with them.
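A minimal sketch of that flow, assuming the npm scripts and `backend/` layout used by this repository (verify the exact commands against the install instructions below):

```shell
# Build the frontend into static files (assumed scripts: install + build)
npm install
npm run build

# Run the complete application from the backend, which also serves the built frontend
cd backend
pip install -r requirements.txt
sh start.sh
```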
### TL;DR 🚀

Run the following commands to install:
......
@@ -45,6 +45,15 @@ Becomes

docker run --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
## Running ollama-webui as a container on WSL Ubuntu
If you're running ollama-webui via Docker on WSL Ubuntu and have installed the WebUI and Ollama separately, you might encounter connection issues. This is often because the Docker container cannot reach the Ollama server at 127.0.0.1:11434. To resolve this, use the `--network=host` flag in the docker command. Note that with host networking the port changes from 3000 to 8080, so the interface is served at http://localhost:8080.
Here's an example of the command you should run:
```bash
docker run -d --network=host -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```
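Once the container is up, you can sanity-check both endpoints from the WSL shell (assuming default ports and that `curl` is available):

```shell
# Ollama itself should answer on 11434
curl http://127.0.0.1:11434/api/version

# With --network=host the WebUI listens on 8080 rather than 3000
curl -I http://localhost:8080
```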
## References

[Change Docker Desktop Settings on Mac](https://docs.docker.com/desktop/settings/mac/) (search for "x86" on that page)
......
@@ -30,7 +30,7 @@ if ENV == "prod":

####################################
# WEBUI_VERSION
####################################

-WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.33")
+WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.35")

####################################
# WEBUI_AUTH
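Since `os.environ.get` only falls back to the bundled default when the variable is unset, the version string can be overridden at deploy time; a quick check with a local Python interpreter (the `v9.9.9-test` value is purely illustrative):

```shell
# Without the variable set, the baked-in default is returned
unset WEBUI_VERSION
python3 -c 'import os; print(os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.35"))'

# An exported value takes precedence over the default
WEBUI_VERSION=v9.9.9-test python3 -c 'import os; print(os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.35"))'
```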
......
@@ -2,6 +2,7 @@
	import { settings } from '$lib/stores';
	import toast from 'svelte-french-toast';
	import Suggestions from './MessageInput/Suggestions.svelte';
+	import { onMount } from 'svelte';

	export let submitPrompt: Function;
	export let stopResponse: Function;
@@ -11,6 +12,7 @@
	let filesInputElement;

	let inputFiles;
+	let dragged = false;

	export let files = [];
@@ -82,12 +84,78 @@
			}
		}
	};
	onMount(() => {
		const dropZone = document.querySelector('body');

		// Highlight the dropzone overlay while a file is dragged over the page
		dropZone?.addEventListener('dragover', (e) => {
			e.preventDefault();
			dragged = true;
		});

		// Read a dropped image as a data URL and attach it to the message
		dropZone?.addEventListener('drop', (e) => {
			e.preventDefault();
			console.log(e);

			if (e.dataTransfer?.files) {
				let reader = new FileReader();

				reader.onload = (event) => {
					files = [
						...files,
						{
							type: 'image',
							url: `${event.target.result}`
						}
					];
				};

				if (
					e.dataTransfer?.files &&
					e.dataTransfer?.files.length > 0 &&
					['image/gif', 'image/jpeg', 'image/png'].includes(e.dataTransfer?.files[0]['type'])
				) {
					reader.readAsDataURL(e.dataTransfer?.files[0]);
				} else {
					toast.error(`Unsupported File Type '${e.dataTransfer?.files[0]['type']}'.`);
				}
			}

			dragged = false;
		});

		dropZone?.addEventListener('dragleave', () => {
			dragged = false;
		});
	});
</script>
{#if dragged}
	<div
		class="fixed w-full h-full flex z-50 touch-none pointer-events-none"
		id="dropzone"
		role="region"
		aria-label="Drag and Drop Container"
	>
		<div class="absolute rounded-xl w-full h-full backdrop-blur bg-gray-800/40 flex justify-center">
			<div class="m-auto pt-64 flex flex-col justify-center">
				<div class="max-w-md">
					<div class=" text-center text-6xl mb-3">🏞️</div>
					<div class="text-center dark:text-white text-2xl font-semibold z-50">Add Images</div>
					<div class=" mt-2 text-center text-sm dark:text-gray-200 w-full">
						Drop any images here to add to the conversation
					</div>
				</div>
			</div>
		</div>
	</div>
{/if}
<div class="fixed bottom-0 w-full">
	<div class="px-2.5 pt-2.5 -mb-0.5 mx-auto inset-x-0 bg-transparent flex justify-center">
		{#if messages.length == 0 && suggestionPrompts.length !== 0}
-			<div class="max-w-3xl">
+			<div class="max-w-3xl w-full">
				<Suggestions {suggestionPrompts} {submitPrompt} />
			</div>
		{/if}
......
@@ -3,7 +3,7 @@
	export let suggestionPrompts = [];
</script>

-<div class=" flex flex-wrap-reverse mb-3 md:p-1 text-left">
+<div class=" flex flex-wrap-reverse mb-3 md:p-1 text-left w-full">
	{#each suggestionPrompts as prompt, promptIdx}
		<div class="{promptIdx > 1 ? 'hidden sm:inline-flex' : ''} basis-full sm:basis-1/2 p-[5px]">
			<button
......
@@ -362,9 +362,19 @@
		<div class="m-auto text-center max-w-md pb-56 px-2">
			<div class="flex justify-center mt-8">
				{#if selectedModelfile && selectedModelfile.imageUrl}
-					<img src={selectedModelfile?.imageUrl} class=" w-20 mb-2 rounded-full" />
+					<img
+						src={selectedModelfile?.imageUrl}
+						alt="modelfile"
+						class=" w-20 mb-2 rounded-full"
+						draggable="false"
+					/>
				{:else}
-					<img src="/ollama.png" class=" w-16 invert-[10%] dark:invert-[100%] rounded-full" />
+					<img
+						src="/ollama.png"
+						class=" w-16 invert-[10%] dark:invert-[100%] rounded-full"
+						alt="ollama"
+						draggable="false"
+					/>
				{/if}
			</div>
			<div class=" mt-2 text-2xl text-gray-800 dark:text-gray-100 font-semibold">
@@ -401,12 +411,14 @@
							src="{$settings.gravatarUrl ? $settings.gravatarUrl : '/user'}.png"
							class=" max-w-[28px] object-cover rounded-full"
							alt="User profile"
+							draggable="false"
						/>
					{:else}
						<img
							src={$user ? $user.profile_image_url : '/user.png'}
							class=" max-w-[28px] object-cover rounded-full"
							alt="User profile"
+							draggable="false"
						/>
					{/if}
				{:else if selectedModelfile}
@@ -414,12 +426,14 @@
						src={selectedModelfile?.imageUrl ?? '/favicon.png'}
						class=" max-w-[28px] object-cover rounded-full"
						alt="Ollama profile"
+						draggable="false"
					/>
				{:else}
					<img
						src="/favicon.png"
						class=" max-w-[28px] object-cover rounded-full"
						alt="Ollama profile"
+						draggable="false"
					/>
				{/if}
			</div>
@@ -469,7 +483,12 @@
								{#each message.files as file}
									<div>
										{#if file.type === 'image'}
-											<img src={file.url} alt="input" class=" max-h-96 rounded-lg" />
+											<img
+												src={file.url}
+												alt="input"
+												class=" max-h-96 rounded-lg"
+												draggable="false"
+											/>
										{/if}
									</div>
								{/each}
......
@@ -56,6 +56,7 @@
	let gravatarEmail = '';
	let OPENAI_API_KEY = '';
+	let OPENAI_API_BASE_URL = '';

	// Auth
	let authEnabled = false;
@@ -302,8 +303,10 @@
		// If OpenAI API Key exists
		if (type === 'all' && $settings.OPENAI_API_KEY) {
+			const API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1';
+
			// Validate OPENAI_API_KEY
-			const openaiModelRes = await fetch(`https://api.openai.com/v1/models`, {
+			const openaiModelRes = await fetch(`${API_BASE_URL}/models`, {
				method: 'GET',
				headers: {
					'Content-Type': 'application/json',
@@ -320,15 +323,19 @@
					return null;
				});

-			const openAIModels = openaiModelRes?.data ?? null;
+			const openAIModels = Array.isArray(openaiModelRes)
+				? openaiModelRes
+				: openaiModelRes?.data ?? null;

			models.push(
				...(openAIModels
					? [
							{ name: 'hr' },
							...openAIModels
-								.map((model) => ({ name: model.id, label: 'OpenAI' }))
-								.filter((model) => model.name.includes('gpt'))
+								.map((model) => ({ name: model.id, external: true }))
+								.filter((model) =>
+									API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true
+								)
					]
					: [])
			);
@@ -363,6 +370,7 @@
			gravatarEmail = settings.gravatarEmail ?? '';
			OPENAI_API_KEY = settings.OPENAI_API_KEY ?? '';
+			OPENAI_API_BASE_URL = settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1';

			authEnabled = settings.authHeader !== undefined ? true : false;
			if (authEnabled) {
@@ -476,6 +484,30 @@
						<div class=" self-center">Models</div>
					</button>
					<button
						class="px-2.5 py-2.5 min-w-fit rounded-lg flex-1 md:flex-none flex text-right transition {selectedTab ===
						'external'
							? 'bg-gray-200 dark:bg-gray-700'
							: ' hover:bg-gray-300 dark:hover:bg-gray-800'}"
						on:click={() => {
							selectedTab = 'external';
						}}
					>
						<div class=" self-center mr-2">
							<svg
								xmlns="http://www.w3.org/2000/svg"
								viewBox="0 0 16 16"
								fill="currentColor"
								class="w-4 h-4"
							>
								<path
									d="M1 9.5A3.5 3.5 0 0 0 4.5 13H12a3 3 0 0 0 .917-5.857 2.503 2.503 0 0 0-3.198-3.019 3.5 3.5 0 0 0-6.628 2.171A3.5 3.5 0 0 0 1 9.5Z"
								/>
							</svg>
						</div>
						<div class=" self-center">External</div>
					</button>
					<button
						class="px-2.5 py-2.5 min-w-fit rounded-lg flex-1 md:flex-none flex text-right transition {selectedTab ===
						'addons'
@@ -859,14 +891,73 @@
							</div>
						</div>
					</div>
		{:else if selectedTab === 'external'}
			<form
				class="flex flex-col h-full justify-between space-y-3 text-sm"
				on:submit|preventDefault={() => {
					saveSettings({
						OPENAI_API_KEY: OPENAI_API_KEY !== '' ? OPENAI_API_KEY : undefined,
						OPENAI_API_BASE_URL: OPENAI_API_BASE_URL !== '' ? OPENAI_API_BASE_URL : undefined
					});
					show = false;
				}}
			>
				<div class=" space-y-3">
					<div>
						<div class=" mb-2.5 text-sm font-medium">OpenAI API Key</div>
						<div class="flex w-full">
							<div class="flex-1">
								<input
									class="w-full rounded py-2 px-4 text-sm dark:text-gray-300 dark:bg-gray-800 outline-none"
									placeholder="Enter OpenAI API Key"
									bind:value={OPENAI_API_KEY}
									autocomplete="off"
								/>
							</div>
						</div>
						<div class="mt-2 text-xs text-gray-400 dark:text-gray-500">
							Adds optional support for online models.
						</div>
					</div>

					<hr class=" dark:border-gray-700" />

					<div>
						<div class=" mb-2.5 text-sm font-medium">OpenAI API Base URL</div>
						<div class="flex w-full">
							<div class="flex-1">
								<input
									class="w-full rounded py-2 px-4 text-sm dark:text-gray-300 dark:bg-gray-800 outline-none"
									placeholder="Enter OpenAI API Base URL"
									bind:value={OPENAI_API_BASE_URL}
									autocomplete="off"
								/>
							</div>
						</div>
						<div class="mt-2 text-xs text-gray-400 dark:text-gray-500">
							WebUI will make requests to <span class=" text-gray-200"
								>'{OPENAI_API_BASE_URL}/chat/completions'</span
							>
						</div>
					</div>
				</div>

				<div class="flex justify-end pt-3 text-sm font-medium">
					<button
						class=" px-4 py-2 bg-emerald-600 hover:bg-emerald-700 text-gray-100 transition rounded"
						type="submit"
					>
						Save
					</button>
				</div>
			</form>
		{:else if selectedTab === 'addons'}
			<form
				class="flex flex-col h-full justify-between space-y-3 text-sm"
				on:submit|preventDefault={() => {
					saveSettings({
						gravatarEmail: gravatarEmail !== '' ? gravatarEmail : undefined,
-						gravatarUrl: gravatarEmail !== '' ? getGravatarURL(gravatarEmail) : undefined,
-						OPENAI_API_KEY: OPENAI_API_KEY !== '' ? OPENAI_API_KEY : undefined
+						gravatarUrl: gravatarEmail !== '' ? getGravatarURL(gravatarEmail) : undefined
					});
					show = false;
				}}
@@ -962,26 +1053,6 @@
							>
						</div>
					</div>
-
-					<hr class=" dark:border-gray-700" />
-
-					<div>
-						<div class=" mb-2.5 text-sm font-medium">
-							OpenAI API Key <span class=" text-gray-400 text-sm">(optional)</span>
-						</div>
-						<div class="flex w-full">
-							<div class="flex-1">
-								<input
-									class="w-full rounded py-2 px-4 text-sm dark:text-gray-300 dark:bg-gray-800 outline-none"
-									placeholder="Enter OpenAI API Key"
-									bind:value={OPENAI_API_KEY}
-									autocomplete="off"
-								/>
-							</div>
-						</div>
-						<div class="mt-2 text-xs text-gray-400 dark:text-gray-500">
-							Adds optional support for 'gpt-*' models available.
-						</div>
-					</div>
				</div>

				<div class="flex justify-end pt-3 text-sm font-medium">
......
@@ -55,7 +55,9 @@
		// If OpenAI API Key exists
		if ($settings.OPENAI_API_KEY) {
			// Validate OPENAI_API_KEY
-			const openaiModelRes = await fetch(`https://api.openai.com/v1/models`, {
+			const API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1';
+
+			const openaiModelRes = await fetch(`${API_BASE_URL}/models`, {
				method: 'GET',
				headers: {
					'Content-Type': 'application/json',
@@ -72,15 +74,19 @@
					return null;
				});

-			const openAIModels = openaiModelRes?.data ?? null;
+			const openAIModels = Array.isArray(openaiModelRes)
+				? openaiModelRes
+				: openaiModelRes?.data ?? null;

			models.push(
				...(openAIModels
					? [
							{ name: 'hr' },
							...openAIModels
-								.map((model) => ({ name: model.id, label: 'OpenAI' }))
-								.filter((model) => model.name.includes('gpt'))
+								.map((model) => ({ name: model.id, external: true }))
+								.filter((model) =>
+									API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true
+								)
					]
					: [])
			);
@@ -236,36 +242,39 @@
		<div
			class="absolute rounded-xl w-full h-full backdrop-blur bg-gray-900/60 flex justify-center"
		>
-			<div class="m-auto pb-44">
-				<div class="text-center dark:text-white text-2xl font-medium z-50">
-					Ollama Update Required
-				</div>
-
-				<div class=" mt-4 text-center max-w-md text-sm dark:text-gray-200">
-					Oops! It seems like your Ollama needs a little attention. <br class=" hidden sm:flex" />
-					We encountered a connection issue or noticed that you're running an outdated version. Please
-					update to
-					<span class=" dark:text-white font-medium">{requiredOllamaVersion} or above</span>.
-				</div>
-
-				<div class=" mt-6 mx-auto relative group w-fit">
-					<button
-						class="relative z-20 flex px-5 py-2 rounded-full bg-gray-100 hover:bg-gray-200 transition font-medium text-sm"
-						on:click={async () => {
-							await setOllamaVersion(await getOllamaVersion());
-						}}
-					>
-						Check Again
-					</button>
-
-					<button
-						class="text-xs text-center w-full mt-2 text-gray-400 underline"
-						on:click={async () => {
-							await setOllamaVersion(requiredOllamaVersion);
-						}}>Close</button
-					>
-				</div>
-			</div>
+			<div class="m-auto pb-44 flex flex-col justify-center">
+				<div class="max-w-md">
+					<div class="text-center dark:text-white text-2xl font-medium z-50">
+						Connection Issue or Update Needed
+					</div>
+
+					<div class=" mt-4 text-center text-sm dark:text-gray-200 w-full">
+						Oops! It seems like your Ollama needs a little attention. <br
+							class=" hidden sm:flex"
+						/>We've detected either a connection hiccup or observed that you're using an older
+						version. Ensure you're on the latest Ollama version<br class=" hidden sm:flex" />(version
+						<span class=" dark:text-white font-medium">{requiredOllamaVersion} or higher</span>)
+						or check your connection.
+					</div>
+
+					<div class=" mt-6 mx-auto relative group w-fit">
+						<button
+							class="relative z-20 flex px-5 py-2 rounded-full bg-gray-100 hover:bg-gray-200 transition font-medium text-sm"
+							on:click={async () => {
+								await setOllamaVersion(await getOllamaVersion());
+							}}
+						>
+							Check Again
+						</button>

+						<button
+							class="text-xs text-center w-full mt-2 text-gray-400 underline"
+							on:click={async () => {
+								await setOllamaVersion(requiredOllamaVersion);
+							}}>Close</button
+						>
+					</div>
+				</div>
+			</div>
		</div>
	</div>
......
@@ -7,7 +7,7 @@
	import { splitStream } from '$lib/utils';
	import { goto } from '$app/navigation';
-	import { config, modelfiles, user, settings, db, chats, chatId } from '$lib/stores';
+	import { config, models, modelfiles, user, settings, db, chats, chatId } from '$lib/stores';

	import MessageInput from '$lib/components/chat/MessageInput.svelte';
	import Messages from '$lib/components/chat/Messages.svelte';
@@ -130,7 +130,8 @@
	const sendPrompt = async (userPrompt, parentId, _chatId) => {
		await Promise.all(
			selectedModels.map(async (model) => {
-				if (model.includes('gpt-')) {
+				console.log(model);
+				if ($models.filter((m) => m.name === model)[0].external) {
					await sendPromptOpenAI(model, userPrompt, parentId, _chatId);
				} else {
					await sendPromptOllama(model, userPrompt, parentId, _chatId);
@@ -364,132 +365,162 @@
				];
			}

			await tick();
			window.scrollTo({ top: document.body.scrollHeight });

			const res = await fetch(
				`${$settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'}/chat/completions`,
				{
					method: 'POST',
					headers: {
						Authorization: `Bearer ${$settings.OPENAI_API_KEY}`,
						'Content-Type': 'application/json'
					},
					body: JSON.stringify({
						model: model,
						stream: true,
						messages: [
							$settings.system
								? {
										role: 'system',
										content: $settings.system
								  }
								: undefined,
							...messages
						]
							.filter((message) => message)
							.map((message) => ({
								role: message.role,
								...(message.files
									? {
											content: [
												{
													type: 'text',
													text: message.content
												},
												...message.files
													.filter((file) => file.type === 'image')
													.map((file) => ({
														type: 'image_url',
														image_url: {
															url: file.url
														}
													}))
											]
									  }
									: { content: message.content })
							})),
						temperature: $settings.temperature ?? undefined,
						top_p: $settings.top_p ?? undefined,
						num_ctx: $settings.num_ctx ?? undefined,
						frequency_penalty: $settings.repeat_penalty ?? undefined
					})
				}
			).catch((err) => {
				console.log(err);
				return null;
			});

			if (res && res.ok) {
				const reader = res.body
					.pipeThrough(new TextDecoderStream())
					.pipeThrough(splitStream('\n'))
					.getReader();

				while (true) {
					const { value, done } = await reader.read();
					if (done || stopResponseFlag || _chatId !== $chatId) {
						responseMessage.done = true;
						messages = messages;
						break;
					}

					try {
						let lines = value.split('\n');

						for (const line of lines) {
							if (line !== '') {
								console.log(line);
								if (line === 'data: [DONE]') {
									responseMessage.done = true;
									messages = messages;
								} else {
									let data = JSON.parse(line.replace(/^data: /, ''));
									console.log(data);

									if (responseMessage.content == '' && data.choices[0].delta.content == '\n') {
										continue;
									} else {
										responseMessage.content += data.choices[0].delta.content ?? '';
										messages = messages;
									}
								}
							}
						}
					} catch (error) {
						console.log(error);
					}

					if ($settings.notificationEnabled && !document.hasFocus()) {
						const notification = new Notification(`OpenAI ${model}`, {
							body: responseMessage.content,
							icon: '/favicon.png'
						});
					}

					if ($settings.responseAutoCopy) {
						copyToClipboard(responseMessage.content);
					}

					if (autoScroll) {
						window.scrollTo({ top: document.body.scrollHeight });
					}

					await $db.updateChatById(_chatId, {
						title: title === '' ? 'New Chat' : title,
						models: selectedModels,
						system: $settings.system ?? undefined,
						options: {
							seed: $settings.seed ?? undefined,
							temperature: $settings.temperature ?? undefined,
							repeat_penalty: $settings.repeat_penalty ?? undefined,
							top_k: $settings.top_k ?? undefined,
							top_p: $settings.top_p ?? undefined,
							num_ctx: $settings.num_ctx ?? undefined,
							...($settings.options ?? {})
						},
						messages: messages,
						history: history
					});
				}
			} else {
				if (res !== null) {
					const error = await res.json();
					console.log(error);
					if ('detail' in error) {
						toast.error(error.detail);
						responseMessage.content = error.detail;
					} else {
						if ('message' in error.error) {
							toast.error(error.error.message);
							responseMessage.content = error.error.message;
						} else {
							toast.error(error.error);
							responseMessage.content = error.error;
						}
					}
				} else {
					toast.error(`Uh-oh! There was an issue connecting to ${model}.`);
					responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`;
				}

				responseMessage.error = true;
				responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`;
				responseMessage.done = true;
				messages = messages;
			}

			stopResponseFlag = false;
			await tick();

			if (autoScroll) {
				window.scrollTo({ top: document.body.scrollHeight });
......
@@ -6,7 +6,7 @@
	import { onMount, tick } from 'svelte';
	import { convertMessagesToHistory, splitStream } from '$lib/utils';
	import { goto } from '$app/navigation';
-	import { config, modelfiles, user, settings, db, chats, chatId } from '$lib/stores';
+	import { config, models, modelfiles, user, settings, db, chats, chatId } from '$lib/stores';

	import MessageInput from '$lib/components/chat/MessageInput.svelte';
	import Messages from '$lib/components/chat/Messages.svelte';
@@ -144,7 +144,8 @@
	const sendPrompt = async (userPrompt, parentId, _chatId) => {
		await Promise.all(
			selectedModels.map(async (model) => {
-				if (model.includes('gpt-')) {
+				console.log(model);
+				if ($models.filter((m) => m.name === model)[0].external) {
					await sendPromptOpenAI(model, userPrompt, parentId, _chatId);
				} else {
					await sendPromptOllama(model, userPrompt, parentId, _chatId);
@@ -378,132 +379,162 @@
				];
			}

			await tick();
			window.scrollTo({ top: document.body.scrollHeight });

			const res = await fetch(
				`${$settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'}/chat/completions`,
				{
					method: 'POST',
					headers: {
						Authorization: `Bearer ${$settings.OPENAI_API_KEY}`,
						'Content-Type': 'application/json'
					},
					body: JSON.stringify({
						model: model,
						stream: true,
						messages: [
							$settings.system
								? {
										role: 'system',
										content: $settings.system
								  }
								: undefined,
							...messages
						]
							.filter((message) => message)
							.map((message) => ({
								role: message.role,
								...(message.files
									? {
											content: [
												{
													type: 'text',
													text: message.content
												},
												...message.files
													.filter((file) => file.type === 'image')
													.map((file) => ({
														type: 'image_url',
														image_url: {
															url: file.url
														}
													}))
											]
									  }
									: { content: message.content })
							})),
						temperature: $settings.temperature ?? undefined,
						top_p: $settings.top_p ?? undefined,
						num_ctx: $settings.num_ctx ?? undefined,
						frequency_penalty: $settings.repeat_penalty ?? undefined
					})
				}
			).catch((err) => {
				console.log(err);
				return null;
			});

			if (res && res.ok) {
				const reader = res.body
					.pipeThrough(new TextDecoderStream())
					.pipeThrough(splitStream('\n'))
					.getReader();

				while (true) {
					const { value, done } = await reader.read();
					if (done || stopResponseFlag || _chatId !== $chatId) {
						responseMessage.done = true;
						messages = messages;
						break;
					}

					try {
						let lines = value.split('\n');

						for (const line of lines) {
							if (line !== '') {
								console.log(line);
								if (line === 'data: [DONE]') {
									responseMessage.done = true;
									messages = messages;
								} else {
									let data = JSON.parse(line.replace(/^data: /, ''));
									console.log(data);

									if (responseMessage.content == '' && data.choices[0].delta.content == '\n') {
										continue;
									} else {
										responseMessage.content += data.choices[0].delta.content ?? '';
										messages = messages;
									}
								}
							}
						}
					} catch (error) {
						console.log(error);
					}

					if ($settings.notificationEnabled && !document.hasFocus()) {
						const notification = new Notification(`OpenAI ${model}`, {
							body: responseMessage.content,
							icon: '/favicon.png'
						});
					}

					if ($settings.responseAutoCopy) {
						copyToClipboard(responseMessage.content);
					}

					if (autoScroll) {
						window.scrollTo({ top: document.body.scrollHeight });
					}

					await $db.updateChatById(_chatId, {
						title: title === '' ? 'New Chat' : title,
						models: selectedModels,
						system: $settings.system ?? undefined,
						options: {
							seed: $settings.seed ?? undefined,
							temperature: $settings.temperature ?? undefined,
							repeat_penalty: $settings.repeat_penalty ?? undefined,
							top_k: $settings.top_k ?? undefined,
							top_p: $settings.top_p ?? undefined,
							num_ctx: $settings.num_ctx ?? undefined,
							...($settings.options ?? {})
						},
						messages: messages,
						history: history
					});
				}
			} else {
				if (res !== null) {
					const error = await res.json();
					console.log(error);
					if ('detail' in error) {
						toast.error(error.detail);
						responseMessage.content = error.detail;
} else {
if ('message' in error.error) {
toast.error(error.error.message);
responseMessage.content = error.error.message;
} else {
toast.error(error.error);
responseMessage.content = error.error;
}
}
} else {
toast.error(`Uh-oh! There was an issue connecting to ${model}.`);
responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`;
}
if ($settings.notificationEnabled && !document.hasFocus()) { responseMessage.error = true;
const notification = new Notification(`OpenAI ${model}`, { responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`;
body: responseMessage.content, responseMessage.done = true;
icon: '/favicon.png' messages = messages;
});
} }
if ($settings.responseAutoCopy) { stopResponseFlag = false;
copyToClipboard(responseMessage.content); await tick();
}
if (autoScroll) { if (autoScroll) {
window.scrollTo({ top: document.body.scrollHeight }); window.scrollTo({ top: document.body.scrollHeight });
......
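The streaming branch above pipes the response body through a `splitStream('\n')` stage before reading it line by line. As a rough sketch of what such a helper looks like (the buffering details here are an illustrative assumption, not necessarily the project's exact implementation), it is a `TransformStream` that re-emits decoded text one delimiter-separated segment at a time:

```javascript
// Hypothetical sketch of the `splitStream` helper used in the pipeline above.
function splitStream(splitOn) {
	let buffer = '';
	return new TransformStream({
		transform(chunk, controller) {
			buffer += chunk;
			const parts = buffer.split(splitOn);
			// Emit every complete segment; keep the trailing partial in the buffer.
			parts.slice(0, -1).forEach((part) => controller.enqueue(part));
			buffer = parts[parts.length - 1];
		},
		flush(controller) {
			// Emit whatever remains when the upstream closes.
			if (buffer) controller.enqueue(buffer);
		}
	});
}
```

Buffering the trailing partial segment matters because network chunks can split an SSE `data:` line anywhere; only complete lines reach the `JSON.parse` step.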
@@ -98,7 +98,7 @@
 <div class=" flex space-x-4 cursor-pointer w-full mb-3">
 	<a
 		class=" flex flex-1 space-x-4 cursor-pointer w-full"
-		href={`/?models=${modelfile.tagName}`}
+		href={`/?models=${encodeURIComponent(modelfile.tagName)}`}
 	>
 		<div class=" self-center w-10">
 			<div class=" rounded-full bg-stone-700">
@@ -121,7 +121,7 @@
 <a
 	class="self-center w-fit text-sm px-2 py-2 border dark:border-gray-600 rounded-xl"
 	type="button"
-	href={`/modelfiles/edit?tag=${modelfile.tagName}`}
+	href={`/modelfiles/edit?tag=${encodeURIComponent(modelfile.tagName)}`}
 >
 	<svg
 		xmlns="http://www.w3.org/2000/svg"
...
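The `encodeURIComponent` changes above matter because model tag names routinely contain characters such as `:` that are unsafe to interpolate raw into a query string. A quick illustration (the tag name is a hypothetical example):

```javascript
// ':' would be left unescaped in a raw template string; encodeURIComponent
// escapes it so the tag round-trips through the query string intact.
const tagName = 'mario:latest'; // hypothetical tag name
const url = `/modelfiles/edit?tag=${encodeURIComponent(tagName)}`;
console.log(url); // → /modelfiles/edit?tag=mario%3Alatest
```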
@@ -93,7 +93,10 @@
 SYSTEM """${system}"""`.replace(/^\s*\n/gm, '');
 };

 const saveModelfile = async (modelfile) => {
-	await modelfiles.set([...$modelfiles, modelfile]);
+	await modelfiles.set([
+		...$modelfiles.filter((m) => m.tagName !== modelfile.tagName),
+		modelfile
+	]);
 	localStorage.setItem('modelfiles', JSON.stringify($modelfiles));
 };
...
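The `saveModelfile` change above turns a plain append into an upsert: saving a modelfile whose `tagName` already exists now replaces the old entry instead of accumulating duplicates. A minimal standalone sketch of that semantics (store contents are hypothetical):

```javascript
// Upsert by tagName: drop any existing entry with the same tag, then append.
const upsertModelfile = (modelfiles, modelfile) => [
	...modelfiles.filter((m) => m.tagName !== modelfile.tagName),
	modelfile
];

let store = [{ tagName: 'mario', content: 'v1' }]; // hypothetical existing store
store = upsertModelfile(store, { tagName: 'mario', content: 'v2' });
// store now holds a single 'mario' entry carrying content 'v2'
```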