chenpangpang/open-webui · Commit abce172b (Unverified)

Authored May 27, 2024 by Timothy Jaeryang Baek; committed by GitHub on May 27, 2024.

Merge pull request #2602 from cheahjs/feat/openai-usage-stats

feat: add OpenAI generation stats

Parents: 9d370c51, 99b16616
Showing 5 changed files with 60 additions and 7 deletions (+60 / -7):

src/lib/apis/streaming/index.ts                            +15 / -1
src/lib/components/chat/Chat.svelte                        +16 / -1
src/lib/components/chat/Messages/ResponseMessage.svelte    +11 / -5
src/routes/(app)/workspace/models/create/+page.svelte      +14 / -0
src/routes/(app)/workspace/models/edit/+page.svelte         +4 / -0
src/lib/apis/streaming/index.ts
@@ -8,6 +8,16 @@ type TextStreamUpdate = {
 	citations?: any;
 	// eslint-disable-next-line @typescript-eslint/no-explicit-any
 	error?: any;
+	usage?: ResponseUsage;
 };
+
+type ResponseUsage = {
+	/** Including images and tools if any */
+	prompt_tokens: number;
+	/** The tokens generated */
+	completion_tokens: number;
+	/** Sum of the above two fields */
+	total_tokens: number;
+};
 
 // createOpenAITextStream takes a responseBody with a SSE response,
@@ -59,7 +69,11 @@ async function* openAIStreamToIterator(
 				continue;
 			}
 
-			yield { done: false, value: parsedData.choices?.[0]?.delta?.content ?? '' };
+			yield {
+				done: false,
+				value: parsedData.choices?.[0]?.delta?.content ?? '',
+				usage: parsedData.usage
+			};
 		} catch (e) {
 			console.error('Error extracting delta from SSE event:', e);
 		}
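For context: when stream_options.include_usage is enabled, OpenAI-compatible backends emit one extra chunk at the end of the stream with an empty choices array and a populated usage object, which openAIStreamToIterator now forwards as parsedData.usage. A minimal sketch of such a final chunk and the update it would yield (field values are illustrative, not taken from this commit):

// Illustrative final SSE chunk (chat.completion.chunk) when
// stream_options: { include_usage: true } is set; token counts are made up.
const finalChunk = {
	id: 'chatcmpl-abc123',
	object: 'chat.completion.chunk',
	choices: [],
	usage: { prompt_tokens: 25, completion_tokens: 120, total_tokens: 145 }
};

// The iterator above would yield it as a TextStreamUpdate with an empty delta:
// { done: false, value: '', usage: { prompt_tokens: 25, completion_tokens: 120, total_tokens: 145 } }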
src/lib/components/chat/Chat.svelte
@@ -767,6 +767,12 @@
 					{
 						model: model.id,
 						stream: true,
+						stream_options:
+							model.info?.meta?.capabilities?.usage ?? false
+								? {
+										include_usage: true
+									}
+								: undefined,
 						messages: [
 							$settings.system || (responseMessage?.userContext ?? null)
 								? {
@@ -835,9 +841,10 @@
 					if (res && res.ok && res.body) {
 						const textStream = await createOpenAITextStream(res.body, $settings.splitLargeChunks);
+						let lastUsage = null;
 
 						for await (const update of textStream) {
-							const { value, done, citations, error } = update;
+							const { value, done, citations, error, usage } = update;
 
 							if (error) {
 								await handleOpenAIError(error, null, model, responseMessage);
 								break;
@@ -853,6 +860,10 @@
 								break;
 							}
 
+							if (usage) {
+								lastUsage = usage;
+							}
+
 							if (citations) {
 								responseMessage.citations = citations;
 								continue;
@@ -886,6 +897,10 @@
 							}
 						}
 
+						if (lastUsage) {
+							responseMessage.info = { ...lastUsage, openai: true };
+						}
+
 						if ($chatId == _chatId) {
 							if ($settings.saveChatHistory ?? true) {
 								chat = await updateChatById(localStorage.token, _chatId, {
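As a rough sketch of what Chat.svelte now sends and stores (the model id and message below are placeholders, not from this commit): stream_options is attached only when the model's usage capability is enabled, the most recent usage chunk seen during streaming is kept, and it is written to responseMessage.info with an openai flag.

// Hypothetical request body; stream_options stays undefined when the
// model's `usage` capability is off, so backends without it are unaffected.
const body = {
	model: 'gpt-4o', // placeholder model id
	stream: true,
	stream_options: { include_usage: true },
	messages: [{ role: 'user', content: 'Hello' }]
};

// During streaming, the last usage value wins:
//   if (usage) lastUsage = usage;
// and once the stream ends it is attached to the message:
//   if (lastUsage) responseMessage.info = { ...lastUsage, openai: true };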
src/lib/components/chat/Messages/ResponseMessage.svelte
@@ -108,8 +108,13 @@
 		renderLatex();
 
 		if (message.info) {
-			tooltipInstance = tippy(`#info-${message.id}`, {
-				content: `<span class="text-xs" id="tooltip-${message.id}">response_token/s: ${
+			let tooltipContent = '';
+			if (message.info.openai) {
+				tooltipContent = `prompt_tokens: ${message.info.prompt_tokens ?? 'N/A'}<br/>
+					completion_tokens: ${message.info.completion_tokens ?? 'N/A'}<br/>
+					total_tokens: ${message.info.total_tokens ?? 'N/A'}`;
+			} else {
+				tooltipContent = `response_token/s: ${
 					`${
 						Math.round(
 							((message.info.eval_count ?? 0) / (message.info.eval_duration / 1000000000)) * 100
@@ -139,9 +144,10 @@
 					eval_duration: ${
 						Math.round(((message.info.eval_duration ?? 0) / 1000000) * 100) / 100 ?? 'N/A'
 					}ms<br/>
-					approximate_total: ${approximateToHumanReadable(
-						message.info.total_duration
-					)}</span>`,
+					approximate_total: ${approximateToHumanReadable(message.info.total_duration)}`;
+			}
+			tooltipInstance = tippy(`#info-${message.id}`, {
+				content: `<span class="text-xs" id="tooltip-${message.id}">${tooltipContent}</span>`,
 				allowHTML: true
 			});
 		}
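With the openai flag set by Chat.svelte, the tooltip shows the three token counts instead of Ollama's timing fields. A small sketch of the string the branch above produces for a hypothetical message.info (numbers invented):

// Hypothetical info object as stored for an OpenAI response.
const info = { openai: true, prompt_tokens: 25, completion_tokens: 120, total_tokens: 145 };

// The branch above would build roughly this tooltip markup:
// "prompt_tokens: 25<br/>completion_tokens: 120<br/>total_tokens: 145"
const tooltipContent = `prompt_tokens: ${info.prompt_tokens ?? 'N/A'}<br/>
	completion_tokens: ${info.completion_tokens ?? 'N/A'}<br/>
	total_tokens: ${info.total_tokens ?? 'N/A'}`;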
src/routes/(app)/workspace/models/create/+page.svelte
@@ -56,6 +56,20 @@
 		id = name.replace(/\s+/g, '-').toLowerCase();
 	}
 
+	let baseModel = null;
+	$: {
+		baseModel = $models.find((m) => m.id === info.base_model_id);
+		console.log(baseModel);
+		if (baseModel) {
+			if (baseModel.owned_by === 'openai') {
+				capabilities.usage = baseModel.info?.meta?.capabilities?.usage ?? false;
+			} else {
+				delete capabilities.usage;
+			}
+			capabilities = capabilities;
+		}
+	}
+
 	const submitHandler = async () => {
 		loading = true;
 
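The reactive block above ties the new usage capability to the selected base model: it is only surfaced for models owned by 'openai' and defaults to that model's own setting. A standalone sketch of the same decision, with a placeholder base-model record:

// Placeholder base-model record; only the fields the block reads are shown.
const baseModel = {
	id: 'gpt-4o',
	owned_by: 'openai',
	info: { meta: { capabilities: { usage: true } } }
};

const capabilities: { usage?: boolean } = {};
if (baseModel.owned_by === 'openai') {
	// Inherit the base model's usage capability, defaulting to off.
	capabilities.usage = baseModel.info?.meta?.capabilities?.usage ?? false;
} else {
	// Non-OpenAI models don't report usage, so the flag is removed entirely.
	delete capabilities.usage;
}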
src/routes/(app)/workspace/models/edit/+page.svelte
@@ -107,6 +107,10 @@
 			params = { ...params, ...model?.info?.params };
 			params.stop = params?.stop ? (params?.stop ?? []).join(',') : null;
 
+			if (model?.owned_by === 'openai') {
+				capabilities.usage = false;
+			}
+
 			if (model?.info?.meta?.capabilities) {
 				capabilities = { ...capabilities, ...model?.info?.meta?.capabilities };
 			}