Unverified Commit 8c95a8be authored by Timothy Jaeryang Baek, committed by GitHub

Merge pull request #2943 from choltha/fix/temperature-params

fix: temperature not passed correctly from model settings to the ollama/openai api
parents d00c8b00 97b39115
@@ -764,7 +764,7 @@ async def generate_chat_completion(
                     "frequency_penalty", None
                 )
-            if model_info.params.get("temperature", None):
+            if model_info.params.get("temperature", None) is not None:
                 payload["options"]["temperature"] = model_info.params.get(
                     "temperature", None
                 )
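Why this change matters, as a minimal sketch: the dict below stands in for model_info.params (a hypothetical example, not the real object). A temperature of 0 is falsy in Python, so the old truthiness check silently dropped it, while the new is-not-None check forwards it.

```python
params = {"temperature": 0.0}  # hypothetical stand-in for model_info.params

# Old check: 0.0 is falsy, so an explicitly configured temperature of 0 is never sent.
if params.get("temperature", None):
    forwarded_old = params.get("temperature")
else:
    forwarded_old = "dropped"

# New check: only a missing/None value is skipped; 0.0 passes through as intended.
if params.get("temperature", None) is not None:
    forwarded_new = params.get("temperature")
else:
    forwarded_new = "dropped"

print(forwarded_old, forwarded_new)  # -> dropped 0.0
```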
@@ -373,8 +373,8 @@ async def proxy(path: str, request: Request, user=Depends(get_verified_user)):
                 model_info.params = model_info.params.model_dump()
                 if model_info.params:
-                    if model_info.params.get("temperature", None):
-                        payload["temperature"] = int(
+                    if model_info.params.get("temperature", None) is not None:
+                        payload["temperature"] = float(
                             model_info.params.get("temperature")
                         )
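The second hunk also replaces int() with float(). A quick sketch of the difference, assuming a typical fractional setting such as 0.7 (the value is chosen purely for illustration):

```python
temperature = 0.7  # illustrative value; any fractional temperature behaves the same

print(int(temperature))    # -> 0: truncation, so the API would receive temperature 0
print(float(temperature))  # -> 0.7: the configured value is passed through unchanged
```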