"vscode:/vscode.git/clone" did not exist on "9fb02175485db873664cd5841c72add6ac512692"
llm: Allow overriding flash attention setting
As we automatically enable flash attention for more models, there are likely some cases where we get it wrong. This allows setting OLLAMA_FLASH_ATTENTION=0 to disable it, even for models that would otherwise have it enabled automatically.
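
A minimal sketch in Go of how an environment-variable override like this could be wired in (the function name `flashAttentionEnabled` and the `autoDefault` parameter are illustrative, not the actual Ollama implementation):

```go
package main

import (
	"fmt"
	"os"
	"strconv"
)

// flashAttentionEnabled decides whether flash attention should be used.
// autoDefault is the automatic per-model choice; if OLLAMA_FLASH_ATTENTION
// is set to a parseable boolean ("0", "1", "true", "false", ...), the
// explicit setting wins over the automatic default.
func flashAttentionEnabled(autoDefault bool) bool {
	if v := os.Getenv("OLLAMA_FLASH_ATTENTION"); v != "" {
		if b, err := strconv.ParseBool(v); err == nil {
			return b
		}
		// Unparseable values fall through to the automatic default.
	}
	return autoDefault
}

func main() {
	// With OLLAMA_FLASH_ATTENTION=0 in the environment, this prints
	// "false" even though the model's automatic default is true.
	fmt.Println(flashAttentionEnabled(true))
}
```

With this shape, leaving the variable unset preserves the existing automatic behavior, so the override is purely opt-in.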