1. 22 Jul, 2024 1 commit
    • Enable windows error dialog for subprocess startup · e12fff88
      Daniel Hiltgen authored
      Make sure that if something goes wrong spawning the process, the user gets
      enough information to try to self-correct, or at least file a bug with
      details so we can fix it. Once the process starts, we immediately switch
      back to the recommended setting to prevent the blocking dialog. This
      ensures that if the model fails to load (OOM, unsupported model type,
      etc.) the process exits quickly and we can scan the subprocess's
      stdout/stderr for the reason to report via the API. (A sketch of this
      error-mode toggle follows below.)
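      On Windows, a child process inherits its parent's error mode at creation
      time, so one way to get this behavior is to temporarily restore the default
      mode (dialogs enabled) around the spawn and switch back right after. The Go
      sketch below illustrates that approach; it is a plausible reconstruction,
      not the project's actual code, and the helper name spawnRunner and the
      executable path are hypothetical.

      ```go
      //go:build windows

      // Sketch: enable Windows error dialogs only while the subprocess is being
      // created, so startup failures (e.g. a missing DLL) surface visibly, then
      // restore the dialog-suppressing mode used for background services.
      package main

      import (
          "fmt"
          "os/exec"
          "syscall"
      )

      var (
          kernel32         = syscall.NewLazyDLL("kernel32.dll")
          procSetErrorMode = kernel32.NewProc("SetErrorMode")
      )

      // setErrorMode wraps kernel32 SetErrorMode and returns the previous mode.
      func setErrorMode(mode uintptr) uintptr {
          prev, _, _ := procSetErrorMode.Call(mode)
          return prev
      }

      // spawnRunner (hypothetical) starts the runner with error dialogs enabled:
      // mode 0 is the system default, and the child inherits it at creation time.
      // The parent restores its previous (dialog-suppressing) mode immediately
      // after the spawn.
      func spawnRunner(path string, args ...string) (*exec.Cmd, error) {
          prev := setErrorMode(0)
          defer setErrorMode(prev) // e.g. SEM_FAILCRITICALERRORS|SEM_NOGPFAULTERRORBOX

          cmd := exec.Command(path, args...)
          if err := cmd.Start(); err != nil {
              return nil, fmt.Errorf("starting %s: %w", path, err)
          }
          return cmd, nil
      }

      func main() {
          cmd, err := spawnRunner(`C:\path\to\runner.exe`, "--port", "12345")
          if err != nil {
              fmt.Println("startup error:", err)
              return
          }
          fmt.Println("runner started, pid", cmd.Process.Pid)
      }
      ```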
  2. 01 Apr, 2024 1 commit
    • Switch back to subprocessing for llama.cpp · 58d95cc9
      Daniel Hiltgen authored
      This should resolve a number of memory-leak and stability defects by
      letting us isolate llama.cpp in a separate process, shut it down when
      idle, and gracefully restart it if it has problems. It also serves as a
      first step toward running multiple copies to support multiple models
      concurrently. (A sketch of this supervision pattern follows below.)
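      A minimal Go sketch of the supervision pattern this commit describes, not
      the project's actual implementation: start the llama.cpp server process on
      demand, restart it on the next request if it exited, and stop it after an
      idle timeout. The runnerSupervisor type, the binary path, and the flags are
      illustrative assumptions.

      ```go
      // Sketch of a supervisor for an external llama.cpp server process.
      package main

      import (
          "log"
          "os/exec"
          "sync"
          "time"
      )

      type runnerSupervisor struct {
          mu          sync.Mutex
          cmd         *exec.Cmd
          path        string
          args        []string
          lastUsed    time.Time
          idleTimeout time.Duration
          stopping    bool
      }

      func newRunnerSupervisor(path string, args []string, idle time.Duration) *runnerSupervisor {
          return &runnerSupervisor{path: path, args: args, idleTimeout: idle, lastUsed: time.Now()}
      }

      // ensureRunning starts the subprocess if it is not already running; callers
      // invoke it before each request, so a crashed runner restarts transparently.
      func (s *runnerSupervisor) ensureRunning() error {
          s.mu.Lock()
          defer s.mu.Unlock()
          s.lastUsed = time.Now()
          if s.cmd != nil {
              return nil
          }
          cmd := exec.Command(s.path, s.args...)
          if err := cmd.Start(); err != nil {
              return err
          }
          s.cmd = cmd
          s.stopping = false
          go func() {
              // Reap the process when it exits and clear state so the next
              // request starts a fresh runner; log unexpected exits (crashes).
              err := cmd.Wait()
              s.mu.Lock()
              defer s.mu.Unlock()
              if s.cmd == cmd {
                  s.cmd = nil
              }
              if err != nil && !s.stopping {
                  log.Printf("runner exited unexpectedly: %v", err)
              }
          }()
          return nil
      }

      // idleLoop stops the runner once it has gone unused for idleTimeout.
      func (s *runnerSupervisor) idleLoop() {
          for range time.Tick(30 * time.Second) {
              s.mu.Lock()
              if s.cmd != nil && time.Since(s.lastUsed) > s.idleTimeout {
                  s.stopping = true
                  _ = s.cmd.Process.Kill() // a real implementation would shut down gracefully
                  s.cmd = nil
              }
              s.mu.Unlock()
          }
      }

      func main() {
          sup := newRunnerSupervisor("/path/to/llama-server", []string{"--port", "12345"}, 5*time.Minute)
          go sup.idleLoop()
          if err := sup.ensureRunning(); err != nil {
              log.Fatalf("failed to start runner: %v", err)
          }
          log.Println("runner is up; it restarts on crash and stops when idle")
          select {} // block forever in this sketch
      }
      ```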