"tests/vscode:/vscode.git/clone" did not exist on "b5e1facc85219770b6e85a9cfb2ec554167ccedc"
  1. 14 Feb, 2024 2 commits
  2. 12 Feb, 2024 3 commits
  3. 09 Feb, 2024 1 commit
    • Shutdown faster · 66807615
      Daniel Hiltgen authored
      Make sure that when a shutdown signal comes, we shut down quickly instead
      of waiting for a potentially long exchange to wrap up.
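      The commit above describes reacting to a shutdown signal immediately instead of letting an in-flight exchange run to completion. A minimal Go sketch of that pattern follows; the runExchange helper and its timings are illustrative assumptions, not the repository's actual code.

      ```go
      package main

      import (
      	"context"
      	"fmt"
      	"os"
      	"os/signal"
      	"syscall"
      	"time"
      )

      // runExchange stands in for a potentially long request/response exchange.
      // It returns early as soon as ctx is cancelled.
      func runExchange(ctx context.Context) error {
      	select {
      	case <-time.After(5 * time.Minute): // simulated long-running exchange
      		return nil
      	case <-ctx.Done():
      		return ctx.Err()
      	}
      }

      func main() {
      	// Cancel ctx as soon as SIGINT or SIGTERM arrives, so work aborts promptly.
      	ctx, stop := signal.NotifyContext(context.Background(), os.Interrupt, syscall.SIGTERM)
      	defer stop()

      	if err := runExchange(ctx); err != nil {
      		fmt.Fprintln(os.Stderr, "shutting down:", err)
      	}
      }
      ```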
  4. 08 Feb, 2024 1 commit
  5. 06 Feb, 2024 1 commit
  6. 02 Feb, 2024 1 commit
  7. 01 Feb, 2024 2 commits
  8. 31 Jan, 2024 1 commit
  9. 29 Jan, 2024 1 commit
  10. 25 Jan, 2024 3 commits
  11. 24 Jan, 2024 1 commit
  12. 23 Jan, 2024 2 commits
  13. 22 Jan, 2024 4 commits
  14. 21 Jan, 2024 3 commits
  15. 20 Jan, 2024 3 commits
  16. 19 Jan, 2024 4 commits
  17. 18 Jan, 2024 1 commit
  18. 17 Jan, 2024 1 commit
  19. 16 Jan, 2024 2 commits
    • Bump llama.cpp to b1842 and add new cuda lib dep · 795674dd
      Daniel Hiltgen authored
      Upstream llama.cpp has added a new dependency on the
      NVIDIA CUDA driver library (libcuda.so), which is part of the
      driver distribution, not the general CUDA libraries, and is not
      available as an archive, so we cannot statically link it. This may
      introduce some additional compatibility challenges which we'll
      need to keep an eye on.
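      Because libcuda.so ships with the NVIDIA driver rather than the CUDA toolkit and cannot be linked statically, it has to be located on the host at runtime. The Go sketch below illustrates one way to probe for it; the search paths are common driver locations chosen as assumptions, not the project's actual lookup logic.

      ```go
      package main

      import (
      	"fmt"
      	"path/filepath"
      )

      // findLibcuda scans a few common driver install locations for libcuda.so.
      func findLibcuda() (string, bool) {
      	patterns := []string{
      		"/usr/lib/x86_64-linux-gnu/libcuda.so*",
      		"/usr/lib64/libcuda.so*",
      		"/usr/lib/wsl/lib/libcuda.so*", // WSL2 exposes the Windows driver here
      	}
      	for _, p := range patterns {
      		if matches, _ := filepath.Glob(p); len(matches) > 0 {
      			return matches[0], true
      		}
      	}
      	return "", false
      }

      func main() {
      	if path, ok := findLibcuda(); ok {
      		fmt.Println("found CUDA driver library:", path)
      	} else {
      		fmt.Println("libcuda.so not found; GPU offload unavailable")
      	}
      }
      ```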
    • do not cache prompt (#2018) · a897e833
      Bruce MacDonald authored
      - prompt cache causes inference to hang after some time
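      For context on what this commit turns off: llama.cpp's example server accepts a cache_prompt flag on its /completion endpoint that controls whether the evaluated prompt state is reused between requests. A minimal Go sketch of sending a request with that flag disabled follows; the localhost:8080 address and the prompt text are assumptions for illustration.

      ```go
      package main

      import (
      	"bytes"
      	"encoding/json"
      	"fmt"
      	"net/http"
      )

      func main() {
      	body, _ := json.Marshal(map[string]any{
      		"prompt":       "Why is the sky blue?",
      		"n_predict":    64,
      		"cache_prompt": false, // do not reuse cached prompt state between requests
      	})

      	resp, err := http.Post("http://localhost:8080/completion", "application/json", bytes.NewReader(body))
      	if err != nil {
      		fmt.Println("request failed:", err)
      		return
      	}
      	defer resp.Body.Close()
      	fmt.Println("status:", resp.Status)
      }
      ```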
  20. 14 Jan, 2024 3 commits