"examples/llm/utils/protocol.py" did not exist on "c9130f8f8ce264379131e9ee2973534fe4cbf713"
  1. 21 Apr, 2025 1 commit
  2. 01 Apr, 2025 1 commit
  3. 14 Mar, 2025 1 commit
  4. 08 Mar, 2025 1 commit
  5. 05 Mar, 2025 1 commit
  6. 25 Feb, 2025 1 commit
  7. 22 Feb, 2025 1 commit
  8. 21 Feb, 2025 1 commit
      feat(tio): Distributed inference! (#235) · 32a748e4
      Graham King authored
      Add support in tio for distributed components and discovery.
      
      Node 1:
      ```
      tio in=http out=tdr://ns/backend/mistralrs
      ```
      
      Node 2:
      ```
      tio in=tdr://ns/backend/mistralrs out=mistralrs ~/llm_models/Llama-3.2-3B-Instruct
      ```
      
      This uses etcd to auto-discover the model and NATS to talk to it. You can run multiple workers on the same endpoint; each request picks one of them at random.
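      The pick-one-at-random dispatch described above can be sketched as follows. This is a hypothetical illustration, not tio's actual code: the `registry` dict stands in for etcd's discovery data, and `pick_worker` is an invented helper name.

      ```python
      import random

      # Stand-in for etcd: endpoint key -> workers registered under it.
      registry = {
          "ns/backend/mistralrs": ["worker-a", "worker-b", "worker-c"],
      }

      def pick_worker(endpoint: str) -> str:
          """Pick one registered worker at random for each request."""
          workers = registry.get(endpoint)
          if not workers:
              raise LookupError(f"no workers registered for {endpoint}")
          return random.choice(workers)
      ```

      Each call to `pick_worker("ns/backend/mistralrs")` may return a different worker, which is how load spreads across multiple workers on the same endpoint.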
      
      The `ns/backend/mistralrs` path is purely symbolic; pick anything, as long as it has three parts and matches the other node.
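      A minimal sketch of the three-part constraint, assuming a slash-separated path; `parse_endpoint` and the part names (`namespace`, `component`, `name`) are illustrative, not taken from tio:

      ```python
      def parse_endpoint(path: str) -> tuple[str, str, str]:
          # Only the three-part shape matters; the names themselves are symbolic.
          parts = path.split("/")
          if len(parts) != 3 or not all(parts):
              raise ValueError(f"endpoint must have exactly three parts: {path!r}")
          namespace, component, name = parts
          return namespace, component, name
      ```

      Both nodes must pass the same string so that they resolve to the same discovery key.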
  9. 18 Feb, 2025 1 commit
  10. 13 Feb, 2025 1 commit