# Examples

vLLM-Omni's examples are split into two categories:

- If you are using vLLM-Omni from within Python code, see the *Offline Inference* section.
- If you are using vLLM-Omni from an HTTP application or client, see the *Online Serving* section.