Commit bb7fa6cc authored by weishb's avatar weishb

update README.md

parent a808c524
Pipeline #3425 failed with stages
@@ -68,14 +68,14 @@ python inference.py
 ## start the server
 export VLLM_USE_FUSED_RMS_ROPE=0
-vllm serve Qwen3-ASR/Qwen3-ASR-1.7B \
+vllm serve /path/Qwen3-ASR/Qwen3-ASR-1.7B \
 --trust-remote-code \
 --limit-mm-per-prompt '{"audio": 1}'
 ## client request
 curl -X POST "http://127.0.0.1:8000/v1/audio/transcriptions" \
 -F "file=@/path/to/test.wav" \
-  -F "model=/public/home/weishb/Qwen3-ASR/Qwen3-ASR-1.7B"
+  -F "model=/path/Qwen3-ASR/Qwen3-ASR-1.7B"
 ```
 ## Demo
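For callers who prefer Python over curl, the same request can be sketched with `requests` (a third-party HTTP library). This mirrors the curl command above against the OpenAI-compatible `/v1/audio/transcriptions` endpoint; the base URL, audio file path, and model path are placeholders to be replaced with your own, and the `{"text": ...}` response shape follows the OpenAI transcription API that vLLM serves.

```python
import requests


def transcribe(audio_path: str,
               model: str = "/path/Qwen3-ASR/Qwen3-ASR-1.7B",
               base_url: str = "http://127.0.0.1:8000") -> str:
    """POST a WAV file to the transcription endpoint and return the text.

    Equivalent to:
        curl -X POST "<base_url>/v1/audio/transcriptions" \
             -F "file=@<audio_path>" -F "model=<model>"
    """
    with open(audio_path, "rb") as f:
        resp = requests.post(
            f"{base_url}/v1/audio/transcriptions",
            files={"file": f},      # matches curl's -F "file=@..."
            data={"model": model},  # matches curl's -F "model=..."
        )
    resp.raise_for_status()
    return resp.json()["text"]
```

Usage: `transcribe("/path/to/test.wav")` once the server from the diff above is running.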