Commit 0db4cc32 authored by weishb

Update README version

parent dcaef84a
Pipeline #3550 failed with stages in 0 seconds
@@ -52,6 +52,15 @@ docker run -it \
More container images are available for download from [光源](https://sourcefind.cn/#/service-list).
The specialized deep learning libraries required by this project for DCU cards can be downloaded and installed from the [光合](https://developer.sourcefind.cn/tool/) developer community.
## Pretrained Weights
**Please download the model that matches your card according to `Supported DCU Models`. FP8 models are supported only on the BW1100/BW1101; do not use them on other DCU models!**
| **Model** | **Weight Size** | **Data Type** | **Supported DCU Models** | **Minimum Cards** | **Download** |
| :------------------: | :----------: | :----------: | :---------------: | :--------------: | :----------------------------------------------------------: |
| Qwen3-Next-80B-A3B-Instruct | 80B | BF16 | BW1000,K100AI | 4 | [ModelScope](https://www.modelscope.cn/models/Qwen/Qwen3-Next-80B-A3B-Instruct) |
| Qwen3-Next-80B-A3B-Thinking | 80B | BF16 | BW1000,K100AI | 4 | [ModelScope](https://www.modelscope.cn/models/Qwen/Qwen3-Next-80B-A3B-Thinking) |
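As a convenience, the weights in the table above can be fetched from the command line. A minimal sketch, assuming the `modelscope` CLI is installed (`pip install modelscope`); the local directory path is an assumption, adjust it to your storage layout:

```shell
# Sketch: download the BF16 Instruct weights from ModelScope.
# The --local_dir value is an assumed example path.
modelscope download --model Qwen/Qwen3-Next-80B-A3B-Instruct \
    --local_dir ./Qwen3-Next-80B-A3B-Instruct
```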
## Dataset
`None`
@@ -62,12 +71,12 @@ docker run -it \
## Inference
### vllm
#### Single-Node Inference
Note: when starting the service on a K100 AI, add the `--disable-custom-all-reduce` flag.
```bash
# start the server
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
vllm serve Qwen/Qwen3-Next-80B-A3B-Thinking \
@@ -79,7 +88,7 @@ vllm serve Qwen/Qwen3-Next-80B-A3B-Thinking \
--max-model-len 8192 \
--port 8000
# client request
curl http://127.0.0.1:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
```

@@ -100,16 +109,6 @@ curl http://127.0.0.1:8000/v1/chat/completions \
`DCU accuracy is consistent with GPU; inference framework: vllm.`
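The same request shown in the curl example can be issued from Python. A minimal sketch using only the standard library; the model name, host, and port are assumptions carried over from the serve command above:

```python
import json
from urllib import request

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> bytes:
    """Build an OpenAI-compatible chat-completions payload, mirroring the
    curl example above."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps(payload).encode("utf-8")

def chat(prompt: str, base_url: str = "http://127.0.0.1:8000") -> str:
    # Endpoint and model name are assumptions taken from the serve step.
    body = build_chat_request("Qwen/Qwen3-Next-80B-A3B-Thinking", prompt)
    req = request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]
```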
## Pretrained Weights
| **Model** | **Weight Size** | **DCU Model** | **Minimum Cards** | **Download** |
| :------------------: | :----------: | :-----------: | :--------------: | :----------------------------------------------------------: |
| Qwen3-Next-80B-A3B-Instruct | 80B | BW1000,K100AI | 4 | [ModelScope](https://www.modelscope.cn/models/Qwen/Qwen3-Next-80B-A3B-Instruct) |
| Qwen3-Next-80B-A3B-Thinking | 80B | BW1000,K100AI | 4 | [ModelScope](https://www.modelscope.cn/models/Qwen/Qwen3-Next-80B-A3B-Thinking) |
## Source Repository and Issue Reporting
- https://developer.sourcefind.cn/codes/modelzoo/qwen3-next-80b-a3b_vllm
# No additional pip dependencies.