ModelZoo / Qwen3-Reranker_pytorch
Commit 63aaaabc, authored Aug 13, 2025 by chenych
Update to vllm-0.9.2
parent 2d237d09
Showing 2 changed files with 25 additions and 10 deletions:
README.md (+24, -9)
docker/Dockerfile (+1, -1)
README.md
@@ -19,11 +19,10 @@ The Qwen3 Embedding model series is the latest proprietary model of the Qwen3 family, designed specifically for text embedding…
### Docker (Method 1)
```bash
-docker pull image.sourcefind.cn:5000/dcu/admin/base/vllm:0.8.5-ubuntu22.04-dtk25.04.1-rc5-das1.6-py3.10-20250711
+docker pull image.sourcefind.cn:5000/dcu/admin/base/vllm:0.9.2-ubuntu22.04-dtk25.04.1-rc5-rocblas101839-0811-das1.6-py3.10-20250812-beta
docker run -it --shm-size 200g --network=host --name {docker_name} --privileged --device=/dev/kfd --device=/dev/dri --device=/dev/mkfd --group-add video --cap-add=SYS_PTRACE --security-opt seccomp=unconfined -u root -v /path/your_code_data/:/path/your_code_data/ -v /opt/hyhal/:/opt/hyhal/:ro {imageID} bash
cd /your_code_path/qwen3-reranker_pytorch
pip install "transformers>=4.51.0"
```
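As an editorial aside (not part of the repository), once the container is up it is worth confirming that the DCU devices mapped in by the `--device` flags are actually visible to PyTorch. The sketch below assumes the DAS/DTK PyTorch build exposes them through the standard `torch.cuda` interface, as ROCm-derived builds typically do.

```python
# Editorial sanity check, run inside the container started above.
# Assumption: the DAS/DTK PyTorch build exposes DCU devices through the
# standard torch.cuda interface (as ROCm-derived builds usually do).
import torch

print("visible devices:", torch.cuda.device_count())
if torch.cuda.is_available():
    print("device 0:", torch.cuda.get_device_name(0))
else:
    print("no DCU devices visible; check the --device and --group-add flags")
```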
### Dockerfile (Method 2)
@@ -33,7 +32,6 @@ docker build --no-cache -t qwen3-reranker:latest .
docker run -it --shm-size 200g --network=host --name {docker_name} --privileged --device=/dev/kfd --device=/dev/dri --device=/dev/mkfd --group-add video --cap-add=SYS_PTRACE --security-opt seccomp=unconfined -u root -v /path/your_code_data/:/path/your_code_data/ -v /opt/hyhal/:/opt/hyhal/:ro {imageID} bash
cd /your_code_path/qwen3-reranker_pytorch
pip install "transformers>=4.51.0"
```
### Anaconda (Method 3)
@@ -41,9 +39,9 @@ pip install transformers>=4.51.0
```bash
DTK: 25.04
python: 3.10
-vllm: 0.8.5
+vllm: 0.9.2+das.opt1.beta.dtk25041
-torch: 2.4.1+das.opt2.dtk2504
+torch: 2.5.1+das.opt1.dtk25041
-deepspeed: 0.14.2+das.opt2.dtk2504
+deepspeed: 0.14.2+das.opt1.dtk25041
```
`Tips: the DTK driver, python, torch, and other DCU-related tool versions listed above must match one another exactly.`
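Because these DCU builds have to line up exactly, a quick editorial sketch (not part of the project) for checking what is actually installed against the table above, using the package names as published on PyPI:

```python
# Print the installed version strings and compare them by eye with the
# table above (e.g. torch should report 2.5.1+das.opt1.dtk25041).
import importlib.metadata as md

for pkg in ("vllm", "torch", "deepspeed", "transformers"):
    try:
        print(f"{pkg}: {md.version(pkg)}")
    except md.PackageNotFoundError:
        print(f"{pkg}: not installed")
```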
@@ -60,17 +58,34 @@ pip install transformers>=4.51.0
## Inference
### vLLM inference method
-vLLM 0.8.5 does not support launching inference in serve mode; for the offline approach, see the project script `infer_vllm.py`.
#### offline
```bash
## The HF_ENDPOINT environment variable must be set
export HF_ENDPOINT=https://hf-mirror.com
export VLLM_USE_NN=0
export ALLREDUCE_STREAM_WITH_COMPUTE=1
## model_name_or_path: path to the model
python infer_vllm.py --model_name_or_path /path/your_model_path/
```
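The repository's `infer_vllm.py` is the reference for offline inference. As an editorial sketch only, offline reranker scoring with vLLM's Python API generally looks like the following; the API shape is that of upstream vLLM 0.9.x, the model path, query, and documents are placeholders, and the `hf_overrides` mirror the ones used in the serve command below.

```python
# Editorial sketch, not the project's infer_vllm.py. Shows the general shape
# of offline reranker scoring with vLLM's Python API (vLLM 0.9.x).
from vllm import LLM

llm = LLM(
    model="/path/your_model_path/",   # placeholder, same as --model_name_or_path
    task="score",
    enforce_eager=True,
    hf_overrides={
        "architectures": ["Qwen3ForSequenceClassification"],
        "classifier_from_token": ["no", "yes"],
        "is_original_qwen3_reranker": True,
    },
)

query = "What is the capital of China?"
documents = ["The capital of China is Beijing.", "Gravity makes apples fall."]

# Higher score means the document is judged more relevant to the query.
outputs = llm.score(query, documents)
for doc, out in zip(documents, outputs):
    print(f"{out.outputs.score:.4f}  {doc}")
```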
#### serve
```bash
export HF_ENDPOINT=https://hf-mirror.com
export VLLM_USE_NN=0
export ALLREDUCE_STREAM_WITH_COMPUTE=1
vllm serve Qwen/Qwen3-Reranker-0.6B --max-model-len 4096 --trust-remote-code --enforce-eager --enable-prefix-caching --served-model-name Qwen3-reranker --task score --disable-log-requests --hf_overrides '{"architectures":["Qwen3ForSequenceClassification"],"classifier_from_token": ["no", "yes"],"is_original_qwen3_reranker": true}'
```
Test command:
```bash
curl http://127.0.0.1:8000/score -H 'accept: application/json' -H 'Content-Type: application/json' -d '{"text_1": "ping", "text_2": "pong", "model": "Qwen3-reranker"}'
```
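For reference, the same request can be issued from Python; this is an editorial sketch using the `requests` library against the endpoint and payload shown in the curl command above, with the host and port assumed to match the `vllm serve` defaults.

```python
# Editorial sketch: POST the same payload as the curl test command above.
import requests

resp = requests.post(
    "http://127.0.0.1:8000/score",
    headers={"accept": "application/json", "Content-Type": "application/json"},
    json={"text_1": "ping", "text_2": "pong", "model": "Qwen3-reranker"},
)
resp.raise_for_status()
print(resp.json())  # the response carries the relevance score(s)
```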
## result
<div align=center>
<img src="./doc/results-dcu.png" />
...
docker/Dockerfile
-FROM image.sourcefind.cn:5000/dcu/admin/base/custom:vllm0.8.5-ubuntu22.04-dtk25.04-rc7-das1.5-py3.10-20250612-fixpy-rocblas0611-rc2
+FROM image.sourcefind.cn:5000/dcu/admin/base/vllm:0.9.2-ubuntu22.04-dtk25.04.1-rc5-rocblas101839-0811-das1.6-py3.10-20250812-beta
\ No newline at end of file