# Use Models From ModelScope

To use a model from [ModelScope](https://www.modelscope.cn), set the environment variable `SGLANG_USE_MODELSCOPE` to `true`.

```bash
export SGLANG_USE_MODELSCOPE=true
```
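This assumes the `modelscope` Python package is available in the environment that launches the server, since SGLang relies on it for the download. If it is not already installed, a minimal sketch:

```bash
# Install the ModelScope client library used for model downloads
pip install modelscope
```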

We take [Qwen2-7B-Instruct](https://www.modelscope.cn/models/qwen/qwen2-7b-instruct) as an example.

Launch the server:

```bash
python -m sglang.launch_server --model-path qwen/Qwen2-7B-Instruct --port 30000
```

Or launch it with Docker:

```bash
docker run --gpus all \
    -p 30000:30000 \
    -v ~/.cache/modelscope:/root/.cache/modelscope \
    --env "SGLANG_USE_MODELSCOPE=true" \
    --ipc=host \
    lmsysorg/sglang:latest \
    python3 -m sglang.launch_server --model-path qwen/Qwen2-7B-Instruct --host 0.0.0.0 --port 30000
```
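Once the server is up (launched either way), you can send it a request. The sketch below assumes SGLang's OpenAI-compatible `/v1/chat/completions` endpoint and that the served model name defaults to the model path:

```bash
# Send a chat request to the local server's OpenAI-compatible endpoint
curl http://localhost:30000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "qwen/Qwen2-7B-Instruct",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}]
    }'
```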

Note that ModelScope uses a different cache directory (`~/.cache/modelscope` by default) than Hugging Face. You may need to set it manually to avoid running out of disk space.
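
For example, assuming your ModelScope installation honors the `MODELSCOPE_CACHE` environment variable (the path below is only illustrative), you can redirect downloads to a larger disk:

```bash
# Point the ModelScope cache at a disk with more space before launching the server
export MODELSCOPE_CACHE=/data/modelscope_cache
export SGLANG_USE_MODELSCOPE=true
python -m sglang.launch_server --model-path qwen/Qwen2-7B-Instruct --port 30000
```

With Docker, the same effect can be achieved by pointing the host side of the `-v` mount at the larger disk.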