change / sglang / Commits / f42e9bfb

Unverified commit f42e9bfb, authored Sep 29, 2024 by Kylin, committed by GitHub Sep 28, 2024

[bugfix] Add modelscope package to avoid docker image without modelscope (#1520)

parent 840c5dbc
Changes: 3 changed files, with 16 additions and 1 deletion

- README.md (+11, -0)
- docker/compose.yaml (+4, -0)
- python/pyproject.toml (+1, -1)
README.md (view file @ f42e9bfb)
...
@@ -286,6 +286,17 @@ Launch [Qwen2-7B-Instruct](https://www.modelscope.cn/models/qwen/qwen2-7b-instru
```
SGLANG_USE_MODELSCOPE=true python -m sglang.launch_server --model-path qwen/Qwen2-7B-Instruct --port 30000
```
Or start it with Docker:
```bash
docker run --gpus all \
    -p 30000:30000 \
    -v ~/.cache/modelscope:/root/.cache/modelscope \
    --env "SGLANG_USE_MODELSCOPE=true" \
    --ipc=host \
    lmsysorg/sglang:latest \
    python3 -m sglang.launch_server --model-path Qwen/Qwen2.5-7B-Instruct --host 0.0.0.0 --port 30000
```
</details>
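The `SGLANG_USE_MODELSCOPE` variable above switches the model-download backend from the Hugging Face Hub to ModelScope. A minimal sketch of how such a switch could be implemented; the function name and accepted truthy values are hypothetical, not sglang's actual code:

```python
def pick_download_backend(env: dict) -> str:
    """Choose a model-download backend from SGLANG_USE_MODELSCOPE.

    Sketch only: a truthy value ("true", "1", "yes") selects ModelScope;
    anything else falls back to the Hugging Face Hub.
    """
    value = env.get("SGLANG_USE_MODELSCOPE", "").strip().lower()
    return "modelscope" if value in ("true", "1", "yes") else "huggingface"


# The docker run command above passes --env "SGLANG_USE_MODELSCOPE=true":
print(pick_download_backend({"SGLANG_USE_MODELSCOPE": "true"}))  # modelscope
print(pick_download_backend({}))                                 # huggingface
```

Note that the cache mount (`~/.cache/modelscope`) only matters when the ModelScope branch is taken, which is why the README pairs the `-v` flag with the `--env` flag.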
...
...
docker/compose.yaml (view file @ f42e9bfb)
...
...
@@ -4,6 +4,8 @@ services:
    container_name: sglang
    volumes:
      - ${HOME}/.cache/huggingface:/root/.cache/huggingface
      # If you use ModelScope, you need to mount this directory
      # - ${HOME}/.cache/modelscope:/root/.cache/modelscope
    restart: always
    network_mode: host # Or you can only publish port 30000
...
@@ -11,6 +13,8 @@ services:
      # - 30000:30000
    environment:
      HF_TOKEN: <secret>
      # If you use ModelScope to download the model, set this environment variable
      # - SGLANG_USE_MODELSCOPE: true
    entrypoint: python3 -m sglang.launch_server
    command: --model-path meta-llama/Meta-Llama-3.1-8B-Instruct
...
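Putting the two compose hunks together, this is roughly what the service looks like with the commented ModelScope lines enabled. The `image` line is an assumption (it falls outside the hunks shown), so treat this as a sketch rather than the complete upstream file:

```yaml
services:
  sglang:
    image: lmsysorg/sglang:latest   # assumed; not part of the diff above
    container_name: sglang
    volumes:
      - ${HOME}/.cache/modelscope:/root/.cache/modelscope
    restart: always
    network_mode: host
    environment:
      SGLANG_USE_MODELSCOPE: "true"
    entrypoint: python3 -m sglang.launch_server
    command: --model-path meta-llama/Meta-Llama-3.1-8B-Instruct
```

With this in place, `docker compose up -d` starts the server and reuses models already cached under `~/.cache/modelscope` on the host.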
python/pyproject.toml (view file @ f42e9bfb)
...
...
@@ -23,7 +23,7 @@ dependencies = [
srt = ["aiohttp", "decord", "fastapi", "hf_transfer", "huggingface_hub", "interegular", "packaging", "pillow", "psutil", "pydantic", "python-multipart", "torch", "torchao", "uvicorn", "uvloop", "zmq",
-    "vllm==0.5.5", "outlines>=0.0.44"]
+    "vllm==0.5.5", "outlines>=0.0.44", "modelscope"]
openai = ["openai>=1.0", "tiktoken"]
anthropic = ["anthropic>=0.20.0"]
litellm = ["litellm>=1.0.0"]
...
...