Commit bc068e96 (unverified), authored Sep 24, 2024 by Lianmin Zheng; committed by GitHub on Sep 24, 2024

[CI] Move AMD test to a separate file (#1500)

Parent: 8d4ed42a
Changes: 2 changed files with 50 additions and 23 deletions

- .github/workflows/pr-test-amd.yml (+50, −0)
- .github/workflows/pr-test.yml (+0, −23)
.github/workflows/pr-test-amd.yml (new file, mode 100644)

```yaml
name: PR Test (AMD)

on:
  push:
    branches: [ main ]
    paths:
      - "python/sglang/**"
      - "test/**"
  pull_request:
    branches: [ main ]
    paths:
      - "python/sglang/**"
      - "test/**"
  workflow_dispatch:

concurrency:
  group: pr-test-${{ github.ref }}
  cancel-in-progress: true

jobs:
  accuracy-test-1-gpu:
    if: github.repository == 'sgl-project/sglang' || github.event_name == 'pull_request'
    runs-on: 1-gpu-runner-amd
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Install dependencies
        run: |
          pip install --upgrade pip
          pip install -e "python[all]" --no-deps
          git clone https://github.com/merrymercy/human-eval.git
          cd human-eval
          pip install -e .

      - name: Evaluate Accuracy
        timeout-minutes: 20
        run: |
          cd test/srt
          python3 test_eval_accuracy_large.py

  finish:
    needs: [accuracy-test-1-gpu]
    runs-on: ubuntu-latest
    steps:
      - name: Finish
        run: echo "This is an empty step to ensure that all jobs are completed."
```
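The `concurrency` block keys runs on `pr-test-${{ github.ref }}` with `cancel-in-progress: true`, so a newer push to the same ref cancels the in-flight run. A minimal sketch of how that group key expands (the helper is illustrative, not part of Actions; the refs shown are the standard GitHub-provided values):

```python
def concurrency_group(ref: str) -> str:
    """Mirror the workflow expression `pr-test-${{ github.ref }}`."""
    return f"pr-test-{ref}"

# A pull_request event uses the merge ref; a push to main uses the branch ref.
print(concurrency_group("refs/pull/1500/merge"))  # pr-test-refs/pull/1500/merge
print(concurrency_group("refs/heads/main"))       # pr-test-refs/heads/main
```

Two pushes to the same PR therefore map to the same group, and only the newest run survives.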
.github/workflows/pr-test.yml

```diff
@@ -187,7 +187,6 @@ jobs:
       cd test/srt
       python3 -m unittest test_bench_latency.TestBenchLatency.test_moe_default

   accuracy-test-1-gpu:
     if: github.repository == 'sgl-project/sglang' || github.event_name == 'pull_request'
     runs-on: 1-gpu-runner
@@ -247,28 +246,6 @@ jobs:
       cd test/srt
       python3 test_data_parallelism.py

-  accuracy-test-1-gpu-amd:
-    if: github.repository == 'sgl-project/sglang' || github.event_name == 'pull_request'
-    runs-on: 1-gpu-runner-amd
-    steps:
-      - name: Checkout code
-        uses: actions/checkout@v3
-      - name: Install dependencies
-        run: |
-          pip install --upgrade pip
-          pip install -e "python[all]" --no-deps
-          git clone https://github.com/merrymercy/human-eval.git
-          cd human-eval
-          pip install -e .
-      - name: Evaluate Accuracy
-        timeout-minutes: 20
-        run: |
-          cd test/srt
-          python3 test_eval_accuracy_large.py
-
   finish:
     needs: [
       unit-test-frontend,
       unit-test-backend-part-1,
       unit-test-backend-part-2,
       unit-test-backend-part-3,
       ...
```
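Both workflows end with the same fan-in pattern: a `finish` job that lists every other job in `needs:`, so a single required status can cover all of them. A minimal sketch of that gating logic (a hypothetical helper for illustration, not a GitHub Actions API):

```python
def can_run_finish(needs: list[str], completed: set[str]) -> bool:
    """The `finish` job becomes eligible only once every job in `needs` is done."""
    return all(job in completed for job in needs)

# Mirrors the new pr-test-amd.yml, where finish needs only accuracy-test-1-gpu.
needs = ["accuracy-test-1-gpu"]
print(can_run_finish(needs, set()))                    # False
print(can_run_finish(needs, {"accuracy-test-1-gpu"}))  # True
```

Moving the AMD job into its own workflow means the `finish` job in pr-test.yml no longer waits on the AMD runner.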