OpenDAS / diffusers · Commits · Commit c1edb03c (Unverified)
Authored May 01, 2024 by Dhruv Nair; committed by GitHub on May 01, 2024

Fix for pipeline slow test fetcher (#7824)

* update
* update

Parent: 0d083702
Showing 2 changed files with 23 additions and 23 deletions (+23 / -23):

.github/workflows/nightly_tests.yml  +22 / -22
.github/workflows/push_tests.yml     +1 / -1
.github/workflows/nightly_tests.yml (view file @ c1edb03c)

@@ -19,7 +19,7 @@ env:
 jobs:
   setup_torch_cuda_pipeline_matrix:
     name: Setup Torch Pipelines Matrix
-    runs-on: ubuntu-latest
+    runs-on: diffusers/diffusers-pytorch-cpu
     outputs:
       pipeline_test_matrix: ${{ steps.fetch_pipeline_matrix.outputs.pipeline_test_matrix }}
     steps:
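The hunk above references a `fetch_pipeline_matrix` step only through its output; the step itself is outside this diff. As a rough sketch of the pattern (the script name and output shape here are assumptions, not the repository's actual step), such a step writes a JSON list to `$GITHUB_OUTPUT` so that later jobs can read it:

    steps:
      - name: Checkout diffusers
        uses: actions/checkout@v3
      - name: Fetch pipeline test matrix
        id: fetch_pipeline_matrix
        # Hypothetical sketch: the helper script and its output format are assumed.
        run: |
          matrix=$(python utils/fetch_torch_cuda_pipeline_test_matrix.py)
          echo "pipeline_test_matrix=$matrix" >> $GITHUB_OUTPUT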
@@ -67,19 +67,19 @@ jobs:
           fetch-depth: 2
       - name: NVIDIA-SMI
         run: nvidia-smi
       - name: Install dependencies
         run: |
           python -m venv /opt/venv && export PATH="/opt/venv/bin:$PATH"
           python -m uv pip install -e [quality,test]
           python -m uv pip install accelerate@git+https://github.com/huggingface/accelerate.git
           python -m uv pip install pytest-reportlog
       - name: Environment
         run: |
           python utils/print_env.py
-      - name: Nightly PyTorch CUDA checkpoint (pipelines) tests
+      - name: Nightly PyTorch CUDA checkpoint (pipelines) tests
         env:
           HUGGING_FACE_HUB_TOKEN: ${{ secrets.HUGGING_FACE_HUB_TOKEN }}
           # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
@@ -88,9 +88,9 @@ jobs:
           python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
             -s -v -k "not Flax and not Onnx" \
             --make-reports=tests_pipeline_${{ matrix.module }}_cuda \
-            --report-log=tests_pipeline_${{ matrix.module }}_cuda.log \
+            --report-log=tests_pipeline_${{ matrix.module }}_cuda.log \
             tests/pipelines/${{ matrix.module }}
       - name: Failure short reports
         if: ${{ failure() }}
         run: |
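For context on how `matrix.module` in the pytest command above is populated: a downstream test job consumes the fetcher job's output via `needs` and `fromJson`. A minimal sketch of that wiring, with placeholder job and runner names rather than the workflow's actual values:

    torch_pipelines_cuda_tests:        # hypothetical job name
      name: Torch Pipelines CUDA Tests
      needs: setup_torch_cuda_pipeline_matrix
      runs-on: docker-gpu              # placeholder runner label
      strategy:
        fail-fast: false
        matrix:
          module: ${{ fromJson(needs.setup_torch_cuda_pipeline_matrix.outputs.pipeline_test_matrix) }}
      steps:
        - name: Run tests for one pipeline module
          run: |
            python -m pytest -n 1 -s -v tests/pipelines/${{ matrix.module }}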
@@ -103,7 +103,7 @@ jobs:
         with:
           name: pipeline_${{ matrix.module }}_test_reports
           path: reports
       - name: Generate Report and Notify Channel
         if: always()
         run: |
@@ -139,7 +139,7 @@ jobs:
         run: python utils/print_env.py
       - name: Run nightly PyTorch CUDA tests for non-pipeline modules
-        if: ${{ matrix.module != 'examples'}}
+        if: ${{ matrix.module != 'examples'}}
         env:
           HUGGING_FACE_HUB_TOKEN: ${{ secrets.HUGGING_FACE_HUB_TOKEN }}
           # https://pytorch.org/docs/stable/notes/randomness.html#avoiding-nondeterministic-algorithms
@@ -148,7 +148,7 @@ jobs:
           python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
             -s -v -k "not Flax and not Onnx" \
             --make-reports=tests_torch_${{ matrix.module }}_cuda \
-            --report-log=tests_torch_${{ matrix.module }}_cuda.log \
+            --report-log=tests_torch_${{ matrix.module }}_cuda.log \
             tests/${{ matrix.module }}
       - name: Run nightly example tests with Torch
@@ -161,13 +161,13 @@ jobs:
           python -m uv pip install peft@git+https://github.com/huggingface/peft.git
           python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
             -s -v --make-reports=examples_torch_cuda \
-            --report-log=examples_torch_cuda.log \
+            --report-log=examples_torch_cuda.log \
             examples/
       - name: Failure short reports
         if: ${{ failure() }}
         run: |
-          cat reports/tests_torch_${{ matrix.module }}_cuda_stats.txt
+          cat reports/tests_torch_${{ matrix.module }}_cuda_stats.txt
           cat reports/tests_torch_${{ matrix.module }}_cuda_failures_short.txt
       - name: Test suite reports artifacts
@@ -218,13 +218,13 @@ jobs:
           python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
             -s -v -k "not Flax and not Onnx" \
             --make-reports=tests_torch_lora_cuda \
-            --report-log=tests_torch_lora_cuda.log \
+            --report-log=tests_torch_lora_cuda.log \
             tests/lora
       - name: Failure short reports
         if: ${{ failure() }}
         run: |
-          cat reports/tests_torch_lora_cuda_stats.txt
+          cat reports/tests_torch_lora_cuda_stats.txt
           cat reports/tests_torch_lora_cuda_failures_short.txt
       - name: Test suite reports artifacts
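The "Test suite reports artifacts" steps named in these hunks upload the `reports/` directory; the `uses:` line itself falls outside the visible hunk boundaries. A plausible shape for such a step, with the action version and artifact name assumed rather than taken from this diff:

      - name: Test suite reports artifacts
        if: ${{ always() }}
        uses: actions/upload-artifact@v2     # version assumed; not shown in this diff
        with:
          name: torch_lora_cuda_test_reports # placeholder artifact name
          path: reports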
@@ -239,12 +239,12 @@ jobs:
         run: |
           pip install slack_sdk tabulate
           python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY
   run_flax_tpu_tests:
     name: Nightly Flax TPU Tests
     runs-on: docker-tpu
     if: github.event_name == 'schedule'
     container:
       image: diffusers/diffusers-flax-tpu
       options: --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/ --privileged
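`$GITHUB_STEP_SUMMARY` in the notification step above is a file provided by the Actions runner; any Markdown appended to it is rendered on the job's summary page. A minimal sketch of that mechanism (the heading line is an illustrative addition; the actual reporting lives in `scripts/log_reports.py`):

      - name: Generate Report and Notify Channel
        if: always()
        run: |
          pip install slack_sdk tabulate
          # Illustrative extra line: Markdown appended to this file appears on the run's summary page.
          echo "### Nightly pipeline test report" >> $GITHUB_STEP_SUMMARY
          python scripts/log_reports.py >> $GITHUB_STEP_SUMMARY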
@@ -274,7 +274,7 @@ jobs:
           python -m pytest -n 0 \
             -s -v -k "Flax" \
             --make-reports=tests_flax_tpu \
-            --report-log=tests_flax_tpu.log \
+            --report-log=tests_flax_tpu.log \
             tests/
       - name: Failure short reports
@@ -302,7 +302,7 @@ jobs:
     container:
       image: diffusers/diffusers-onnxruntime-cuda
       options: --gpus 0 --shm-size "16gb" --ipc host -v /mnt/hf_cache:/mnt/cache/
     steps:
       - name: Checkout diffusers
         uses: actions/checkout@v3
@@ -321,7 +321,7 @@ jobs:
       - name: Environment
         run: python utils/print_env.py
       - name: Run nightly ONNXRuntime CUDA tests
         env:
           HUGGING_FACE_HUB_TOKEN: ${{ secrets.HUGGING_FACE_HUB_TOKEN }}
@@ -329,7 +329,7 @@ jobs:
           python -m pytest -n 1 --max-worker-restart=0 --dist=loadfile \
             -s -v -k "Onnx" \
             --make-reports=tests_onnx_cuda \
-            --report-log=tests_onnx_cuda.log \
+            --report-log=tests_onnx_cuda.log \
             tests/
       - name: Failure short reports
@@ -344,7 +344,7 @@ jobs:
         with:
           name: ${{ matrix.config.report }}_test_reports
           path: reports
       - name: Generate Report and Notify Channel
         if: always()
         run: |
.github/workflows/push_tests.yml (view file @ c1edb03c)

@@ -21,7 +21,7 @@ env:
 jobs:
   setup_torch_cuda_pipeline_matrix:
     name: Setup Torch Pipelines CUDA Slow Tests Matrix
-    runs-on: ubuntu-latest
+    runs-on: diffusers/diffusers-pytorch-cpu
     outputs:
       pipeline_test_matrix: ${{ steps.fetch_pipeline_matrix.outputs.pipeline_test_matrix }}
     steps:
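The new `runs-on: diffusers/diffusers-pytorch-cpu` value passes a Docker image name as a runner label, which only resolves if a runner is registered under that label. For comparison only, and not as what this commit does, the more conventional way to run the fetcher job inside that image is to pair a standard runner label with a `container:` block:

    setup_torch_cuda_pipeline_matrix:
      name: Setup Torch Pipelines CUDA Slow Tests Matrix
      runs-on: ubuntu-latest            # or a self-hosted runner label
      container:
        image: diffusers/diffusers-pytorch-cpu
      outputs:
        pipeline_test_matrix: ${{ steps.fetch_pipeline_matrix.outputs.pipeline_test_matrix }}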