## Training examples
### Flowers DDPM
The command to train a DDPM UNet model on the Oxford Flowers dataset:
```bash
python -m torch.distributed.launch --nproc_per_node 4 train_ddpm.py \
  --dataset="huggan/flowers-102-categories" \
  --resolution=64 \
  --output_path="flowers-ddpm" \
  --batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --lr=1e-4 \
  --warmup_steps=500 \
  --mixed_precision=no
```
A full training run takes 2 hours on 4x V100 GPUs.
<img src="https://user-images.githubusercontent.com/26864830/173855866-5628989f-856b-4725-a944-d6c09490b2df.png" width="500" />
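
Once a run finishes, the model written to `--output_path` can be used to generate new samples. A minimal sketch, assuming the output directory is loadable with `DDPMPipeline` from a recent `diffusers` release (the exact loading API may differ in the `diffusers` version this script targets):

```python
# Sampling sketch; assumes the training output in "flowers-ddpm" is a
# directory that DDPMPipeline.from_pretrained can load. This is an
# assumption, not something guaranteed by train_ddpm.py itself.
from diffusers import DDPMPipeline

pipeline = DDPMPipeline.from_pretrained("flowers-ddpm")

# Run the full reverse diffusion loop and save the first generated image.
image = pipeline(batch_size=1).images[0]
image.save("flowers-sample.png")
```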
### Pokemon DDPM
The command to train a DDPM UNet model on the Pokemon dataset:
```bash
python -m torch.distributed.launch --nproc_per_node 4 train_ddpm.py \
  --dataset="huggan/pokemon" \
  --resolution=64 \
  --output_path="pokemon-ddpm" \
  --batch_size=16 \
  --num_epochs=100 \
  --gradient_accumulation_steps=1 \
  --lr=1e-4 \
  --warmup_steps=500 \
  --mixed_precision=no
```
A full training run takes 2 hours on 4x V100 GPUs.
<img src="https://user-images.githubusercontent.com/26864830/173856733-4f117f8c-97bd-4f51-8002-56b488c96df9.png" width="500" />
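
Both commands stream their training images from the Hugging Face Hub. A quick way to inspect a dataset before committing to a multi-hour run, sketched with the `datasets` library (the dataset names match the `--dataset` flags above; that the image column is named `image` is an assumption):

```python
# Preview a training dataset before launching a run.
from datasets import load_dataset

dataset = load_dataset("huggan/pokemon", split="train")
print(dataset)  # prints the number of rows and the column names

# Save one example image to disk; assumes an "image" column holding
# PIL images, as is common for huggan datasets.
dataset[0]["image"].save("pokemon-preview.png")
```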