- 17 Feb, 2023 3 commits
-
Will Berman authored
* add xformers 0.0.16 warning message
* fix version check to check whole version string
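For context on the second item: checking only a prefix of a version string (or comparing raw strings) misorders multi-digit components, which is why the whole parsed version must be compared. A minimal sketch, using a hypothetical `parse_version` helper rather than the repository's actual check:

```python
def parse_version(v: str) -> tuple:
    """Split a dotted version string into a tuple of ints for numeric comparison."""
    return tuple(int(part) for part in v.split("."))

# Lexicographic string comparison gets this wrong: "0.0.16" sorts *before*
# "0.0.9" as a string, but after it numerically.
assert "0.0.16" < "0.0.9"
assert parse_version("0.0.16") > parse_version("0.0.9")

# Matching an exact release (e.g. to warn about 0.0.16) also needs the parsed form.
assert parse_version("0.0.16") == (0, 0, 16)
```

In real code the same job is usually done with `packaging.version.parse`, which additionally handles pre-release and post-release suffixes.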
-
Will Berman authored
This reverts commit 024c4376.
-
Patrick von Platen authored
-
- 16 Feb, 2023 2 commits
-
Will Berman authored
-
Will Berman authored
* add total number checkpoints to training scripts
* Update examples/dreambooth/train_dreambooth.py

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
-
- 13 Feb, 2023 1 commit
-
Will Berman authored
-
- 07 Feb, 2023 3 commits
-
Patrick von Platen authored
* before running make style
* remove leftovers from flake8
* finish
* make fix-copies
* final fix
* more fixes
-
Patrick von Platen authored
* better accelerated saving
* up
* finish
* finish
* uP
* up
* up
* fix
* Apply suggestions from code review
* correct ema
* Remove @
* up
* Apply suggestions from code review
* Update docs/source/en/training/dreambooth.mdx

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
Patrick von Platen authored
* [Examples] Remove datasets import that is not needed
* remove from lora too
-
- 03 Feb, 2023 1 commit
-
Patrick von Platen authored
* [LoRA] Make sure validation works in multi GPU setup
* more fixes
* up
-
- 31 Jan, 2023 1 commit
-
hysts authored
-
- 27 Jan, 2023 1 commit
-
RahulBhalley authored
-
- 26 Jan, 2023 2 commits
-
hysts authored
Fix
-
Suraj Patil authored
* make scaling factor config arg of vae
* fix
* make flake happy
* fix ldm
* fix upscaler
* quality
* Apply suggestions from code review
* solve conflicts, address some comments
* examples
* examples min version
* doc
* fix type
* typo
* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py
* remove duplicate line
* Apply suggestions from code review

Co-authored-by: Anton Lozhkov <anton@huggingface.co>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
- 25 Jan, 2023 6 commits
-
Suraj Patil authored
check the dtype before preparing model
-
Patrick von Platen authored
* [Bump version] 0.13
* Bump model up
* up
-
Oren WANG authored
-
Patrick von Platen authored
-
apolinario authored
* Add `lora` tag to the model tags, for lora training
* uP

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
patil-suraj authored
-
- 24 Jan, 2023 5 commits
-
Will Berman authored
* [docs] [dreambooth] note random crop: document default random crop behavior
-
Yuta Hayashibe authored
-
Suraj Patil authored
unwrap model on multi gpu
-
Pedro Cuenca authored
-
Pedro Cuenca authored
* [lora] Log images when using tensorboard.
* Specify image format instead of transposing. As discussed with @sayakpaul.
* Style
-
- 23 Jan, 2023 2 commits
-
Gleb Akhmerov authored
* Dreambooth: use `optimizer.zero_grad(set_to_none=True)` to reduce VRAM usage
* Allow the user to control `optimizer.zero_grad(set_to_none=True)` with --set_grads_to_none
* Update Dreambooth readme
* Fix link in readme
* Fix header size in readme
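For context on the first two items: `zero_grad(set_to_none=True)` is a standard PyTorch option that frees gradient buffers instead of zero-filling them, which is where the VRAM saving comes from. A minimal sketch with a placeholder model, not the Dreambooth training setup itself:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# One toy training step to populate the gradients.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# With set_to_none=True the .grad tensors are released entirely rather than
# overwritten with zeros, so their memory can be reclaimed between steps.
optimizer.zero_grad(set_to_none=True)
assert all(p.grad is None for p in model.parameters())
```

A flag such as `--set_grads_to_none` simply forwards the user's choice into this call.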
-
Suraj Patil authored
add --dataloader_num_workers argument
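A flag like this is typically threaded through `argparse` into the `DataLoader`'s `num_workers` parameter. The sketch below shows the plumbing only; the flag name matches the commit, but the parser here is illustrative, not the script's actual one:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--dataloader_num_workers",
    type=int,
    default=0,
    help=(
        "Number of subprocesses to use for data loading. "
        "0 means the data is loaded in the main process."
    ),
)

# The parsed value would be passed straight to DataLoader(num_workers=...).
args = parser.parse_args(["--dataloader_num_workers", "4"])
assert args.dataloader_num_workers == 4
```

Values above 0 let batch preprocessing overlap with training at the cost of extra RAM per worker process.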
-
- 20 Jan, 2023 3 commits
- 19 Jan, 2023 1 commit
-
Patrick von Platen authored
* [Lora] up lora training
* finish
* finish
* finish model card
-
- 18 Jan, 2023 1 commit
-
Patrick von Platen authored
* [Lora] first upload
* add first lora version
* upload
* more
* first training
* up
* correct
* improve
* finish loaders and inference
* up
* up
* fix more
* up
* finish more
* finish more
* up
* up
* change year
* revert year change
* Change lines
* Add cloneofsimo as co-author.
* finish
* fix docs
* Apply suggestions from code review
* upload
* finish

Co-authored-by: Simo Ryu <cloneofsimo@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
- 16 Jan, 2023 3 commits
-
Patrick von Platen authored
-
Pedro Cuenca authored
Fix a couple typos in Dreambooth readme.
-
Sayak Paul authored
* Update README.md

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
- 05 Jan, 2023 1 commit
-
Will Berman authored
* [dreambooth] low precision guard
* fix
* add docs to cli args
* Update examples/dreambooth/train_dreambooth.py
* style

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
- 04 Jan, 2023 3 commits
-
Patrick von Platen authored
-
Yasyf Mohamedali authored
* Various Fixes for Flax Dreambooth
  - Correctly update the progress bar every epoch
  - Allow specifying a pretrained VAE
  - Allow specifying a revision to pretrained models
  - Cache compiled models between invocations (speeds up TPU execution a lot!)
  - Save intermediate checkpoints by specifying `save_steps`
* Don't die when save_steps is not set.
* Address CR
* Address comments
* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
Yasyf Mohamedali authored
* Support training SD V2 with Flax: mostly involves supporting a v_prediction scheduler. The implementation in #1777 doesn't take into account a recent refactor of `scheduling_utils_flax`, so this should be used instead.
* Add to other top-level files.
-
- 02 Jan, 2023 1 commit
-
Pedro Cuenca authored
Fixes to the help for report_to in training scripts.