- 17 Feb, 2023 3 commits
- Will Berman authored
  * add xformers 0.0.16 warning message
  * fix version check to check whole version string
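  A minimal sketch of the kind of check described here: read the installed xformers version as a whole string and warn on the known-problematic 0.0.16 release. The warning text is illustrative, not quoted from the script.

```python
import importlib.metadata
import logging

logger = logging.getLogger(__name__)

try:
    xformers_version = importlib.metadata.version("xformers")
except importlib.metadata.PackageNotFoundError:
    xformers_version = None

# Compare the full version string rather than a prefix such as "0.0.1".
if xformers_version == "0.0.16":
    logger.warning(
        "xformers 0.0.16 is known to cause issues for training on some GPUs; "
        "consider upgrading if you observe problems."
    )
```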
- Will Berman authored
  This reverts commit 024c4376.
- Patrick von Platen authored
- 16 Feb, 2023 2 commits
- Will Berman authored
- Will Berman authored
  * add total number checkpoints to training scripts
  * Update examples/dreambooth/train_dreambooth.py
  Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
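  If the "total number checkpoints" setting above works as a retention limit, pruning old `checkpoint-*` directories might look like the sketch below; the `--checkpoints_total_limit` name and directory layout are assumptions, not confirmed by this log.

```python
import os
import shutil

def prune_checkpoints(output_dir: str, total_limit: int) -> None:
    """Keep only the `total_limit` most recent checkpoint-<step> directories."""
    checkpoints = [d for d in os.listdir(output_dir) if d.startswith("checkpoint-")]
    checkpoints.sort(key=lambda d: int(d.split("-")[1]))
    for folder in checkpoints[: max(len(checkpoints) - total_limit, 0)]:
        shutil.rmtree(os.path.join(output_dir, folder))
```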
- 07 Feb, 2023 2 commits
- Patrick von Platen authored
  * before running make style
  * remove leftovers from flake8
  * finish
  * make fix-copies
  * final fix
  * more fixes
- Patrick von Platen authored
  * [Examples] Remove datasets import that is not needed
  * remove from lora as well
- 03 Feb, 2023 1 commit
- Patrick von Platen authored
  * [LoRA] Make sure validation works in multi GPU setup
  * more fixes
  * up
- 31 Jan, 2023 1 commit
- hysts authored
- 26 Jan, 2023 2 commits
- hysts authored
  Fix
- Suraj Patil authored
  * make scaling factor a config arg of the VAE
  * fix
  * make flake happy
  * fix ldm
  * fix upscaler
  * quality
  * Apply suggestions from code review
  * solve conflicts, address some comments
  * examples
  * examples min version
  * doc
  * fix type
  * typo
  * Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py
  * remove duplicate line
  * Apply suggestions from code review
  Co-authored-by: Anton Lozhkov <anton@huggingface.co>
  Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
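  A minimal sketch of how a scaling factor read from the VAE config is typically applied around encode/decode; the checkpoint id and dummy input are placeholders, and this is illustrative rather than the PR's exact code.

```python
import torch
from diffusers import AutoencoderKL

vae = AutoencoderKL.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="vae")
image = torch.randn(1, 3, 512, 512)  # placeholder batch in [-1, 1]

with torch.no_grad():
    latents = vae.encode(image).latent_dist.sample()
    latents = latents * vae.config.scaling_factor                    # scale into the diffusion latent space
    decoded = vae.decode(latents / vae.config.scaling_factor).sample  # unscale before decoding
```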
- 25 Jan, 2023 5 commits
- Patrick von Platen authored
  * [Bump version] 0.13
  * Bump model up
  * up
- Oren WANG authored
- Patrick von Platen authored
- apolinario authored
  * Add `lora` tag to the model tags for LoRA training
  * up
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- patil-suraj authored
- 24 Jan, 2023 3 commits
- Yuta Hayashibe authored
- Pedro Cuenca authored
- Pedro Cuenca authored
  * [lora] Log images when using tensorboard.
  * Specify image format instead of transposing. As discussed with @sayakpaul.
  * Style
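  A minimal sketch of logging validation images to TensorBoard with an explicit `dataformats` argument instead of transposing the arrays; the tag and dummy images are illustrative.

```python
import numpy as np
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="logs")

# Stand-ins for images produced during validation (HxWxC uint8 arrays).
images = [np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8) for _ in range(4)]
np_images = np.stack(images)

# dataformats="NHWC" tells TensorBoard that channels are last, so no transpose is needed.
writer.add_images("validation", np_images, global_step=0, dataformats="NHWC")
writer.close()
```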
- 23 Jan, 2023 1 commit
- Suraj Patil authored
  add --dataloader_num_workers argument
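  A minimal sketch of what such a flag typically controls: the number of worker processes handed to the PyTorch DataLoader. The dataset and batch size are placeholders.

```python
import argparse
import torch
from torch.utils.data import DataLoader, TensorDataset

parser = argparse.ArgumentParser()
parser.add_argument(
    "--dataloader_num_workers",
    type=int,
    default=0,
    help="Number of subprocesses for data loading; 0 loads data in the main process.",
)
args = parser.parse_args()

dataset = TensorDataset(torch.randn(16, 3, 64, 64))  # placeholder dataset
train_dataloader = DataLoader(
    dataset,
    batch_size=4,
    shuffle=True,
    num_workers=args.dataloader_num_workers,
)
```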
- 20 Jan, 2023 1 commit
- Lucain authored
  * Create repo before cloning in examples
  * code quality
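  A minimal sketch of creating the Hub repository first and then cloning it, the order this change establishes, using the git-based `Repository` workflow the examples relied on at the time; the repo id and local directory are placeholders.

```python
from huggingface_hub import Repository, create_repo

# Create (or reuse) the remote repository, then clone it locally.
repo_url = create_repo("my-user/my-example-model", exist_ok=True)
repo = Repository(local_dir="output_dir", clone_from=repo_url)
```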
- 19 Jan, 2023 1 commit
- Patrick von Platen authored
  * [Lora] up lora training
  * finish
  * finish
  * finish model card
- 18 Jan, 2023 1 commit
- Patrick von Platen authored
  * [Lora] first upload
  * add first lora version
  * upload
  * more
  * first training
  * up
  * correct
  * improve
  * finish loaders and inference
  * up
  * up
  * fix more
  * up
  * finish more
  * finish more
  * up
  * up
  * change year
  * revert year change
  * Change lines
  * Add cloneofsimo as co-author.
  * finish
  * fix docs
  * Apply suggestions from code review
  * upload
  * finish
  Co-authored-by: Simo Ryu <cloneofsimo@gmail.com>
  Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
  Co-authored-by: Suraj Patil <surajp815@gmail.com>
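  A minimal sketch of running inference with LoRA attention weights via the loaders introduced here; the base checkpoint, prompt, and weight path are placeholders.

```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Load LoRA attention processors saved by a LoRA training run.
pipe.unet.load_attn_procs("path/to/lora_weights")

image = pipe("a photo of sks dog in a bucket", num_inference_steps=25).images[0]
image.save("lora_sample.png")
```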
- 16 Jan, 2023 1 commit
- Patrick von Platen authored
- 05 Jan, 2023 1 commit
- Will Berman authored
  * [dreambooth] low precision guard
  * fix
  * add docs to cli args
  * Update examples/dreambooth/train_dreambooth.py
  * style
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
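  A minimal sketch of the kind of low-precision guard described here: refuse to train weights that were loaded in half precision. The helper name and message are illustrative, not copied from the script.

```python
import torch

def assert_trainable_in_fp32(model: torch.nn.Module, name: str) -> None:
    dtype = next(model.parameters()).dtype
    if dtype != torch.float32:
        raise ValueError(
            f"{name} was loaded as {dtype}. Weights that are trained should stay in "
            "torch.float32; use mixed precision for the forward pass instead of "
            "loading the model in half precision."
        )
```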
- 02 Jan, 2023 2 commits
- Pedro Cuenca authored
  Fixes to the help for report_to in training scripts.
- Suraj Patil authored
  * misc fixes
  * more comments
  * Update examples/textual_inversion/textual_inversion.py
  * set transformers verbosity to warning
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
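  A minimal sketch of lowering transformers' log verbosity to warnings, as referenced in the last bullet; where exactly the script calls this is not shown here.

```python
import transformers

# Only emit warnings and errors from transformers during training.
transformers.utils.logging.set_verbosity_warning()
```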
- 30 Dec, 2022 1 commit
- Suraj Patil authored
  update loss computation
- 27 Dec, 2022 2 commits
- kabachuha authored
  * allow selecting precision to make DB class images, addresses #1831
  * add prior_generation_precision argument
  * correct prior_generation_precision's description
  Co-authored-by: Suraj Patil <surajp815@gmail.com>
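  A sketch of mapping a `--prior_generation_precision` choice onto the torch dtype used by the pipeline that generates DreamBooth class images; the argument wiring and checkpoint id are illustrative assumptions.

```python
import argparse
import torch
from diffusers import StableDiffusionPipeline

parser = argparse.ArgumentParser()
parser.add_argument(
    "--prior_generation_precision",
    type=str,
    default=None,
    choices=["fp32", "fp16", "bf16"],
    help="Precision used when generating the class (prior preservation) images.",
)
args = parser.parse_args()

torch_dtype = {
    None: torch.float32,
    "fp32": torch.float32,
    "fp16": torch.float16,
    "bf16": torch.bfloat16,
}[args.prior_generation_precision]

pipeline = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch_dtype
)
```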
- Katsuya authored
  * Make xformers optional even if it is available
  * Raise exception if xformers is used but not available
  * Rename use_xformers to enable_xformers_memory_efficient_attention
  * Add a note about xformers in README
  * Reformat code style
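  A minimal sketch of the opt-in pattern this change introduces: only enable memory-efficient attention when the renamed flag is set, and fail loudly if xformers is missing. The flag plumbing and checkpoint id are illustrative.

```python
from diffusers import UNet2DConditionModel
from diffusers.utils import is_xformers_available

unet = UNet2DConditionModel.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="unet")

enable_xformers_memory_efficient_attention = True  # e.g. parsed from a CLI flag

if enable_xformers_memory_efficient_attention:
    if is_xformers_available():
        unet.enable_xformers_memory_efficient_attention()
    else:
        raise ValueError("xformers is not available. Make sure it is installed correctly.")
```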
- 20 Dec, 2022 1 commit
- Emil Bogomolov authored
  * expose polynomial:power and cosine_with_restarts:num_cycles in the get_scheduler func, add them to train_dreambooth.py
  * fix formatting
  * fix style
  * Update src/diffusers/optimization.py
  Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
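  A minimal sketch of passing the newly exposed `num_cycles` and `power` arguments through `get_scheduler`; the optimizer and step counts are placeholders.

```python
import torch
from diffusers.optimization import get_scheduler

params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = torch.optim.AdamW(params, lr=1e-4)

cosine_scheduler = get_scheduler(
    "cosine_with_restarts",
    optimizer=optimizer,
    num_warmup_steps=500,
    num_training_steps=10_000,
    num_cycles=3,      # number of hard restarts
)

poly_scheduler = get_scheduler(
    "polynomial",
    optimizer=optimizer,
    num_warmup_steps=500,
    num_training_steps=10_000,
    power=2.0,         # exponent of the polynomial decay
)
```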
- 15 Dec, 2022 1 commit
- Pedro Cuenca authored
  * Add state checkpointing to other training scripts
  * Fix first_epoch
  * Apply suggestions from code review
  * Update Dreambooth checkpoint help message.
  * Dreambooth docs: checkpoints, inference from a checkpoint.
  * make style
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
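  A minimal sketch of the save/resume pattern these checkpointing commits rely on, using accelerate's state utilities; the directory names and step counter are placeholders, not the scripts' exact code.

```python
import os
from accelerate import Accelerator

accelerator = Accelerator()
output_dir = "dreambooth-output"

# Periodically save the full training state (model, optimizer, RNG state, ...).
global_step = 500
accelerator.save_state(os.path.join(output_dir, f"checkpoint-{global_step}"))

# On resume, restore the state and continue; first_epoch is recomputed from the step.
accelerator.load_state(os.path.join(output_dir, "checkpoint-500"))
```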
- 13 Dec, 2022 2 commits
- Pedro Cuenca authored
  Use warnings instead of logger in parse_args(); the logger requires an `Accelerator`.
- Pedro Cuenca authored
  * Dreambooth: save / restore training state.
  * make style
  * Rename vars for clarity.
  * Remove unused import
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- 10 Dec, 2022 2 commits
- Tim Hinderliter authored
  dreambooth: fix #1566: maintain fp32 wrapper when saving a checkpoint to avoid crash when running fp16 (#1618)
  * dreambooth: guard against passing the keep_fp32_wrapper arg to older versions of accelerate. part of fix for #1566
  * Apply suggestions from code review
  * Update examples/dreambooth/train_dreambooth.py
  Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
  Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
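  A minimal sketch of the two parts of this fix: keep the fp32 wrapper when unwrapping the model for a checkpoint, and only pass `keep_fp32_wrapper` when the installed accelerate version supports it. The helper name is illustrative.

```python
import inspect
from accelerate import Accelerator

accelerator = Accelerator()  # the real script runs with mixed_precision="fp16"

def unwrap_keeping_fp32_wrapper(model):
    # Older accelerate versions do not accept keep_fp32_wrapper, so guard on the signature.
    accepts_kwarg = "keep_fp32_wrapper" in inspect.signature(accelerator.unwrap_model).parameters
    extra = {"keep_fp32_wrapper": True} if accepts_kwarg else {}
    return accelerator.unwrap_model(model, **extra)
```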
- Pedro Cuenca authored
  Remove spurious arg in training scripts.
- 09 Dec, 2022 1 commit
- Patrick von Platen authored
  * do not automatically enable xformers
  * up
- 07 Dec, 2022 1 commit
- Ben Sherman authored
  Easy fix for an undefined name in train_dreambooth.py: import_model_class_from_model_name_or_path loads a pretrained model and refers to args.revision in a context where args is undefined. I modified the function to take revision as an argument and modified the invocation to pass in the revision from args. This seems to have been caused by a copy and paste.
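  A sketch of the shape of this fix: the helper takes `revision` explicitly instead of reading a global `args`. The body shown is illustrative, not the script's exact implementation.

```python
from transformers import PretrainedConfig

def import_model_class_from_model_name_or_path(pretrained_model_name_or_path: str, revision: str):
    # revision is now a parameter instead of an undefined global `args.revision`.
    config = PretrainedConfig.from_pretrained(
        pretrained_model_name_or_path, subfolder="text_encoder", revision=revision
    )
    if config.architectures[0] == "CLIPTextModel":
        from transformers import CLIPTextModel
        return CLIPTextModel
    raise ValueError(f"Unsupported text encoder: {config.architectures}")

# Call site passes the value from the parsed arguments, e.g.:
# text_encoder_cls = import_model_class_from_model_name_or_path(args.pretrained_model_name_or_path, args.revision)
```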
- 06 Dec, 2022 2 commits
- Suraj Patil authored
  make collate_fn global
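  Making `collate_fn` a module-level function matters once the dataloader uses worker processes: a function defined inside main() may fail to pickle when num_workers > 0 (e.g. with the spawn start method). A tiny sketch with placeholder batching logic:

```python
import torch

def collate_fn(examples):
    # Module-level (global) functions can be pickled and sent to DataLoader workers.
    pixel_values = torch.stack([example["pixel_values"] for example in examples])
    input_ids = torch.cat([example["input_ids"] for example in examples], dim=0)
    return {"pixel_values": pixel_values, "input_ids": input_ids}
```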
- Suraj Patil authored
  * add check_min_version for examples
  * move __version__ to the top
  * Apply suggestions from code review
  * fix comment
  * fix error_message
  * adapt the install message
  Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
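  A minimal sketch of what the examples gained here: a guard at the top of each script that verifies the installed diffusers release; the version string below is a placeholder.

```python
from diffusers.utils import check_min_version

# Raises an informative error (with install instructions) if the installed
# diffusers release is older than the example requires.
check_min_version("0.13.0.dev0")
```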