- 18 Jan, 2023 1 commit
Patrick von Platen authored
* [Lora] first upload; add first lora version
* first training; correct and improve
* finish loaders and inference
* change year; revert year change; change lines
* Add cloneofsimo as co-author
* fix docs; apply suggestions from code review
Co-authored-by: Simo Ryu <cloneofsimo@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
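A minimal sketch of the inference side this commit's "loaders and inference" work enables, assuming the `load_attn_procs` loader entry point; the checkpoint path is hypothetical:
```python
import torch
from diffusers import StableDiffusionPipeline

# Load the base model, then attach trained LoRA attention weights on top of it.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe.unet.load_attn_procs("path/to/lora-checkpoint")  # hypothetical local path
pipe.to("cuda")
image = pipe("a photo of sks dog in a bucket").images[0]
```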
- 16 Jan, 2023 1 commit
Patrick von Platen authored
- 05 Jan, 2023 1 commit
Will Berman authored
* [dreambooth] low precision guard
* fix
* add docs to CLI args
* Update examples/dreambooth/train_dreambooth.py
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
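A sketch of the kind of guard this commit describes, assuming it runs right after model loading; the helper name is illustrative:
```python
import torch

def assert_trainable_fp32(model: torch.nn.Module, name: str) -> None:
    # Mixed-precision training keeps master weights in fp32; a model loaded
    # directly in fp16/bf16 trains unstably, so fail fast with a clear message.
    dtype = next(model.parameters()).dtype
    if dtype != torch.float32:
        raise ValueError(
            f"{name} was loaded as {dtype}. Trainable models must be fp32; "
            "use --mixed_precision for the forward/backward casts instead."
        )
```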
- 02 Jan, 2023 2 commits
Pedro Cuenca authored
Fixes to the help for report_to in training scripts.
Suraj Patil authored
* misc fixes
* more comments
* Update examples/textual_inversion/textual_inversion.py
* set transformers verbosity to warning
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
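The last bullet maps to a one-liner; a sketch using the standard transformers logging utilities:
```python
import transformers

# Keep only warnings and errors from transformers in the training logs.
transformers.utils.logging.set_verbosity_warning()
```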
- 30 Dec, 2022 1 commit
Suraj Patil authored
update loss computation
- 27 Dec, 2022 2 commits
kabachuha authored
* allow selecting the precision used to generate DreamBooth class images (addresses #1831)
* add prior_generation_precision argument
* correct prior_generation_precision's description
Co-authored-by: Suraj Patil <surajp815@gmail.com>
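A hedged sketch of how such an argument might map to a torch dtype for the class-image generation pipeline; the fallback to the mixed-precision mode is an assumption:
```python
import torch

def prior_generation_dtype(prior_generation_precision: str | None,
                           mixed_precision: str | None) -> torch.dtype:
    # An explicit choice wins; otherwise follow the accelerator's mixed-precision mode.
    mapping = {"fp32": torch.float32, "fp16": torch.float16, "bf16": torch.bfloat16}
    if prior_generation_precision in mapping:
        return mapping[prior_generation_precision]
    return mapping.get(mixed_precision, torch.float32)
```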
Katsuya authored
* Make xformers optional even if it is available
* Raise an exception if xformers is used but not available
* Rename use_xformers to enable_xformers_memory_efficient_attention
* Add a note about xformers in the README
* Reformat code style
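After this change xformers is opt-in; a minimal usage sketch:
```python
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
# Must be requested explicitly; raises if xformers is not actually installed.
pipe.enable_xformers_memory_efficient_attention()
```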
- 20 Dec, 2022 1 commit
Emil Bogomolov authored
* expose polynomial:power and cosine_with_restarts:num_cycles through the get_scheduler func, and add them to train_dreambooth.py
* fix formatting and style
* Update src/diffusers/optimization.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
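A sketch of the newly exposed knobs being passed through get_scheduler; the num_cycles/power keyword names are taken from the commit message:
```python
import torch
from diffusers.optimization import get_scheduler

model = torch.nn.Linear(8, 8)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

lr_scheduler = get_scheduler(
    "cosine_with_restarts",
    optimizer=optimizer,
    num_warmup_steps=500,
    num_training_steps=10_000,
    num_cycles=3,  # exposed by this commit; "polynomial" gains power= analogously
)
```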
- 15 Dec, 2022 1 commit
Pedro Cuenca authored
* Add state checkpointing to other training scripts
* Fix first_epoch
* Apply suggestions from code review
* Update the Dreambooth checkpoint help message
* Dreambooth docs: checkpoints, inference from a checkpoint
* make style
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
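A minimal sketch of Accelerate-based state checkpointing as these scripts use it; the "checkpoint-<step>" folder naming is an assumption from the docs wording:
```python
import os
from accelerate import Accelerator

accelerator = Accelerator()
output_dir, checkpointing_steps = "dreambooth-model", 500

for global_step in range(1, 2_001):
    # ... one training step ...
    if global_step % checkpointing_steps == 0:
        # One folder per checkpoint, holding model, optimizer, scheduler, and RNG state.
        accelerator.save_state(os.path.join(output_dir, f"checkpoint-{global_step}"))

# Resuming later restores everything save_state captured:
# accelerator.load_state(os.path.join(output_dir, "checkpoint-500"))
```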
- 13 Dec, 2022 2 commits
Pedro Cuenca authored
Use warnings instead of logger in parse_args(), since the logger requires an `Accelerator`.
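A sketch of the pattern, with illustrative flag names; the point is that `warnings.warn` works before any `Accelerator` exists:
```python
import warnings

def validate_args(args) -> None:
    # parse_args() runs before the Accelerator is created, so the
    # accelerate-aware logger is unavailable here; stdlib warnings are not.
    if args.class_data_dir is not None and not args.with_prior_preservation:
        warnings.warn("You need not use --class_data_dir without --with_prior_preservation.")
```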
Pedro Cuenca authored
* Dreambooth: save / restore training state
* make style
* Rename vars for clarity
* Remove unused import
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- 10 Dec, 2022 2 commits
Tim Hinderliter authored
dreambooth: fix #1566: maintain the fp32 wrapper when saving a checkpoint to avoid a crash when running fp16 (#1618)
* guard against passing the keep_fp32_wrapper arg to older versions of accelerate (part of the fix for #1566)
* Apply suggestions from code review
* Update examples/dreambooth/train_dreambooth.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
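A sketch of the version guard this commit describes; the helper name is illustrative:
```python
import inspect

def unwrap(accelerator, model):
    # Older accelerate releases don't accept keep_fp32_wrapper, so check the
    # signature before passing it; keeping the wrapper avoids the fp16 crash
    # when saving a mid-training checkpoint.
    params = inspect.signature(accelerator.unwrap_model).parameters
    extra = {"keep_fp32_wrapper": True} if "keep_fp32_wrapper" in params else {}
    return accelerator.unwrap_model(model, **extra)
```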
Pedro Cuenca authored
Remove spurious arg in training scripts.
- 09 Dec, 2022 1 commit
Patrick von Platen authored
* do not automatically enable xformers
- 07 Dec, 2022 1 commit
Ben Sherman authored
Easy fix for an undefined name in train_dreambooth.py: import_model_class_from_model_name_or_path loads a pretrained model and referred to args.revision in a context where args was undefined. The function now takes revision as an argument, and its call site passes in the revision from args. This seems to have been caused by a copy and paste.
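A sketch of the fixed helper, hedged from the commit description; the body is a plausible reconstruction, not the verbatim patch:
```python
from transformers import PretrainedConfig

def import_model_class_from_model_name_or_path(pretrained_model_name_or_path: str, revision: str):
    # revision is now an explicit parameter instead of a read from a global `args`.
    config = PretrainedConfig.from_pretrained(
        pretrained_model_name_or_path, subfolder="text_encoder", revision=revision
    )
    model_class = config.architectures[0]
    if model_class == "CLIPTextModel":
        from transformers import CLIPTextModel
        return CLIPTextModel
    raise ValueError(f"{model_class} is not supported.")
```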
- 06 Dec, 2022 2 commits
Suraj Patil authored
make collate_fn global
Suraj Patil authored
* add check_min_version for examples
* move __version__ to the top
* Apply suggestions from code review
* fix comment
* fix error_message
* adapt the install message
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
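A sketch of how an example script uses the new guard; the version string is illustrative:
```python
from diffusers.utils import check_min_version

# Fails early with an install hint if the local diffusers is older than the example expects.
check_min_version("0.10.0.dev0")
```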
- 05 Dec, 2022 1 commit
Suraj Patil authored
use from_pretrained to load the scheduler
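A sketch of the pattern, assuming the scheduler config lives in the model repo's scheduler subfolder:
```python
from diffusers import DDPMScheduler

# Pull the scheduler config from the model repo rather than hard-coding defaults.
noise_scheduler = DDPMScheduler.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="scheduler"
)
```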
- 02 Dec, 2022 1 commit
Will Berman authored
- 30 Nov, 2022 1 commit
Patrick von Platen authored
* [Dreambooth] Make compatible with alt diffusion
* make style
* add example
- 28 Nov, 2022 1 commit
Suraj Patil authored
* add get_velocity
* add v prediction for training
* fix saving
* add revision arg
* save checkpoints dreambooth
* fix saving embeds
* add instruction in readme
* quality
* noise_pred -> model_pred
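A sketch of the v-prediction training target this commit adds, wrapped as a function so it is self-contained; variable names mirror the training-loop convention:
```python
import torch.nn.functional as F

def diffusion_loss(noise_scheduler, model_pred, latents, noise, timesteps):
    # Pick the regression target from the scheduler's prediction_type;
    # get_velocity() is the helper this commit introduces for v-prediction.
    if noise_scheduler.config.prediction_type == "epsilon":
        target = noise
    elif noise_scheduler.config.prediction_type == "v_prediction":
        target = noise_scheduler.get_velocity(latents, noise, timesteps)
    else:
        raise ValueError(f"Unknown prediction type {noise_scheduler.config.prediction_type}")
    return F.mse_loss(model_pred.float(), target.float(), reduction="mean")
```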
- 22 Nov, 2022 1 commit
Suraj Patil authored
* use accelerator to check mixed_precision
* default `mixed_precision` to `None`
* pass mixed_precision to accelerate launch
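A sketch of the check moving onto the Accelerator; the dtype mapping is an assumption:
```python
import torch
from accelerate import Accelerator

# mixed_precision=None defers to the `accelerate config` default; the effective
# mode is then read back from the Accelerator rather than trusted from the CLI flag.
accelerator = Accelerator(mixed_precision=None)
weight_dtype = {
    "fp16": torch.float16,
    "bf16": torch.bfloat16,
}.get(accelerator.mixed_precision, torch.float32)
```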
- 18 Nov, 2022 1 commit
Patrick von Platen authored
* [Examples] Correct path
- 08 Nov, 2022 1 commit
Yuta Hayashibe authored
* Raise errors for invalid options used without --with_prior_preservation
* Make --instance_prompt required
* Remove a needless check, since --instance_data_dir is already marked as required
* Update messages
* Use logger.warning instead of raising errors
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- 02 Nov, 2022 1 commit
Yuta Hayashibe authored
- 31 Oct, 2022 1 commit
Patrick von Platen authored
* [Better scheduler docs] Improve usage examples of schedulers
* finish
* fix warnings and add test
* more replacements
* adapt fast tests hf token
* correct more
* Apply suggestions from code review
* Integrate compatibility with Euler
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
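In the spirit of the improved scheduler docs, a sketch of swapping in the Euler scheduler via the compatible-config pattern:
```python
from diffusers import EulerDiscreteScheduler, StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
# Any compatible scheduler can be rebuilt from the current scheduler's config.
pipe.scheduler = EulerDiscreteScheduler.from_config(pipe.scheduler.config)
```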
- 27 Oct, 2022 3 commits
Suraj Patil authored
Duong A. Nguyen authored
Set train mode for text encoder
Suraj Patil authored
make input_args optional
- 26 Oct, 2022 2 commits
Brian Whicheloe authored
* Make training code usable by external scripts: add parameter inputs to the training and argument-parsing functions so the script can be called externally
* Apply suggestions from code review
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
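A hedged sketch of the external call this enables, assuming a parse_args(input_args) signature (per the "make input_args optional" commit above) and a main(args) entry point:
```python
from train_dreambooth import main, parse_args

# Drive the training script from another Python program instead of the CLI.
args = parse_args([
    "--pretrained_model_name_or_path", "runwayml/stable-diffusion-v1-5",
    "--instance_data_dir", "./instance-images",
    "--instance_prompt", "a photo of sks dog",
    "--output_dir", "./dreambooth-out",
])
main(args)
```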
Simon Kirsten authored
- 25 Oct, 2022 1 commit
Yuta Hayashibe authored
* Add --pretrained_model_name_revision option to train_dreambooth.py
* Rename --pretrained_model_name_revision to --revision
- 20 Oct, 2022 2 commits
Hanusz Leszek authored
* Add an underscore to the filename if it already exists
* Use a sha1sum hash instead of adding underscores
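A sketch of the content-hash naming the final commit settles on, assuming PIL images and a class-images directory; the helper name is illustrative:
```python
import hashlib
from pathlib import Path

def class_image_path(image, class_images_dir: Path, index: int) -> Path:
    # Content-addressed names: identical images collide on purpose, while
    # distinct images never need the old underscore-suffix workaround.
    digest = hashlib.sha1(image.tobytes()).hexdigest()
    return class_images_dir / f"{index}-{digest}.jpg"
```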
Suraj Patil authored
don't use the safety checker when generating prior images
- 18 Oct, 2022 1 commit
Suraj Patil authored
* allow fine-tuning the text encoder
* fix a few things
* update readme
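A sketch of the optimizer change this implies, assuming a --train_text_encoder style switch:
```python
import itertools
import torch

def build_optimizer(unet, text_encoder, train_text_encoder: bool, lr: float):
    # With text-encoder fine-tuning enabled, both models' parameters are
    # optimized; otherwise only the UNet is updated.
    params = (
        itertools.chain(unet.parameters(), text_encoder.parameters())
        if train_text_encoder
        else unet.parameters()
    )
    return torch.optim.AdamW(params, lr=lr)
```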
- 13 Oct, 2022 1 commit
Anton Lozhkov authored
Fix dreambooth loss type with prior preservation
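For context, a hedged sketch of the prior-preservation loss shape the fix concerns; variable names are illustrative:
```python
import torch
import torch.nn.functional as F

def prior_preservation_loss(model_pred, target, prior_loss_weight: float):
    # Batches are built as [instance examples, class examples]; chunk them
    # apart and weight the class (prior) term separately.
    model_pred, model_pred_prior = torch.chunk(model_pred, 2, dim=0)
    target, target_prior = torch.chunk(target, 2, dim=0)
    loss = F.mse_loss(model_pred.float(), target.float(), reduction="mean")
    prior_loss = F.mse_loss(model_pred_prior.float(), target_prior.float(), reduction="mean")
    return loss + prior_loss_weight * prior_loss
```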
- 11 Oct, 2022 1 commit
spezialspezial authored
- 10 Oct, 2022 1 commit
Henrik Forstén authored
* Support DeepSpeed
* DreamBooth DeepSpeed documentation
* Remove unnecessary casts, update documentation: due to recent commits, some casts to half precision are no longer necessary. Mention that DeepSpeed's version of Adam is about 2x faster.
* Review comments
- 07 Oct, 2022 1 commit
YaYaB authored
* Fix push_to_hub for dreambooth and textual_inversion
* Use repo.push_to_hub instead of push_to_hub
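A sketch of the fixed call path, assuming the hub repo is cloned locally during training; the repo id is a placeholder:
```python
from huggingface_hub import Repository

# Push through the cloned Repository object rather than a bare push_to_hub call.
repo = Repository(local_dir="dreambooth-model", clone_from="your-user/dreambooth-model")
repo.push_to_hub(commit_message="End of training", blocking=False)
```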