- 13 Dec, 2022 1 commit
-
-
Suvaditya Mukherjee authored
* Added community pipeline for comparing Stable Diffusion v1.1-v1.4
* Made changes to support the current iteration of from_pretrained and added an example
* Fixed a small spelling error
* Added the pipeline entry to the table
Signed-off-by: Suvaditya Mukherjee <suvadityamuk@gmail.com>
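A minimal usage sketch for a comparison-style community pipeline; the custom_pipeline name, model id, and call signature below are assumptions for illustration, not taken from the commit:

```python
from diffusers import DiffusionPipeline

# Community pipelines are loaded through the `custom_pipeline` argument of
# DiffusionPipeline.from_pretrained; the pipeline name here is assumed.
pipe = DiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    custom_pipeline="stable_diffusion_comparison",
)
pipe.to("cuda")

# The comparison pipeline is meant to run the same prompt against the
# Stable Diffusion v1.1-v1.4 checkpoints for side-by-side inspection.
result = pipe(prompt="an astronaut riding a horse on mars")
images = result.images
```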
-
- 12 Dec, 2022 2 commits
-
-
Patrick von Platen authored
-
Prathik Rao authored
Bug fix.
Co-authored-by: Prathik Rao <prathikrao@microsoft.com>
Co-authored-by: anton- <anton@huggingface.co>
-
- 10 Dec, 2022 2 commits
-
-
Tim Hinderliter authored
dreambooth: fix #1566: maintain fp32 wrapper when saving a checkpoint to avoid a crash when running fp16 (#1618)
* dreambooth: maintain the fp32 wrapper when saving a checkpoint so fp16 runs do not crash
* dreambooth: guard against passing the keep_fp32_wrapper arg to older versions of accelerate (part of the fix for #1566)
* Apply suggestions from code review
* Update examples/dreambooth/train_dreambooth.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
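A sketch of the idea behind the fix, assuming the accelerator/unet names from train_dreambooth.py; the version guard is shown here as a signature check rather than the script's exact code:

```python
import inspect

def unwrap_for_checkpoint(accelerator, model):
    # Keep the fp32 wrapper when unwrapping the model for a mid-training
    # checkpoint so fp16 runs do not crash, but only pass the kwarg when the
    # installed accelerate version actually accepts it.
    accepts_keep_fp32_wrapper = "keep_fp32_wrapper" in inspect.signature(
        accelerator.unwrap_model
    ).parameters
    extra_args = {"keep_fp32_wrapper": True} if accepts_keep_fp32_wrapper else {}
    return accelerator.unwrap_model(model, **extra_args)
```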
-
Pedro Cuenca authored
Remove spurious arg in training scripts.
-
- 09 Dec, 2022 3 commits
-
-
Patrick von Platen authored
* do not automatically enable xformers * uP
-
Haofan Wang authored
* Update requirements.txt
* Update requirements_flax.txt
-
SkyTNT authored
-
- 08 Dec, 2022 1 commit
-
-
Patrick von Platen authored
* uP * uP
-
- 07 Dec, 2022 3 commits
-
-
Ben Sherman authored
Easy fix for an undefined name in train_dreambooth.py: import_model_class_from_model_name_or_path loads a pretrained model and referred to args.revision in a context where args is undefined. The function now takes revision as an argument, and the call site passes in the revision from args. This looks like it was caused by a copy-and-paste.
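Roughly the shape of the fix, abridged for illustration (the real helper handles more model classes): revision becomes an explicit parameter instead of reaching for the undefined args.

```python
from transformers import PretrainedConfig

def import_model_class_from_model_name_or_path(pretrained_model_name_or_path: str, revision: str):
    # `revision` is now passed in explicitly instead of reading args.revision,
    # which was undefined inside this function.
    text_encoder_config = PretrainedConfig.from_pretrained(
        pretrained_model_name_or_path, subfolder="text_encoder", revision=revision
    )
    model_class = text_encoder_config.architectures[0]
    if model_class == "CLIPTextModel":
        from transformers import CLIPTextModel
        return CLIPTextModel
    raise ValueError(f"{model_class} is not supported.")

# Call site: import_model_class_from_model_name_or_path(args.pretrained_model_name_or_path, args.revision)
```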
-
Nathan Lambert authored
* init docs update * style * fix bad colab formatting, add pipeline comment * update todo
-
SkyTNT authored
* fix lpw_stable_diffusion * rollback preprocess_mask resample
-
- 06 Dec, 2022 4 commits
-
-
Suraj Patil authored
make collate_fn global
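Presumably the point is picklability: a module-level collate_fn can be handed to multi-process DataLoader workers, unlike a closure defined inside main(). A simplified sketch with illustrative field names:

```python
import torch
from torch.utils.data import DataLoader

def collate_fn(examples):
    # Defined at module level (global) so it can be pickled by DataLoader
    # worker processes; a closure inside main() cannot be.
    input_ids = [example["instance_prompt_ids"] for example in examples]
    pixel_values = torch.stack([example["instance_images"] for example in examples])
    pixel_values = pixel_values.to(memory_format=torch.contiguous_format).float()
    return {"input_ids": input_ids, "pixel_values": pixel_values}

# train_dataloader = DataLoader(train_dataset, batch_size=4, shuffle=True,
#                               collate_fn=collate_fn, num_workers=2)
```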
-
Suraj Patil authored
* Add check_min_version for examples
* Move __version__ to the top
* Apply suggestions from code review
* Fix comment
* Fix error_message
* Adapt the install message
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
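Usage in the example scripts looks roughly like this (the version string is illustrative):

```python
from diffusers.utils import check_min_version

# Raise an informative error (with an install hint) if the locally installed
# diffusers is older than the examples require.
check_min_version("0.10.0.dev0")
```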
-
Patrick von Platen authored
* Mega community pipeline * fix
-
Will Berman authored
-
- 05 Dec, 2022 5 commits
-
-
Patrick von Platen authored
* Research folder * Update examples/research_projects/README.md * up
-
Adalberto authored
The mask and instance image were being cropped in different ways when --center_crop was not set, causing the model to learn to ignore the mask in some cases. This PR fixes that and generates more consistent results.
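The gist of the fix, as a hypothetical helper (the script's actual code may differ): sample one crop box and apply it to both the instance image and its mask so they stay aligned.

```python
import random
from PIL import Image

def paired_random_crop(image: Image.Image, mask: Image.Image, size: int = 512):
    # Assumes both inputs are at least `size` x `size` and the same resolution.
    left = random.randint(0, image.width - size)
    top = random.randint(0, image.height - size)
    box = (left, top, left + size, top + size)
    # One crop box for both, so the mask cannot drift relative to the image.
    return image.crop(box), mask.crop(box)
```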
-
Suraj Patil authored
Use from_pretrained to load the scheduler.
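That is, the training script now pulls the scheduler configuration from the model repo instead of constructing it with hard-coded parameters; roughly:

```python
from diffusers import DDPMScheduler

# Model id is illustrative; the script uses args.pretrained_model_name_or_path.
noise_scheduler = DDPMScheduler.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="scheduler"
)
```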
-
allo- authored
[textual_inversion] Add an option to only save embeddings. Adds a command line option --only_save_embeds to the example script for skipping the full model save. Only the learned embeddings are then saved; they can be added back to the original model at runtime in much the same way they are created in the training script. Saving the full model is still forced when --push_to_hub is used. (Implements #759)
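A sketch of the runtime side (file name, model id, and variable names are illustrative): load the saved embedding and register its token, mirroring how the training script sets it up.

```python
import torch
from transformers import CLIPTextModel, CLIPTokenizer

# Load the base model's tokenizer and text encoder.
tokenizer = CLIPTokenizer.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="tokenizer")
text_encoder = CLIPTextModel.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="text_encoder")

# learned_embeds.bin maps the placeholder token to its trained embedding.
learned_embeds = torch.load("learned_embeds.bin", map_location="cpu")
placeholder_token, embeds = next(iter(learned_embeds.items()))

# Register the new token and write its embedding into the text encoder.
tokenizer.add_tokens(placeholder_token)
text_encoder.resize_token_embeddings(len(tokenizer))
token_id = tokenizer.convert_tokens_to_ids(placeholder_token)
text_encoder.get_input_embeddings().weight.data[token_id] = embeds
```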
-
Naga Sai Abhinay authored
* Add checkpoint_merger pipeline
* Added missing docs for a parameter
* Formatting fixes
* Fixed code quality issues
* Bug fix: off-by-1 index
* Added docs for the pipeline
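A hedged usage sketch based on the community-pipeline pattern; the merge() argument names shown here are assumptions, not verified against the commit:

```python
from diffusers import DiffusionPipeline

# The merger itself is a community pipeline, loaded via custom_pipeline.
pipe = DiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", custom_pipeline="checkpoint_merger"
)

# Merge two compatible checkpoints; interpolation mode and mixing weight are
# the knobs exposed here.
merged_pipe = pipe.merge(
    ["CompVis/stable-diffusion-v1-4", "runwayml/stable-diffusion-v1-5"],
    interp="sigmoid",
    alpha=0.4,
)
```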
-
- 02 Dec, 2022 7 commits
-
-
Adalberto authored
* Create train_dreambooth_inpaint.py: train_dreambooth.py adapted to work with the inpainting model, generating random masks during training
* Refactor train_dreambooth_inpaint.py with black
* Fix prior preservation
* Add instructions to the readme, fix SD2 compatibility
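A simplified sketch of the random-mask idea (the actual mask generation in the script is more elaborate):

```python
import random
from PIL import Image, ImageDraw

def random_mask(size=(512, 512)):
    # One random rectangle marks the region to be inpainted; assumes the
    # target resolution is at least 64x64.
    mask = Image.new("L", size, 0)
    draw = ImageDraw.Draw(mask)
    w, h = size
    x0, y0 = random.randint(0, w // 2), random.randint(0, h // 2)
    x1, y1 = random.randint(x0 + 32, w), random.randint(y0 + 32, h)
    draw.rectangle([x0, y0, x1, y1], fill=255)
    return mask
```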
-
Patrick von Platen authored
* up * up * finish * finish * up * up * finish
-
Pedro Gabriel Gengo Lourenço authored
Fixed docs for installing from the training packages.
-
Dhruv Naik authored
fix typo, remove incorrect arguments from .train()
-
Benjamin Lefaudeux authored
* Moving the memory-efficient attention activation to the top-level module + recursive
* black; too bad there's no pre-commit?
Co-authored-by: Benjamin Lefaudeux <benjamin@photoroom.com>
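The pattern being described, as a standalone sketch (not the library's exact implementation): expose one toggle at the top and propagate it recursively to every submodule that supports it.

```python
import torch

def enable_memory_efficient_attention(module: torch.nn.Module, valid: bool = True) -> None:
    # Leaf blocks (e.g. attention layers) expose the setter; flip it there,
    # then recurse so nested blocks are reached no matter how deep they sit.
    if hasattr(module, "set_use_memory_efficient_attention_xformers"):
        module.set_use_memory_efficient_attention_xformers(valid)
    for child in module.children():
        enable_memory_efficient_attention(child, valid)
```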
-
Will Berman authored
-
Will Berman authored
-
- 01 Dec, 2022 2 commits
-
-
fboulnois authored
* feat: switch core pipelines to use image arg
* test: update tests for core pipelines
* feat: switch examples to use image arg
* docs: update docs to use image arg
* style: format code using black and doc-builder
* fix: deprecate use of init_image in all pipelines
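In practice the change looks like this for callers (model id and file name are illustrative); passing init_image still works for now but emits a deprecation warning:

```python
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe.to("cuda")
init = Image.open("sketch.png").convert("RGB")

# New argument name: `image` (previously `init_image`).
result = pipe(prompt="a fantasy landscape", image=init, strength=0.75)
image = result.images[0]
```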
-
Anton Lozhkov authored
* Replace deprecated hub utils in `train_unconditional_ort` * typo
-
- 30 Nov, 2022 2 commits
-
-
Anton Lozhkov authored
-
Patrick von Platen authored
* [Dreambooth] Make compatible with alt diffusion * make style * add example
-
- 29 Nov, 2022 2 commits
-
-
Anton Lozhkov authored
-
Alex McKinney authored
* updates img2img_inpainting README * Adds example image to community pipeline README
-
- 28 Nov, 2022 1 commit
-
-
Suraj Patil authored
* add get_velocity
* add v prediction for training
* fix saving
* add revision arg
* save checkpoints dreambooth
* fix saving embeds
* add instruction in readme
* quality
* noise_pred -> model_pred
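The core of the training-side change, shown as a loop excerpt rather than a self-contained script (variable names follow the example scripts): pick the regression target based on the scheduler's prediction_type.

```python
import torch.nn.functional as F

# Inside the training loop: latents, noise, timesteps, unet and
# encoder_hidden_states come from the surrounding script.
noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps)

if noise_scheduler.config.prediction_type == "epsilon":
    target = noise
elif noise_scheduler.config.prediction_type == "v_prediction":
    # get_velocity() builds the v-prediction target from latents and noise.
    target = noise_scheduler.get_velocity(latents, noise, timesteps)
else:
    raise ValueError(f"Unknown prediction type {noise_scheduler.config.prediction_type}")

model_pred = unet(noisy_latents, timesteps, encoder_hidden_states).sample
loss = F.mse_loss(model_pred.float(), target.float(), reduction="mean")
```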
-
- 25 Nov, 2022 3 commits
-
-
Pedro Cuenca authored
* Adapt ddpm, ddpmsolver to prediction_type.
* Deprecate predict_epsilon in __init__.
* Bring FlaxDDIMScheduler up to date with DDIMScheduler.
* Set prediction_type as an ivar for consistency.
* Convert pipeline_ddpm.
* Adapt tests.
* Adapt unconditional training script.
* Adapt BitDiffusion example.
* Add missing kwargs in dpmsolver_multistep.
* Ugly workaround to accept deprecated predict_epsilon when loading schedulers using from_pretrained.
* make style
* Remove import no longer in use.
* Apply suggestions from code review.
* Use config.prediction_type everywhere.
* Add a couple of Flax prediction type tests.
* Fix register deprecated arg.
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
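Roughly what the new configuration surface looks like (a sketch; the deprecation behaviour is summarized in comments):

```python
from diffusers import DDPMScheduler

# New style: the prediction target lives in the scheduler config.
scheduler = DDPMScheduler(prediction_type="v_prediction")
assert scheduler.config.prediction_type == "v_prediction"

# Old style: predict_epsilon=True/False is deprecated; it is still accepted
# when loading existing configs via from_pretrained, but it maps onto
# prediction_type and emits a deprecation warning.
```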
-
Patrick von Platen authored
* up * uP
-
Patrick von Platen authored
-
- 22 Nov, 2022 2 commits
-
-
regisss authored
-
Suraj Patil authored
* use accelerator to check mixed_precision * default `mixed_precision` to `None` * pass mixed_precision to accelerate launch
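The resulting pattern in the scripts looks roughly like this (argument handling is simplified):

```python
import torch
from accelerate import Accelerator

# mixed_precision may be None here; when the script is started with
# `accelerate launch --mixed_precision=...`, the Accelerator reports the
# precision actually in effect.
accelerator = Accelerator(mixed_precision=None)

weight_dtype = torch.float32
if accelerator.mixed_precision == "fp16":
    weight_dtype = torch.float16
elif accelerator.mixed_precision == "bf16":
    weight_dtype = torch.bfloat16
```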
-