- 03 Jan, 2023 1 commit
-
-
Anton Lozhkov authored
* Add UnCLIPImageVariationPipeline to dummy imports * style
-
- 02 Jan, 2023 5 commits
-
-
YiYi Xu authored
* add a doc page for each pipeline under api/pipelines/stable_diffusion
* add pipeline examples to docstrings
* updated stable_diffusion_2 page
* updated default markdown syntax to list methods based on https://github.com/huggingface/diffusers/pull/1870
* add function decorator
Co-authored-by: yiyixuxu <yixu@Yis-MacBook-Pro.lan>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
Pedro Cuenca authored
Fixes to the help for report_to in training scripts.
-
Suraj Patil authored
* misc fixes
* more comments
* Update examples/textual_inversion/textual_inversion.py
* set transformers verbosity to warning
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
agizmo authored
NumPy 1.24 removed the "float" scalar alias, which had been deprecated since v1.20. https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations https://numpy.org/devdocs/release/1.24.0-notes.html#expired-deprecations
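A minimal sketch of the kind of change this implies, assuming the affected code used the removed `np.float` alias: replace it with the builtin `float` or an explicit NumPy dtype.

```python
import numpy as np

arr = np.arange(5)

# Before (fails on NumPy >= 1.24): the np.float alias was removed
# mask = arr.astype(np.float)

# After: use the builtin float or an explicit dtype such as np.float64
mask = arr.astype(float)
mask64 = arr.astype(np.float64)
```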
-
Pedro Cuenca authored
Fix typo in train_dreambooth_inpaint.
-
- 01 Jan, 2023 1 commit
-
-
Patrick von Platen authored
* [Attention] Finish refactor attention file * correct more * fix * more fixes * correct * up
-
- 30 Dec, 2022 6 commits
-
-
Suraj Patil authored
* allow using non-ema weights for training
* Apply suggestions from code review
* address more review comments
* reorganise a few lines
* always pad text to max_length to match original training
* fix collate_fn
* remove unused code
* don't prepare ema_unet, don't register lr scheduler
* style
* assert => ValueError
* add allow_tf32 (see the sketch after this list)
* set log level
* fix comment
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
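The allow_tf32 option maps to PyTorch's TF32 switches. A minimal sketch of how a training script might expose it; the `--allow_tf32` flag name mirrors the commit message but is an assumption, not the exact code:

```python
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument(
    "--allow_tf32",
    action="store_true",
    help="Allow TF32 on Ampere GPUs to speed up training at slightly reduced precision.",
)
args = parser.parse_args()

if args.allow_tf32:
    # Enable TF32 for matmuls and cuDNN convolutions (Ampere or newer GPUs)
    torch.backends.cuda.matmul.allow_tf32 = True
    torch.backends.cudnn.allow_tf32 = True
```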
-
Suraj Patil authored
update loss computation
-
Patrick von Platen authored
* [Docs] Improve docs * up
-
Pedro Cuenca authored
* Fix ema decay and clarify nomenclature. * Rename var.
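EMA here refers to keeping an exponential moving average of the model weights during training. A minimal sketch of the standard update rule, not the exact code from the commit:

```python
import copy

import torch
import torch.nn as nn

@torch.no_grad()
def ema_update(ema_model: nn.Module, model: nn.Module, decay: float = 0.9999) -> None:
    """One EMA step: ema = decay * ema + (1 - decay) * current."""
    for ema_p, p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(decay).add_(p, alpha=1.0 - decay)

model = nn.Linear(4, 4)
ema_model = copy.deepcopy(model)  # shadow copy that tracks the moving average
ema_update(ema_model, model, decay=0.9999)
```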
-
Patrick von Platen authored
[Unclip] Make sure text_embeddings & image_embeddings can directly be passed to enable interpolation tasks. (#1858)
* [Unclip] Make sure latents can be reused
* allow one to directly pass embeddings (see the interpolation sketch below)
* up
* make unclip for text work
* finish allowing to pass embeddings
* correct more
* make style
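Passing embeddings directly is what makes interpolation possible: two embeddings can be blended before being handed to the pipeline. A pipeline-agnostic sketch of spherical interpolation between two embedding tensors; the `slerp` helper is illustrative, not code from this commit:

```python
import torch

def slerp(v0: torch.Tensor, v1: torch.Tensor, t: float, eps: float = 1e-7) -> torch.Tensor:
    """Spherical interpolation between two embedding tensors along the last dimension."""
    v0_n = v0 / v0.norm(dim=-1, keepdim=True)
    v1_n = v1 / v1.norm(dim=-1, keepdim=True)
    omega = torch.acos((v0_n * v1_n).sum(-1, keepdim=True).clamp(-1 + eps, 1 - eps))
    so = torch.sin(omega)
    return (torch.sin((1.0 - t) * omega) / so) * v0 + (torch.sin(t * omega) / so) * v1

# e.g. blend two image embeddings halfway before passing the result to the pipeline
# mixed = slerp(image_emb_a, image_emb_b, 0.5)
```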
-
Patrick von Platen authored
* move files a bit
* more refactors
* fix more
* more fixes
* fix more onnx
* make style
* upload
* fix
* up
* fix more
* up again
* up
* small fix
* Update src/diffusers/__init__.py
* correct
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
- 29 Dec, 2022 4 commits
-
-
Simon Kirsten authored
* Flax: Add components function
* Flax: Fix img2img and align with other pipeline
* Flax: Fix PRNGKey type
* Refactor strength to start_timestep
* Fix preprocess images
* Fix processed_images dimen
* latents.shape -> latents_shape
* Fix typo
* Remove "static" comment
* Remove unnecessary optional types in _generate
* Apply doc-builder code style.
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
Suraj Patil authored
* update TI script
* make flake happy
* fix typo
Co-authored-by: Henrik Forstén <henrik.forsten@gmail.com>
-
Patrick von Platen authored
-
Patrick von Platen authored
* [Dtype] Align automatic dtype * up * up * fix * re-add accelerate
-
- 28 Dec, 2022 3 commits
-
-
Patrick von Platen authored
fix versatile
-
Partho authored
* initial
* type hints
* update scheduler type hint
* add to README
* add example generation to README
* v -> mix_factor
* load scheduler from pretrained
-
Will Berman authored
* unCLIP image variation * remove prior comment re: @pcuenca * stable diffusion -> unCLIP re: @pcuenca * add copy froms re: @patil-suraj
-
- 27 Dec, 2022 6 commits
-
-
kabachuha authored
* allow selecting the precision used to generate DreamBooth class images, addresses #1831
* add prior_generation_precision argument (see the sketch after this list)
* correct prior_generation_precision's description
Co-authored-by: Suraj Patil <surajp815@gmail.com>
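A minimal sketch of how such a flag typically maps onto a torch dtype for class-image generation; the flag and choice names follow the commit message, but the mapping shown is an assumption rather than the exact code:

```python
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument(
    "--prior_generation_precision",
    type=str,
    default=None,
    choices=["no", "fp32", "fp16", "bf16"],
    help="Precision used when generating the DreamBooth class (prior preservation) images.",
)
args = parser.parse_args()

# Map the flag to a torch dtype; fall back to fp16 on CUDA, fp32 otherwise.
dtype_map = {"fp32": torch.float32, "fp16": torch.float16, "bf16": torch.bfloat16}
torch_dtype = dtype_map.get(
    args.prior_generation_precision,
    torch.float16 if torch.cuda.is_available() else torch.float32,
)
```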
-
Katsuya authored
* Make xformers optional even if it is available
* Raise exception if xformers is used but not available
* Rename use_xformers to enable_xformers_memory_efficient_attention (usage sketch below)
* Add a note about xformers in README
* Reformat code style
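After the rename, memory-efficient attention is an explicit opt-in call on the pipeline; a short usage sketch, where the model id is just a common example:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Opt in to xformers memory-efficient attention; raises if xformers is not installed.
pipe.enable_xformers_memory_efficient_attention()
```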
-
Christopher Friesen authored
-
William Held authored
* Width was typoed as weight * Run Black
-
Pedro Cuenca authored
-
camenduru authored
* Device to use (e.g. cpu, cuda:0, cuda:1, etc.) * "cuda" if torch.cuda.is_available() else "cpu"
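A minimal sketch of the kind of device flag this describes; the `--device` argument name is an assumption based on the message:

```python
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument(
    "--device",
    type=str,
    default="cuda" if torch.cuda.is_available() else "cpu",
    help="Device to use (e.g. cpu, cuda:0, cuda:1, etc.)",
)
args = parser.parse_args()

device = torch.device(args.device)
```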
-
- 25 Dec, 2022 1 commit
-
-
Pedro Cuenca authored
* Make safety_checker optional in more pipelines (usage sketch below).
* Remove inappropriate comment in inpaint pipeline.
* InPaint Test: set feature_extractor to None.
* Remove import
* img2img test: set feature_extractor to None.
* inpaint sd2 test: set feature_extractor to None.
Co-authored-by: Suraj Patil <surajp815@gmail.com>
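With the checker optional, a pipeline can be loaded without it; a short usage sketch, with an illustrative model id:

```python
from diffusers import StableDiffusionImg2ImgPipeline

# Load the pipeline without the safety checker and its feature extractor.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    safety_checker=None,
    feature_extractor=None,
)
```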
-
- 24 Dec, 2022 1 commit
-
-
Daquan Lin authored
Fix inconsistencies between code and comments in the function 'preprocess'
-
- 23 Dec, 2022 2 commits
-
-
Suraj Patil authored
* unwrap_model text encoder before accessing weights * fix another call * fix the right call
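When training with Accelerate, prepared modules need to be unwrapped before their weights are read or saved; a minimal sketch of the pattern the commit refers to, with a stand-in module instead of the real text encoder:

```python
import torch.nn as nn
from accelerate import Accelerator

accelerator = Accelerator()
text_encoder = nn.Linear(8, 8)  # stand-in for the real text encoder
text_encoder = accelerator.prepare(text_encoder)

# Unwrap before accessing weights, otherwise you operate on the wrapper module.
unwrapped = accelerator.unwrap_model(text_encoder)
state_dict = unwrapped.state_dict()
```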
-
Patrick von Platen authored
* Remove hardcoded names from PT scripts
* Apply suggestions from code review
Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
- 22 Dec, 2022 1 commit
-
-
Prathik Rao authored
* reorder model wrap
* bug fix
Co-authored-by: Prathik Rao <prathikrao@microsoft.com>
-
- 21 Dec, 2022 1 commit
-
-
Pedro Cuenca authored
Don't initialize JAX on startup.
-
- 20 Dec, 2022 8 commits
-
-
Patrick von Platen authored
* first proposal
* rename
* up
* Apply suggestions from code review
* better
* up
* finish
* up
* rename
* correct versatile
* up
* up
* up
* up
* fix
* Apply suggestions from code review
* make style
* Apply suggestions from code review
* add error message
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
Dhruv Naik authored
* add flax img2img pipeline
* update pipeline
* black format file
* remove arg from get_timesteps
* update get_timesteps
* fix bug: make use of timesteps for for_loop
* black file
* black, isort, flake8
* update docstring
* update readme
* update flax img2img readme
* update sd pipeline init
* Update src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py
* update inits
* revert change
* update var name to image, typo
* update readme
* return new t_start instead of modified timestep
* black format
* isort files
* update docs
* fix-copies
* update prng_seed typing
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
Suraj Patil authored
* use repeat_interleave (see the illustration below)
* fix repeat
* Trigger Build
* don't install accelerate from main
* install released accelerate for mps test
* Remove additional accelerate installation from main.
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
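The repeat_interleave change concerns how prompt embeddings are duplicated per image: `repeat_interleave` keeps each prompt's copies adjacent, whereas a plain `repeat` tiles the whole batch. A small illustration with made-up shapes:

```python
import torch

num_images_per_prompt = 2
prompt_embeds = torch.arange(6.0).reshape(2, 1, 3)  # (batch=2, seq=1, dim=3) toy embeddings

# repeat_interleave: [p0, p0, p1, p1] - each prompt's copies stay together
interleaved = prompt_embeds.repeat_interleave(num_images_per_prompt, dim=0)

# repeat: [p0, p1, p0, p1] - tiles the whole batch, a different ordering
tiled = prompt_embeds.repeat(num_images_per_prompt, 1, 1)

print(interleaved[:, 0, 0])  # tensor([0., 0., 3., 3.])
print(tiled[:, 0, 0])        # tensor([0., 3., 0., 3.])
```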
-
Pedro Cuenca authored
* Section header for in-painting, inference from checkpoint.
* Inference: link to section to perform inference from checkpoint.
* Move Dreambooth in-painting instructions to the proper place.
-
Patrick von Platen authored
* allow model download when no internet * up * make style
-
Simon Kirsten authored
* [Flax] Stateless schedulers, fixes and refactors
* Remove scheduling_common_flax and some renames
* Update src/diffusers/schedulers/scheduling_pndm_flax.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
Emil Bogomolov authored
* expose polynomial:power and cosine_with_restarts:num_cycles using get_scheduler func, add it to train_dreambooth.py (see the sketch below)
* fix formatting
* fix style
* Update src/diffusers/optimization.py
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
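After this change, `get_scheduler` forwards the extra knobs to the underlying schedules. A small sketch of calling it with them; the hyperparameter values are arbitrary, and the `num_cycles`/`power` keyword names follow the commit description:

```python
import torch
from diffusers.optimization import get_scheduler

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# cosine_with_restarts now accepts num_cycles through get_scheduler
lr_scheduler = get_scheduler(
    "cosine_with_restarts",
    optimizer=optimizer,
    num_warmup_steps=100,
    num_training_steps=1000,
    num_cycles=3,
)

# polynomial decay now accepts power the same way
poly_scheduler = get_scheduler(
    "polynomial",
    optimizer=optimizer,
    num_warmup_steps=100,
    num_training_steps=1000,
    power=2.0,
)
```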
-
Patrick von Platen authored
-