- 31 Jan, 2023 6 commits
-
Patrick von Platen authored
-
1lint authored
* fix legacy inpaint noise and resize mask tensor
* updated legacy inpaint pipe test expected_slice
-
Sayak Paul authored
* Update README.md * Update README.md
-
Dudu Moshe authored
scheduling_ddpm: fix evaluation with a lower timestep count than in training. Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
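The fix above concerns how a DDPM scheduler picks evaluation timesteps when the inference step count is smaller than the training horizon. A minimal sketch of the spacing logic in plain Python (illustrative only, not the actual diffusers implementation):

```python
def inference_timesteps(num_train_timesteps: int, num_inference_steps: int) -> list[int]:
    """Pick a descending subset of training timesteps for evaluation.

    The subsample must stay within [0, num_train_timesteps); stepping outside
    that range would index the scheduler's alphas_cumprod out of bounds,
    which is the class of bug the commit above addresses.
    """
    if num_inference_steps > num_train_timesteps:
        raise ValueError("cannot evaluate with more steps than were trained")
    step_ratio = num_train_timesteps // num_inference_steps
    # e.g. 1000 train steps, 10 inference steps -> [900, 800, ..., 0]
    return [i * step_ratio for i in range(num_inference_steps)][::-1]
```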
-
Patrick von Platen authored
-
hysts authored
-
- 30 Jan, 2023 4 commits
-
Pedro Cuenca authored
* Don't copy when unwrapping model. Otherwise an exception is raised when using fp16.
* Remove unused import
-
Pedro Cuenca authored
* Section on using LoRA alpha / scale.
* Accept suggestion.
* Clarify on merge.
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
-
Patrick von Platen authored
* finish more
* finish philosophy
* Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
Co-authored-by: Will Berman <wlbberman@gmail.com>
-
Pedro Cuenca authored
Fix typo in accelerate and transformers versions.
-
- 29 Jan, 2023 1 commit
-
- 27 Jan, 2023 9 commits
-
Pedro Cuenca authored
-
Nicolas Patry authored
* Tmp.
* Adding more docs.
* Doc style.
* Remove the argument `use_safetensors=True`.
* doc-builder
-
Will Berman authored
-
Patrick von Platen authored
-
Patrick von Platen authored
-
Patrick von Platen authored
Don't call the Hub if
-
RahulBhalley authored
-
Ji soo Kim authored
Fix typo in loaders.py
-
Patrick von Platen authored
* [LoRA] Allow use in inference with pipeline
* [LoRA] allow cross attention kwargs passed to pipeline
* finish
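The `cross_attention_kwargs` change lets a LoRA scale be supplied at inference time. Conceptually, the scale blends the low-rank update into the frozen weight's output; a toy one-dimensional sketch (the real implementation operates on attention projection matrices, and the scale is what a diffusers pipeline receives via `cross_attention_kwargs={"scale": ...}`):

```python
def lora_forward(x: float, w0: float, lora_down: float, lora_up: float,
                 scale: float = 1.0) -> float:
    """Frozen weight output plus a scaled low-rank update.

    scale=0.0 disables the LoRA entirely; scale=1.0 applies it fully.
    """
    base = w0 * x                       # frozen pretrained projection
    lora = lora_up * (lora_down * x)    # low-rank update, rank reduced to 1 here
    return base + scale * lora
```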
-
- 26 Jan, 2023 10 commits
-
Will Berman authored
-
Patrick von Platen authored
-
Will Berman authored
-
hysts authored
Fix
-
Will Berman authored
* fuse attention mask
* lint
* use 0 beta when no attention mask re: @Birch-san
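Fusing the attention mask into the score computation typically uses a fused multiply-add of the form `out = beta * mask + alpha * (Q @ K^T)`; choosing `beta = 0` when there is no mask guarantees the (possibly uninitialized) mask buffer contributes nothing. A scalar sketch of that contract (names are illustrative, not the actual kernel):

```python
from typing import Optional

def fused_scores(qk: float, mask: Optional[float], alpha: float) -> float:
    """out = beta * mask + alpha * qk, with beta chosen from mask presence."""
    beta = 1.0 if mask is not None else 0.0
    # With beta = 0 the mask term is dropped entirely, so garbage in the
    # mask buffer can never leak into the attention scores.
    return beta * (mask if mask is not None else 0.0) + alpha * qk
```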
-
Cyberes authored
* fix PosixPath is not JSON serializable
* use PosixPath
* forgot elif like a dummy
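The `PosixPath is not JSON serializable` error comes from `json` not knowing how to encode `pathlib` objects. One common fix, sketched here as a general pattern rather than the exact approach taken in the commit, is a `default` hook that stringifies paths:

```python
import json
from pathlib import PurePath

def to_json(obj) -> str:
    """Serialize obj, converting any pathlib path to its string form."""
    def encode(o):
        if isinstance(o, PurePath):  # covers PosixPath and WindowsPath
            return str(o)
        raise TypeError(f"not JSON serializable: {type(o).__name__}")
    return json.dumps(obj, default=encode)
```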
-
Patrick von Platen authored
-
Suraj Patil authored
* make scaling factor a config arg of vae
* fix
* make flake happy
* fix ldm
* fix upscaler
* quality
* Apply suggestions from code review
* solve conflicts, address some comments
* examples
* examples min version
* doc
* fix type
* typo
* Update src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py
* remove duplicate line
* Apply suggestions from code review
Co-authored-by: Anton Lozhkov <anton@huggingface.co>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
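Making the scaling factor a VAE config argument means pipelines can read it from the model's config instead of hard-coding the Stable Diffusion constant 0.18215. The round-trip contract, sketched in plain Python with dummy scalars standing in for the real encode/decode tensors:

```python
SCALING_FACTOR = 0.18215  # Stable Diffusion's value; per this commit, read from the VAE config

def to_latents(encoded: float, scaling_factor: float = SCALING_FACTOR) -> float:
    """Scale raw VAE encoder output before it enters the diffusion process."""
    return encoded * scaling_factor

def from_latents(latents: float, scaling_factor: float = SCALING_FACTOR) -> float:
    """Undo the scaling before decoding latents back to pixel space."""
    return latents / scaling_factor
```

The point of the change is that a VAE trained with a different factor keeps working: both directions consult the same config value, so the round-trip stays the identity.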
-
Pedro Cuenca authored
* Allow `UNet2DModel` to use arbitrary class embeddings.
  We can currently use class conditioning in `UNet2DConditionModel`, but not in `UNet2DModel`. However, `UNet2DConditionModel` requires text conditioning too, which is unrelated to other types of conditioning. This commit makes it possible for `UNet2DModel` to be conditioned on entities other than timesteps. This is useful for training / research purposes. We can currently train models to perform unconditional image generation or text-to-image generation, but it's not straightforward to train a model to perform class-conditioned image generation if text conditioning is not required. We could potentially use `UNet2DConditionModel` for class conditioning without text embeddings by using down/up blocks without cross-conditioning. However:
  - The mid block currently requires cross attention.
  - We are required to provide `encoder_hidden_states` to `forward`.
* Style
* Align class conditioning, add docstring for `num_class_embeds`.
* Copy docstring to versatile_diffusion UNetFlatConditionModel
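The class conditioning described above amounts to a learned per-class embedding added to the timestep embedding inside the UNet. A minimal sketch of that conditioning sum, with embeddings reduced to single floats (illustrative only, not the diffusers internals):

```python
from typing import Optional

def conditioned_embedding(time_emb: float, class_emb_table: list,
                          class_label: Optional[int] = None) -> float:
    """Timestep embedding, plus a class embedding when a label is given.

    Mirrors the idea behind `num_class_embeds`: the table holds one learned
    entry per class, and conditioning is a simple addition to the timestep
    embedding, with no text encoder involved.
    """
    if class_label is None:
        return time_emb  # unconditional: timestep embedding only
    return time_emb + class_emb_table[class_label]
```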
-
Pedro Cuenca authored
* [textual inversion] Allow validation images.
* Change key to `validation`
* Specify format instead of transposing, as discussed with @sayakpaul.
* Style
Co-authored-by: isamu-isozaki <isamu.website@gmail.com>
-
- 25 Jan, 2023 10 commits
-
Suraj Patil authored
check the dtype before preparing model
-
Patrick von Platen authored
* [Bump version] 0.13 * Bump model up * up
-
Patrick von Platen authored
-
Oren WANG authored
-
Patrick von Platen authored
-
Patrick von Platen authored
* make tests deterministic * run slow tests * prepare for testing * finish * refactor * add print statements * finish more * correct some test failures * more fixes * set up to correct tests * more corrections * up * fix more * more prints * add * up * up * up * uP * uP * more fixes * uP * up * up * up * up * fix more * up * up * clean tests * up * up * up * more fixes * Apply suggestions from code review * make * correct * finish * finish
Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
Patrick von Platen authored
-
Patrick von Platen authored
* add text embeds to sd * add text embeds to sd * finish tests * finish * finish * make style * fix tests * make style * make style * up * better docs * fix * fix * new try * up * up * finish
-
Sayak Paul authored
* add: a doc on LoRA support in diffusers.
* Apply suggestions from code review
* apply PR suggestions.
* Apply suggestions from code review
* remove visually incoherent elements.
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
-
apolinario authored
* Add `lora` tag to the model tags, for LoRA training
* uP
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-