- 21 Sep, 2022 7 commits
-
-
Anton Lozhkov authored
-
Pedro Cuenca authored
* Optionally return state in `from_config`. Useful for Flax schedulers.
* `has_state` is now a property; make the check more strict. I don't check that the class is `SchedulerMixin`, to prevent circular dependencies. It should be enough that the class name starts with "Flax", the object declares `has_state`, and `create_state` exists too.
* Use state in pipeline `from_pretrained`.
* Make style
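The duck-typed check described above can be sketched roughly as follows (the helper name `needs_state` and the dummy scheduler class are hypothetical; the actual check lives inside diffusers' `from_config` machinery):

```python
# Hypothetical sketch of the duck-typed check: instead of an isinstance()
# test against SchedulerMixin (which would create a circular import),
# inspect the class name and the attributes the object declares.

class FlaxFakeScheduler:
    """Stand-in for a Flax scheduler that carries external state."""
    has_state = True

    def create_state(self):
        return {"step": 0}

def needs_state(obj) -> bool:
    cls = obj.__class__
    return (
        cls.__name__.startswith("Flax")
        and getattr(obj, "has_state", False)
        and callable(getattr(obj, "create_state", None))
    )

scheduler = FlaxFakeScheduler()
print(needs_state(scheduler))  # True: from_config should also return a state
```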
-
Younes Belkada authored
Replace `dropout_prob` with `dropout` in `vae`
-
Mishig Davaadorj authored
-
Mishig Davaadorj authored
-
Pedro Cuenca authored
* Fix typo in docstring.
* Allow dtype to be overridden on model load. This may be a temporary solution until #567 is addressed.
* Create latents in float32. The denoising loop always computes the next step in float32, so this would fail when using `bfloat16`.
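The float32-latents fix can be sketched like this (numpy is used purely for illustration; the real code uses `jax.numpy`, and `float16` stands in for `bfloat16`, which numpy does not provide):

```python
import numpy as np

# Sketch of the fix described above: latents are created and denoised in
# float32 regardless of the dtype the weights were loaded in, and only
# cast down once the loop has finished.

model_dtype = np.float16            # stand-in for the model's bfloat16
rng = np.random.default_rng(0)

latents = rng.standard_normal((1, 4, 64, 64)).astype(np.float32)

for _ in range(3):                  # stand-in for the denoising loop
    noise_pred = latents * 0.1      # a real step would call the UNet here
    latents = (latents - noise_pred).astype(np.float32)

image = latents.astype(model_dtype)  # cast to the model dtype only at the end
print(latents.dtype, image.dtype)    # float32 float16
```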
-
Pedro Cuenca authored
Fix params replication when using the dummy checker.
-
- 20 Sep, 2022 7 commits
-
-
Patrick von Platen authored
* [Flax] Fix unet and ddim scheduler * correct * finish
-
Mishig Davaadorj authored
* WIP: flax FlaxDiffusionPipeline & FlaxStableDiffusionPipeline
* todo comment
* Fix imports
* Fix imports
* add dummies
* Fix empty init
* make pipeline work
* up
* Use Flax schedulers (typing, docstring)
* Wrap model imports inside availability checks.
* more updates
* make sure flax is not broken
* make style
* more fixes
* up

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@latenitesoft.com>
-
Suraj Patil authored
* use FlaxPreTrainedModel for flax safety module * fix name * fix one more * Apply suggestions from code review
-
Anton Lozhkov authored
* Add the K-LMS scheduler to the inpainting pipeline + tests * Remove redundant casts
-
Anton Lozhkov authored
* Fix BaseOutput initialization from dict * style * Simplify post-init, add tests * remove debug
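The dict-initialization behavior being fixed can be sketched as a dataclass/`OrderedDict` hybrid (this is a simplified, hypothetical reconstruction; the real `BaseOutput` in `diffusers.utils` is richer):

```python
from collections import OrderedDict
from dataclasses import dataclass, fields

# Simplified sketch of the BaseOutput pattern: a dataclass whose post-init
# also populates an OrderedDict view, so every field is reachable both as
# an attribute and as a dict key.

class BaseOutput(OrderedDict):
    def __post_init__(self):
        for field in fields(self):
            value = getattr(self, field.name)
            if value is not None:
                self[field.name] = value

    def __getattr__(self, name):
        if name in self:
            return self[name]
        raise AttributeError(name)

@dataclass
class SchedulerOutput(BaseOutput):
    prev_sample: list = None

out = SchedulerOutput(prev_sample=[1, 2, 3])
print(out["prev_sample"], out.prev_sample)  # [1, 2, 3] [1, 2, 3]
```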
-
Suraj Patil authored
* rename weights to align with PT * DiagonalGaussianDistribution => FlaxDiagonalGaussianDistribution * fix name
-
Younes Belkada authored
* first commit:
  - add `from_pt` argument in the `from_pretrained` function
  - add `modeling_flax_pytorch_utils.py` file
* small nit
  - fix a small nit so we do not enter the second if condition
* major changes
  - modify FlaxUnet modules
  - first conversion script
  - more keys to be matched
* keys match
  - now all keys match
  - change module names for correct matching
  - upsample module name changed
* working v1
  - tests pass with atol and rtol = `4e-02`
* replace unused arg
* make quality
* add small docstring
* add more comments
  - add TODO for embedding layers
* small change
  - use `jnp.expand_dims` for converting `timesteps` in case it is a 0-dimensional array
* add more conditions on conversion
  - add better test to check for keys conversion
* make shapes consistent
  - output `img_w x img_h x n_channels` from the VAE
* Revert "make shapes consistent" This reverts commit 4cad1aeb4aeb224402dad13c018a5d42e96267f6.
* fix unet shape
  - channels first!
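The 0-dimensional `timesteps` handling mentioned above can be sketched as follows (numpy shown for illustration; the commit uses `jax.numpy`, whose `expand_dims` has the same signature, and the helper name is hypothetical):

```python
import numpy as np

# Sketch: a bare scalar timestep like `999` is a 0-dimensional array, which
# cannot be broadcast against the batch dimension, so it is expanded to
# shape (1,) before use.

def broadcast_timesteps(timesteps):
    timesteps = np.asarray(timesteps, dtype=np.float32)
    if timesteps.ndim == 0:
        timesteps = np.expand_dims(timesteps, 0)
    return timesteps

print(broadcast_timesteps(999).shape)         # (1,)
print(broadcast_timesteps([999, 999]).shape)  # (2,)
```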
-
- 19 Sep, 2022 11 commits
-
-
Yuta Hayashibe authored
* Fix a setting bug * Fix typos * Reverted params to parms
-
Yih-Dar authored
* Fix CrossAttention._sliced_attention

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Anton Lozhkov authored
* make_reports * add test utils * style * style
-
Patrick von Platen authored
-
Patrick von Platen authored
* [Flax] Add Vae
* correct
* Apply suggestions from code review
* Finish

Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
Yih-Dar authored
* Fix _upsample_2d

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Pedro Cuenca authored
Flax: ignore dtype for configuration. This makes it possible to save models and configuration files.
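The "ignore dtype" behavior can be sketched as filtering non-config arguments before serialization (the helper name and key set here are illustrative, not the actual diffusers API):

```python
import json

# Illustrative sketch: `dtype` is an argument of Flax modules but is not
# really part of the model configuration, so it is dropped before the
# config is written to disk.

IGNORED_KEYS = {"dtype"}

def to_config_json(init_kwargs: dict) -> str:
    config = {k: v for k, v in init_kwargs.items() if k not in IGNORED_KEYS}
    return json.dumps(config, indent=2, sort_keys=True)

serialized = to_config_json({"sample_size": 64, "in_channels": 4, "dtype": "bfloat16"})
print("dtype" in serialized)  # False
```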
-
Pedro Cuenca authored
* Starting to integrate safety checker.
* Fix initialization of CLIPVisionConfig
* Remove commented lines.
* make style
* Remove unused import
* Pass dtype to modules
* Pass dtype to modules

Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
Kashif Rasul authored
* remove match_shape * ported fixes from #479 to flax * remove unused argument * typo * remove warnings
-
ydshieh authored
-
ydshieh authored
-
- 18 Sep, 2022 1 commit
-
-
Mishig Davaadorj authored
-
- 17 Sep, 2022 2 commits
-
-
Patrick von Platen authored
* [Config] improve logging * finish
-
Jonatan Kłosko authored
* Unify offset configuration in DDIM and PNDM schedulers
* Format
* Add missing variables
* Fix pipeline test
* Update src/diffusers/schedulers/scheduling_ddim.py
* Default set_alpha_to_one to false
* Format
* Add tests
* Format
* add deprecation warning

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
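The deprecation path for the unified offset setting can be sketched like this (the helper name `resolve_steps_offset` is hypothetical; the real schedulers fold this into their `set_timesteps`/config handling):

```python
import warnings

# Hypothetical sketch: the old per-call `offset` argument is mapped onto
# the unified `steps_offset` configuration value, with a warning so
# callers can migrate.

def resolve_steps_offset(steps_offset: int = 0, **kwargs) -> int:
    if "offset" in kwargs:
        warnings.warn(
            "`offset` is deprecated; pass `steps_offset` to the scheduler "
            "configuration instead.",
            DeprecationWarning,
        )
        steps_offset = kwargs.pop("offset")
    return steps_offset

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    print(resolve_steps_offset(offset=1))  # 1
    print(len(caught))                     # 1 deprecation warning emitted
```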
-
- 16 Sep, 2022 9 commits
-
-
Patrick von Platen authored
-
Patrick von Platen authored
* [Download] Smart downloading * add test * finish test * update * make style
-
Patrick von Platen authored
Revert "adding more typehints to DDIM scheduler (#456)"

This reverts commit a0558b11.
-
V Vishnu Anirudh authored
* adding more typehints
* resolving mypy issues
* resolving formatting issue
* fixing isort issue

Co-authored-by: V Vishnu Anirudh <git.vva@gmail.com>
Co-authored-by: V Vishnu Anirudh <vvani@kth.se>
-
Suraj Patil authored
* accept tensors * fix mask handling * make device placement cleaner * update doc for mask image
-
Yuta Hayashibe authored
* Fix typos
* Add a typo check action
* Fix a bug
* Changed to manual typo check currently. Ref: https://github.com/huggingface/diffusers/pull/483#pullrequestreview-1104468010
* Removed a confusing message
* Renamed "nin_shortcut" to "in_shortcut"
* Add memo about NIN

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>
-
Yih-Dar authored
* Fix PT up/down sample_2d
* empty commit
* style
* style

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Anton Lozhkov authored
* Finally fix the image-based SD tests * Remove autocast * Remove autocast in image tests
-
SkyTNT authored
* Fix `is_onnx_available`: if the user installs `onnxruntime-gpu`, `is_onnx_available()` would return False.
* add more onnxruntime candidates
* Run `make style`

Co-authored-by: anton-l <anton@huggingface.co>
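The candidate-based availability check can be sketched as follows (simplified from the real diffusers helper; the candidate tuple here is illustrative):

```python
import importlib.util
from importlib import metadata

# Sketch: the import name `onnxruntime` is shared by several PyPI
# distributions (onnxruntime, onnxruntime-gpu, ...), so availability means
# the module is importable AND one of the known distributions is installed.

_CANDIDATES = ("onnxruntime", "onnxruntime-gpu", "onnxruntime-directml")

def is_onnx_available() -> bool:
    if importlib.util.find_spec("onnxruntime") is None:
        return False
    for dist in _CANDIDATES:
        try:
            metadata.version(dist)
            return True
        except metadata.PackageNotFoundError:
            continue
    return False

print(is_onnx_available())  # depends on the environment
```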
-
- 15 Sep, 2022 3 commits
-
-
Pedro Cuenca authored
* First UNet Flax modeling blocks. Mimic the structure of the PyTorch files. The model classes themselves need work, depending on what we do about configuration and initialization.
* Remove FlaxUNet2DConfig class.
* ignore_for_config non-config args.
* Implement `FlaxModelMixin`
* Use new mixins for Flax UNet. For some reason the configuration is not correctly applied; the signature of the `__init__` method does not contain all the parameters by the time it's inspected in `extract_init_dict`.
* Import `FlaxUNet2DConditionModel` if flax is available.
* Rm unused method `framework`
* Update src/diffusers/modeling_flax_utils.py
* Indicate types in flax.struct.dataclass as pointed out by @mishig25
* Fix typo in transformer block.
* make style
* some more changes
* make style
* Add comment
* Update src/diffusers/modeling_flax_utils.py
* Rm unneeded comment
* Update docstrings
* correct ignore kwargs
* make style
* Update docstring examples
* Make style
* Style: remove empty line.
* Apply style (after upgrading black from pinned version)
* Remove some commented code and unused imports.
* Add init_weights (not yet in use until #513).
* Trickle down deterministic to blocks.
* Rename q, k, v according to the latest PyTorch version. Note that weights were exported with the old names, so we need to be careful.
* Flax UNet docstrings, default props as in PyTorch.
* Fix minor typos in PyTorch docstrings.
* Use FlaxUNet2DConditionOutput as output from UNet.
* make style

Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
Co-authored-by: Suraj Patil <surajp815@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
-
Mishig Davaadorj authored
* Add `init_weights` method to `FlaxMixin`
* Rn `random_state` -> `shape_state`
* `PRNGKey(0)` for `jax.eval_shape`
* No allow mismatched sizes
* Update src/diffusers/modeling_flax_utils.py
* Update src/diffusers/modeling_flax_utils.py
* docstring diffusers

Co-authored-by: Suraj Patil <surajp815@gmail.com>
-
Suraj Patil authored
* pass norm_num_groups to unet blocs and attention * fix UNet2DConditionModel * add norm_num_groups arg in vae * add tests * remove comment * Apply suggestions from code review
-