"examples/vscode:/vscode.git/clone" did not exist on "43927851378b0cd3b3f68699262a1e6d887e5f08"
  1. 19 Sep, 2022 11 commits
  2. 18 Sep, 2022 1 commit
  3. 17 Sep, 2022 2 commits
  4. 16 Sep, 2022 9 commits
  5. 15 Sep, 2022 4 commits
    • UNet Flax with FlaxModelMixin (#502) · d8b0e4f4
      Pedro Cuenca authored
      
      
      * First UNet Flax modeling blocks.
      
      Mimic the structure of the PyTorch files.
      The model classes themselves need work, depending on what we do about
      configuration and initialization.
      
      * Remove FlaxUNet2DConfig class.
      
      * ignore_for_config non-config args.
      
      * Implement `FlaxModelMixin`
      
      * Use new mixins for Flax UNet.
      
      For some reason the configuration is not correctly applied; the
      signature of the `__init__` method does not contain all the parameters
      by the time it's inspected in `extract_init_dict`.
      
      * Import `FlaxUNet2DConditionModel` if flax is available.
      
      * Rm unused method `framework`
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Indicate types in flax.struct.dataclass as pointed out by @mishig25
      Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
      
      * Fix typo in transformer block.
      
      * make style
      
      * some more changes
      
      * make style
      
      * Add comment
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * Rm unneeded comment
      
      * Update docstrings
      
      * correct ignore kwargs
      
      * make style
      
      * Update docstring examples
      
      * Make style
      
      * Style: remove empty line.
      
      * Apply style (after upgrading black from pinned version)
      
      * Remove some commented code and unused imports.
      
      * Add init_weights (not yet in use until #513).
      
      * Trickle down deterministic to blocks.
      
      * Rename q, k, v according to the latest PyTorch version.
      
      Note that weights were exported with the old names, so we need to be
      careful.
      
      * Flax UNet docstrings, default props as in PyTorch.
      
      * Fix minor typos in PyTorch docstrings.
      
      * Use FlaxUNet2DConditionOutput as output from UNet.
      
      * make style
      Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
      Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
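      The pattern this commit describes — init args doubling as the configuration, a `flax.struct.dataclass` output with typed fields, and `deterministic` trickled down to the blocks — looks roughly like the sketch below (class and argument names here are illustrative, not the actual diffusers code):

      ```python
      import flax
      import flax.linen as nn
      import jax
      import jax.numpy as jnp


      @flax.struct.dataclass
      class FlaxUNetOutput:
          sample: jnp.ndarray  # typed field, as flax.struct.dataclass requires


      class TinyFlaxUNet(nn.Module):
          # Init args double as the configuration; non-config args such as `dtype`
          # would be excluded via something like `ignore_for_config` in the real mixin.
          sample_size: int = 32
          block_out_channels: int = 64
          dtype: jnp.dtype = jnp.float32

          @nn.compact
          def __call__(self, sample, deterministic: bool = True) -> FlaxUNetOutput:
              # `deterministic` is trickled down to the blocks that use dropout
              hidden = nn.Conv(self.block_out_channels, (3, 3), padding="SAME", dtype=self.dtype)(sample)
              hidden = nn.Dropout(rate=0.1)(hidden, deterministic=deterministic)
              out = nn.Conv(sample.shape[-1], (3, 3), padding="SAME", dtype=self.dtype)(hidden)
              return FlaxUNetOutput(sample=out)


      rng = jax.random.PRNGKey(0)
      model = TinyFlaxUNet()
      params = model.init(rng, jnp.zeros((1, 32, 32, 3)))  # NHWC sample
      out = model.apply(params, jnp.zeros((1, 32, 32, 3)))
      print(out.sample.shape)  # (1, 32, 32, 3)
      ```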
    • Add `init_weights` method to `FlaxMixin` (#513) · fb5468a6
      Mishig Davaadorj authored
      
      
      * Add `init_weights` method to `FlaxMixin`
      
      * Rename `random_state` -> `shape_state`
      
      * `PRNGKey(0)` for `jax.eval_shape`
      
      * Don't allow mismatched sizes
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * docstring diffusers
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
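      A rough sketch of the idea behind `init_weights` and the shape pass (illustrative code, not the real `FlaxModelMixin`): `jax.eval_shape` with a throwaway `PRNGKey(0)` yields parameter shapes without allocating device memory, and loaded weights can then be checked against those shapes so mismatched sizes are rejected.

      ```python
      import jax
      import jax.numpy as jnp
      import flax.linen as nn


      class TinyModule(nn.Module):
          features: int = 16

          @nn.compact
          def __call__(self, x):
              return nn.Dense(self.features)(x)


      def init_weights(module, rng, sample_shape=(1, 8)):
          # real initialization with a caller-provided key
          return module.init(rng, jnp.zeros(sample_shape))


      module = TinyModule()

      # Shape-only pass: PRNGKey(0) is fine here because no real values are produced.
      shape_state = jax.eval_shape(lambda rng: init_weights(module, rng), jax.random.PRNGKey(0))
      expected = jax.tree_util.tree_map(lambda leaf: leaf.shape, shape_state)

      # Actual initialization uses a real key.
      params = init_weights(module, jax.random.PRNGKey(42))
      loaded = jax.tree_util.tree_map(lambda leaf: leaf.shape, params)

      assert expected == loaded  # a mismatched size fails loudly instead of being ignored
      ```

      Since `jax.eval_shape` only traces the function, even very large models can report their expected parameter shapes cheaply.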
    • [UNet2DConditionModel, UNet2DModel] pass norm_num_groups to all the blocks (#442) · d144c46a
      Suraj Patil authored
      * pass norm_num_groups to unet blocks and attention
      
      * fix UNet2DConditionModel
      
      * add norm_num_groups arg in vae
      
      * add tests
      
      * remove comment
      
      * Apply suggestions from code review
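      The change amounts to threading a single constructor argument down to every normalization layer instead of hard-coding 32 groups; a minimal PyTorch sketch of the idea (class names are illustrative, not the actual UNet code):

      ```python
      import torch
      from torch import nn
      import torch.nn.functional as F


      class ResnetBlock(nn.Module):
          def __init__(self, channels: int, norm_num_groups: int = 32):
              super().__init__()
              # the group count is no longer hard-coded
              self.norm = nn.GroupNorm(num_groups=norm_num_groups, num_channels=channels)
              self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

          def forward(self, x):
              return x + self.conv(F.silu(self.norm(x)))


      class TinyUNet(nn.Module):
          def __init__(self, channels: int = 64, norm_num_groups: int = 32):
              super().__init__()
              # the same value is forwarded to every block (and, in the real models,
              # to the attention and VAE blocks as well)
              self.blocks = nn.ModuleList(
                  [ResnetBlock(channels, norm_num_groups=norm_num_groups) for _ in range(2)]
              )

          def forward(self, x):
              for block in self.blocks:
                  x = block(x)
              return x


      model = TinyUNet(channels=64, norm_num_groups=8)  # channels must divide evenly by the group count
      out = model(torch.randn(1, 64, 16, 16))
      ```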
    • Karras VE, DDIM and DDPM flax schedulers (#508) · b34be039
      Kashif Rasul authored
      * beta never changes removed from state
      
      * fix typos in docs
      
      * removed unused var
      
      * initial ddim flax scheduler
      
      * import
      
      * added dummy objects
      
      * fix style
      
      * fix typo
      
      * docs
      
      * fix typo in comment
      
      * set return type
      
      * added flax ddpm
      
      * fix style
      
      * remake
      
      * pass PRNG key as argument and split before use
      
      * fix doc string
      
      * use config
      
      * added flax Karras VE scheduler
      
      * make style
      
      * fix dummy
      
      * fix ndarray type annotation
      
      * `replace` returns a new state
      
      * added lms_discrete scheduler
      
      * use self.config
      
      * add_noise needs state
      
      * use config
      
      * use config
      
      * docstring
      
      * added flax score sde ve
      
      * fix imports
      
      * fix typos
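      These Flax schedulers follow the stateless pattern the bullets above describe: anything that changes lives in an immutable state object, `replace` returns a new state rather than mutating, and PRNG keys are passed in and split before use. A toy sketch of that pattern (not the diffusers implementation):

      ```python
      import flax
      import jax
      import jax.numpy as jnp


      @flax.struct.dataclass
      class SchedulerState:
          # everything that changes between calls lives here; betas never change,
          # so they are derived from the config instead of being stored in state
          timesteps: jnp.ndarray
          num_inference_steps: int


      class ToySchedulerFlax:
          def __init__(self, num_train_timesteps=1000, beta_start=1e-4, beta_end=2e-2):
              self.betas = jnp.linspace(beta_start, beta_end, num_train_timesteps)
              self.alphas_cumprod = jnp.cumprod(1.0 - self.betas)

          def set_timesteps(self, state: SchedulerState, num_inference_steps: int) -> SchedulerState:
              timesteps = jnp.arange(num_inference_steps)[::-1]
              # `replace` returns a brand-new state; the old one is never mutated
              return state.replace(timesteps=timesteps, num_inference_steps=num_inference_steps)

          def add_noise(self, state, sample, noise, t):
              # state is threaded through to mirror the real API, where derived tensors live in it
              alpha_prod = self.alphas_cumprod[t]
              return jnp.sqrt(alpha_prod) * sample + jnp.sqrt(1.0 - alpha_prod) * noise

          def step(self, state, model_output, t, sample, key):
              # the PRNG key is passed in by the caller and split before use
              noise_key, _ = jax.random.split(key)
              noise = jax.random.normal(noise_key, sample.shape)
              alpha_prod = self.alphas_cumprod[t]
              pred_original = (sample - jnp.sqrt(1.0 - alpha_prod) * model_output) / jnp.sqrt(alpha_prod)
              return pred_original + jnp.sqrt(self.betas[t]) * noise


      state = SchedulerState(timesteps=jnp.array([], dtype=jnp.int32), num_inference_steps=0)
      scheduler = ToySchedulerFlax()
      state = scheduler.set_timesteps(state, num_inference_steps=10)
      ```

      Because nothing is mutated, the same scheduler and state can be passed through `jax.jit`-compiled sampling loops without hidden side effects.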
  6. 14 Sep, 2022 4 commits
  7. 13 Sep, 2022 6 commits
  8. 12 Sep, 2022 1 commit
    • update expected results of slow tests (#268) · f4781a0b
      Kashif Rasul authored
      
      
      * update expected results of slow tests
      
      * relax sum and mean tests
      
      * Print shapes when reporting exception
      
      * formatting
      
      * fix sentence
      
      * relax test_stable_diffusion_fast_ddim for gpu fp16
      
      * relax flaky tests on GPU
      
      * added comment on large tolerances
      
      * black
      
      * format
      
      * set scheduler seed
      
      * added generator
      
      * use np.isclose
      
      * set num_inference_steps to 50
      
      * fix dep. warning
      
      * update expected_slice
      
      * preprocess if image
      
      * updated expected results
      
      * updated expected from CI
      
      * pass generator to VAE
      
      * undo change back to orig
      
      * use original
      
      * revert back the expected on cpu
      
      * revert back values for CPU
      
      * more undo
      
      * update result after using gen
      
      * update mean
      
      * set generator for mps
      
      * update expected on CI server
      
      * undo
      
      * use new seed every time
      
      * cpu manual seed
      
      * reduce num_inference_steps
      
      * style
      
      * use generator for randn
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
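      A hedged sketch of the relaxed-tolerance testing pattern these changes describe (the helper, the commented-out pipeline call, and the numbers are placeholders, not real diffusers API calls or CI values):

      ```python
      import numpy as np
      import torch


      def assert_close_enough(image: np.ndarray, expected_slice: np.ndarray, expected_mean: float):
          # compare a small corner slice and the global mean with loose tolerances,
          # since fp16 GPU runs drift from the CPU reference values
          image_slice = image[0, -3:, -3:, -1].flatten()
          assert np.isclose(image_slice, expected_slice, atol=1e-2).all()
          assert np.isclose(image.mean(), expected_mean, atol=1e-3)


      # reproducibility: a fixed-seed generator is passed to the pipeline (and to the VAE latents)
      generator = torch.Generator(device="cpu").manual_seed(0)
      # image = pipe(prompt, generator=generator, num_inference_steps=50, output_type="np").images

      # stand-in data so the sketch runs on its own
      image = np.full((1, 64, 64, 3), 0.5, dtype=np.float32)
      assert_close_enough(image, expected_slice=np.full(9, 0.5), expected_mean=0.5)
      ```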
  9. 09 Sep, 2022 2 commits