"vscode:/vscode.git/clone" did not exist on "452339e20eb02ed0fc79fec6cafd38141219a187"
  1. 24 Oct, 2022 1 commit
  2. 20 Oct, 2022 2 commits
  3. 19 Oct, 2022 2 commits
  4. 18 Oct, 2022 1 commit
  5. 14 Oct, 2022 1 commit
  6. 13 Oct, 2022 1 commit
  7. 12 Oct, 2022 1 commit
  8. 11 Oct, 2022 1 commit
    • Flax: Trickle down `norm_num_groups` (#789) · a1242044
      Akash Pannu authored
      * pass norm_num_groups param and add tests
      
      * set resnet_groups for FlaxUNetMidBlock2D
      
      * fixed docstrings
      
      * fixed typo
      
      * using is_flax_available util and created require_flax decorator
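This commit threads a single `norm_num_groups` value from the top-level config down into the group-norm layers of the Flax sub-blocks (and sets `resnet_groups` on `FlaxUNetMidBlock2D`). Below is a minimal flax.linen sketch of that trickle-down pattern; the block names are illustrative, not the actual diffusers classes.

```python
# Minimal flax.linen sketch of "trickling down" a norm_num_groups setting.
# Block names are illustrative; the real diffusers Flax blocks differ.
import flax.linen as nn


class FlaxResnetBlock(nn.Module):
    out_channels: int
    groups: int = 32  # receives the value passed down from the parent

    @nn.compact
    def __call__(self, x):
        h = nn.GroupNorm(num_groups=self.groups)(x)
        h = nn.swish(h)
        return nn.Conv(self.out_channels, kernel_size=(3, 3), padding="SAME")(h)


class FlaxDownBlock(nn.Module):
    out_channels: int
    norm_num_groups: int = 32  # top-level config value

    @nn.compact
    def __call__(self, x):
        # Pass the configured group count down explicitly instead of letting
        # the child fall back to its own hard-coded default.
        return FlaxResnetBlock(self.out_channels, groups=self.norm_num_groups)(x)
```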
  9. 06 Oct, 2022 1 commit
  10. 05 Oct, 2022 1 commit
  11. 04 Oct, 2022 2 commits
    • Fix import if PyTorch is not installed (#715) · 215bb408
      Pedro Cuenca authored
      * Fix import if PyTorch is not installed.
      
      * Style (blank line)
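The fix is the usual guarded-import pattern: torch-dependent code is only reached when PyTorch can actually be imported, so the rest of the package still loads without it. A simplified, self-contained sketch of that pattern follows; the `to_tensor` helper is hypothetical and only stands in for the torch-dependent code path.

```python
# Simplified sketch of guarding torch-dependent code behind an availability
# check so the package still imports when PyTorch is missing.
# `to_tensor` is a hypothetical helper standing in for torch-dependent code.
import importlib.util


def is_torch_available() -> bool:
    # True only if a "torch" module can be found in the current environment.
    return importlib.util.find_spec("torch") is not None


if is_torch_available():
    import torch

    def to_tensor(x):
        return torch.as_tensor(x)
else:
    def to_tensor(x):
        raise ImportError("This helper requires PyTorch; please install torch.")
```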
    • add accelerate to load models with smaller memory footprint (#361) · 4d1cce2f
      Pi Esposito authored
      * add accelerate to load models with smaller memory footprint
      
      * remove low_cpu_mem_usage as it is redundant
      
      * move accelerate init weights context to modeling utils
      
      * add test to ensure results are the same when loading with accelerate
      
      * add tests to ensure ram usage gets lower when using accelerate
      
      * move accelerate logic to a single snippet under modeling utils and remove it from configuration utils
      
      * format code to pass quality check
      
      * fix imports with isort
      
      * add accelerate to test extra deps
      
      * only import accelerate if device_map is set to auto
      
      * move accelerate availability check to diffusers import utils
      
      * format code
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
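This series makes accelerate an optional dependency: it is imported only when `device_map="auto"` is requested, and accelerate's `init_empty_weights` context builds the module skeleton without allocating real parameter memory. A rough sketch of that flow follows; `build_model` is a stand-in, not the actual diffusers `from_pretrained` logic.

```python
# Rough sketch of loading a model skeleton with accelerate's empty-weights
# init, importing accelerate only when device_map="auto" is requested.
# `build_model` is illustrative, not the actual diffusers from_pretrained logic.
import torch.nn as nn


def build_model(device_map=None):
    if device_map == "auto":
        # Imported lazily so accelerate stays an optional dependency.
        from accelerate import init_empty_weights

        with init_empty_weights():
            # Parameters are created on the "meta" device, so no RAM is
            # consumed until real weights are loaded into the module later.
            model = nn.Linear(1024, 1024)
    else:
        model = nn.Linear(1024, 1024)  # ordinary initialization
    return model
```

Loading the real checkpoint tensors into the empty module happens afterwards; the context manager only avoids the up-front allocation, which is what the RAM-usage tests mentioned above check.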
  12. 03 Oct, 2022 1 commit
  13. 24 Sep, 2022 1 commit
  14. 21 Sep, 2022 1 commit
  15. 20 Sep, 2022 2 commits
  16. 16 Sep, 2022 1 commit
  17. 15 Sep, 2022 2 commits
    • UNet Flax with FlaxModelMixin (#502) · d8b0e4f4
      Pedro Cuenca authored
      * First UNet Flax modeling blocks.
      
      Mimic the structure of the PyTorch files.
      The model classes themselves need work, depending on what we do about
      configuration and initialization.
      
      * Remove FlaxUNet2DConfig class.
      
      * ignore_for_config non-config args.
      
      * Implement `FlaxModelMixin`
      
      * Use new mixins for Flax UNet.
      
      For some reason the configuration is not correctly applied; the
      signature of the `__init__` method does not contain all the parameters
      by the time it's inspected in `extract_init_dict`.
      
      * Import `FlaxUNet2DConditionModel` if flax is available.
      
      * Rm unused method `framework`
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Indicate types in flax.struct.dataclass as pointed out by @mishig25
      Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
      
      * Fix typo in transformer block.
      
      * make style
      
      * some more changes
      
      * make style
      
      * Add comment
      
      * Update src/diffusers/modeling_flax_utils.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * Rm unneeded comment
      
      * Update docstrings
      
      * correct ignore kwargs
      
      * make style
      
      * Update docstring examples
      
      * Make style
      
      * Style: remove empty line.
      
      * Apply style (after upgrading black from pinned version)
      
      * Remove some commented code and unused imports.
      
      * Add init_weights (not yet in use until #513).
      
      * Trickle down deterministic to blocks.
      
      * Rename q, k, v according to the latest PyTorch version.
      
      Note that weights were exported with the old names, so we need to be
      careful.
      
      * Flax UNet docstrings, default props as in PyTorch.
      
      * Fix minor typos in PyTorch docstrings.
      
      * Use FlaxUNet2DConditionOutput as output from UNet.
      
      * make style
      Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
      Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
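One concrete detail from this PR worth illustrating is the typed `flax.struct.dataclass` used as the UNet's output container. A short sketch in that spirit; the `sample` field is assumed here to mirror the PyTorch output class.

```python
# Sketch of a typed flax.struct.dataclass used as a model output container,
# in the spirit of FlaxUNet2DConditionOutput. The `sample` field is assumed
# to mirror the PyTorch output class.
import jax.numpy as jnp
from flax import struct


@struct.dataclass
class UNetOutput:
    # The annotation makes the field an explicit, typed pytree leaf, which is
    # what "indicate types in flax.struct.dataclass" refers to.
    sample: jnp.ndarray


out = UNetOutput(sample=jnp.zeros((1, 4, 64, 64)))
print(out.sample.shape)  # (1, 4, 64, 64)
```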
    • Karras VE, DDIM and DDPM flax schedulers (#508) · b34be039
      Kashif Rasul authored
      * beta never changes, so removed from state
      
      * fix typos in docs
      
      * removed unused var
      
      * initial ddim flax scheduler
      
      * import
      
      * added dummy objects
      
      * fix style
      
      * fix typo
      
      * docs
      
      * fix typo in comment
      
      * set return type
      
      * added flax ddpm
      
      * fix style
      
      * remake
      
      * pass PRNG key as argument and split before use
      
      * fix doc string
      
      * use config
      
      * added flax Karras VE scheduler
      
      * make style
      
      * fix dummy
      
      * fix ndarray type annotation
      
      * `replace` returns a new state
      
      * added lms_discrete scheduler
      
      * use self.config
      
      * add_noise needs state
      
      * use config
      
      * use config
      
      * docstring
      
      * added flax score sde ve
      
      * fix imports
      
      * fix typos
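The recurring theme in this scheduler port is statelessness: anything that changes lives in an explicit state object, constants such as beta stay in `self.config`, every call returns a new state rather than mutating, and PRNG keys are passed in and split before use. A minimal sketch of that pattern with illustrative names, not the actual diffusers scheduler API:

```python
# Minimal sketch of the stateless Flax scheduler pattern: the caller owns the
# state and the PRNG key, and every call returns a new state instead of
# mutating. Names are illustrative, not the actual diffusers scheduler API.
import jax
import jax.numpy as jnp
from flax import struct


@struct.dataclass
class SchedulerState:
    timesteps: jnp.ndarray  # values that change live in the state, not on the class


def set_timesteps(state: SchedulerState, num_steps: int) -> SchedulerState:
    # `replace` builds a new immutable state; nothing is modified in place.
    return state.replace(timesteps=jnp.arange(num_steps)[::-1])


def add_noise(state: SchedulerState, sample: jnp.ndarray, key: jnp.ndarray) -> jnp.ndarray:
    # Split the incoming key before use so the caller's key stays reusable.
    key, noise_key = jax.random.split(key)
    # A real scheduler would scale the noise by values derived from the state/config.
    return sample + jax.random.normal(noise_key, sample.shape)


state = SchedulerState(timesteps=jnp.arange(1000)[::-1])
state = set_timesteps(state, 50)
noisy = add_noise(state, jnp.zeros((1, 3, 8, 8)), jax.random.PRNGKey(0))
```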
  18. 14 Sep, 2022 1 commit
  19. 13 Sep, 2022 2 commits
  20. 08 Sep, 2022 3 commits
  21. 05 Sep, 2022 1 commit
  22. 01 Sep, 2022 1 commit
  23. 31 Aug, 2022 2 commits
  24. 30 Aug, 2022 1 commit
  25. 22 Aug, 2022 2 commits
  26. 17 Aug, 2022 2 commits
  27. 16 Aug, 2022 2 commits
  28. 14 Aug, 2022 1 commit