"src/vscode:/vscode.git/clone" did not exist on "1a69c6ff0e76e8e025213a34eb298ca1428d6fc3"
  1. 28 Sep, 2022 (1 commit)
  2. 27 Sep, 2022 (10 commits)
    • Fix `main`: stable diffusion pipelines cannot be loaded (#655) · 235770dd
      Pedro Cuenca authored
      * Replace deprecation warning f-string with class name.
      
      When `__repr__` is invoked on the instance, serialization of
      `config_dict` fails because it contains `kwargs` of type
      `<class inspect._empty>`.
      
      * Revert "Replace deprecation warning f-string with class name."
      
      This reverts commit 1c4eb8cb104374bd84e43865fc3865862473799c.
      
      * Do not attempt to register `"kwargs"` as an attribute.
      
      Otherwise serialization could fail.
      This may happen for other attributes, so we should create a better
      solution.
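      A minimal sketch of the idea behind the commit above (the class and helper names here are assumptions for illustration, not the actual diffusers internals): when collecting constructor arguments into a config dict, skip the catch-all `"kwargs"` entry and any value that is still the `inspect.Parameter.empty` sentinel, so that `__repr__` and JSON serialization cannot choke on them.

      ```python
      import inspect
      import json

      class ConfigMixinSketch:
          """Hypothetical stand-in for a config-registering mixin."""

          def register_to_config(self, **kwargs):
              # Drop the catch-all "kwargs" entry and any inspect.Parameter.empty
              # sentinel; both would break JSON serialization later on.
              self._config = {
                  name: value
                  for name, value in kwargs.items()
                  if name != "kwargs" and value is not inspect.Parameter.empty
              }

          def __repr__(self):
              # Serialization succeeds because only plain values remain.
              return f"{self.__class__.__name__} {json.dumps(self._config, indent=2)}"

      sketch = ConfigMixinSketch()
      sketch.register_to_config(beta_start=0.0001, kwargs=inspect.Parameter.empty)
      print(sketch)  # no TypeError from the unserializable sentinel
      ```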
    • Fix onnx tensor format (#654) · d8572f20
      Anton Lozhkov authored
      fix np onnx
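      The message above is terse; as a rough illustration (assumed, not taken from the actual patch), ONNX Runtime sessions consume NumPy arrays, so torch tensors have to be detached, moved to CPU, and converted before being fed to an ONNX pipeline.

      ```python
      import numpy as np
      import torch

      def to_onnx_input(latents: torch.Tensor) -> np.ndarray:
          # onnxruntime expects NumPy arrays, not torch tensors
          return latents.detach().cpu().numpy()

      sample = torch.randn(1, 4, 64, 64)
      print(to_onnx_input(sample).shape)  # (1, 4, 64, 64)
      ```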
    • [Pytorch] add dep. warning for pytorch schedulers (#651) · 85494e88
      Kashif Rasul authored
      * add dep. warning for schedulers
      
      * fix format
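      A sketch of what such a deprecation warning typically looks like; the `set_format`/`tensor_format` names are assumptions used for illustration, since the commit message itself does not list the deprecated arguments.

      ```python
      import warnings

      def set_format(tensor_format: str = "pt"):
          # Hypothetical deprecated entry point: warn callers, keep old behaviour.
          warnings.warn(
              "`set_format`/`tensor_format` is deprecated and will be removed in a "
              "future release; schedulers now operate on PyTorch tensors only.",
              DeprecationWarning,
              stacklevel=2,
          )
          return tensor_format

      set_format("np")  # emits a DeprecationWarning but still works
      ```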
    • [DDIM, DDPM] fix add_noise (#648) · 33045382
      Suraj Patil authored
      fix add noise
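      For context, `add_noise` implements the standard forward-diffusion step x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise. The sketch below shows that computation in isolation; it is a simplified illustration, not the exact code the commit changed.

      ```python
      import torch

      def add_noise(original: torch.Tensor, noise: torch.Tensor,
                    alphas_cumprod: torch.Tensor, t: int) -> torch.Tensor:
          # x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise
          sqrt_alpha_prod = alphas_cumprod[t] ** 0.5
          sqrt_one_minus_alpha_prod = (1.0 - alphas_cumprod[t]) ** 0.5
          return sqrt_alpha_prod * original + sqrt_one_minus_alpha_prod * noise

      betas = torch.linspace(1e-4, 0.02, 1000)
      alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)
      x0 = torch.randn(1, 4, 8, 8)
      noisy = add_noise(x0, torch.randn_like(x0), alphas_cumprod, t=500)
      ```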
    • [Pytorch] Pytorch only schedulers (#534) · bd8df2da
      Kashif Rasul authored
      
      
      * pytorch only schedulers
      
      * fix style
      
      * remove match_shape
      
      * pytorch only ddpm
      
      * remove SchedulerMixin
      
      * remove numpy from karras_ve
      
      * fix types
      
      * remove numpy from lms_discrete
      
      * remove numpy from pndm
      
      * fix typo
      
      * remove mixin and numpy from sde_vp and ve
      
      * remove remaining tensor_format
      
      * fix style
      
      * sigmas has to be torch tensor
      
      * removed set_format in readme
      
      * remove set format from docs
      
      * remove set_format from pipelines
      
      * update tests
      
      * fix typo
      
      * continue to use mixin
      
      * fix imports
      
      * removed unused imports
      
      * match shape instead of assuming image shapes
      
      * remove import typo
      
      * update call to add_noise
      
      * use math instead of numpy
      
      * fix t_index
      
      * removed commented out numpy tests
      
      * timesteps needs to be discrete
      
      * cast timesteps to int in flax scheduler too
      
      * fix device mismatch issue
      
      * small fix
      
      * Update src/diffusers/schedulers/scheduling_pndm.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
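      Taken together, the changes above drop the NumPy code paths (`tensor_format`, `set_format`, `match_shape`) so schedulers work directly on `torch.Tensor` inputs with discrete integer timesteps. A minimal usage sketch against a torch-only scheduler; treat the exact signatures as assumptions rather than a quote from the patch.

      ```python
      import torch
      from diffusers import DDPMScheduler

      scheduler = DDPMScheduler(num_train_timesteps=1000)
      scheduler.set_timesteps(50)

      sample = torch.randn(1, 3, 64, 64)   # torch tensors throughout,
      noise = torch.randn_like(sample)     # no tensor_format / set_format
      timesteps = torch.tensor([10])       # discrete (integer) timesteps

      noisy = scheduler.add_noise(sample, noise, timesteps)
      print(noisy.shape, noisy.dtype)
      ```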
    • Fix `SpatialTransformer` (#578) · d886e497
      Yih-Dar authored
      
      
      * Fix SpatialTransformer
      
      * Fix SpatialTransformer
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
    • Flax pipeline pndm (#583) · ab3fd671
      Pedro Cuenca authored
      
      
      * WIP: flax FlaxDiffusionPipeline & FlaxStableDiffusionPipeline
      
      * todo comment
      
      * Fix imports
      
      * Fix imports
      
      * add dummies
      
      * Fix empty init
      
      * make pipeline work
      
      * up
      
      * Allow dtype to be overridden on model load.
      
      This may be a temporary solution until #567 is addressed.
      
      * Convert params to bfloat16 or fp16 after loading.
      
      This deals with the weights, not the model.
      
      * Use Flax schedulers (typing, docstring)
      
      * PNDM: replace control flow with jax functions.
      
      Otherwise jitting/parallelization don't work properly as they don't know
      how to deal with traced objects.
      
      I temporarily removed `step_prk`.
      
      * Pass latents shape to scheduler set_timesteps()
      
      PNDMScheduler uses it to reserve space, other schedulers will just
      ignore it.
      
      * Wrap model imports inside availability checks.
      
      * Optionally return state in from_config.
      
      Useful for Flax schedulers.
      
      * Do not convert model weights to dtype.
      
      * Re-enable PRK steps with functional implementation.
      
      Values returned still not verified for correctness.
      
      * Remove left over has_state var.
      
      * make style
      
      * Apply suggestion list -> tuple
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Apply suggestion list -> tuple
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Remove unused comments.
      
      * Use zeros instead of empty.
      Co-authored-by: Mishig Davaadorj <dmishig@gmail.com>
      Co-authored-by: Mishig Davaadorj <mishig.davaadorj@coloradocollege.edu>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
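      The point above about replacing Python control flow applies to JAX in general: under `jit`, branching on traced values fails, so data-dependent choices are expressed with `jnp.where` or `lax.cond`. A small generic sketch, not the PNDM code itself:

      ```python
      import jax
      import jax.numpy as jnp

      @jax.jit
      def step(counter, sample):
          # A Python `if counter < 3:` would fail under tracing, so the branch is
          # expressed as data: both results are computed, one is selected.
          warmup_result = sample * 0.5
          regular_result = sample * 0.9
          return jnp.where(counter < 3, warmup_result, regular_result)

      print(step(jnp.array(0), jnp.array(2.0)))   # 1.0 (warmup branch)
      print(step(jnp.array(10), jnp.array(2.0)))  # 1.8 (regular branch)
      ```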
    • c070e5f0
      Pedro Cuenca authored
    • Remove deprecated `torch_device` kwarg (#623) · b671cb09
      Pedro Cuenca authored
      * Remove deprecated `torch_device` kwarg.
      
      * Remove unused imports.
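      A sketch of the migration this implies for callers, assuming the usual diffusers pattern of the time rather than quoting the patch: move the pipeline to the device with `.to(...)` instead of passing `torch_device` to the call.

      ```python
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")

      # Before: pipe(prompt, torch_device="cuda")  <- deprecated kwarg, now removed
      # After: move the whole pipeline first, then call it as usual.
      pipe = pipe.to("cuda")
      result = pipe("a photograph of an astronaut riding a horse")
      ```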
    • Warning for too long prompts in DiffusionPipelines (Resolve #447) (#472) · f7ebe569
      Yuta Hayashibe authored
      * Return encoded texts by DiffusionPipelines
      
      * Updated README to show how to use encoded_text_input
      
      * Reverted examples in README.md
      
      * Reverted all
      
      * Warning for long prompts
      
      * Fix bugs
      
      * Formatted
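      A rough sketch of the behaviour described above (helper names are illustrative; the real logic lives inside the pipelines): prompts longer than the text encoder's maximum length are truncated, and the user is warned about the part that was cut off.

      ```python
      import logging
      from transformers import CLIPTokenizer

      logger = logging.getLogger(__name__)
      tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

      def encode_prompt(prompt: str):
          text_inputs = tokenizer(
              prompt,
              padding="max_length",
              max_length=tokenizer.model_max_length,  # 77 for CLIP
              truncation=True,
              return_tensors="pt",
          )
          untruncated = tokenizer(prompt, padding="longest", return_tensors="pt")
          if untruncated.input_ids.shape[-1] > tokenizer.model_max_length:
              removed = tokenizer.batch_decode(
                  untruncated.input_ids[:, tokenizer.model_max_length:]
              )
              logger.warning("Input was truncated; removed part: %s", removed)
          return text_inputs.input_ids
      ```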
  3. 24 Sep, 2022 (2 commits)
  4. 23 Sep, 2022 (6 commits)
  5. 22 Sep, 2022 (4 commits)
  6. 21 Sep, 2022 (9 commits)
  7. 20 Sep, 2022 (7 commits)
  8. 19 Sep, 2022 (1 commit)