1. 11 Jul, 2024 3 commits
    • [Core] Add AuraFlow (#8796) · 2261510b
      Sayak Paul authored
      
      
      * add lavender flow transformer
      
      ---------
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
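
      This PR adds a text-to-image pipeline around the AuraFlow transformer. A minimal usage sketch, assuming the pipeline is exposed as AuraFlowPipeline and that the fal/AuraFlow checkpoint is available on the Hub:

        import torch
        from diffusers import AuraFlowPipeline

        # Load the AuraFlow text-to-image pipeline in half precision.
        pipe = AuraFlowPipeline.from_pretrained("fal/AuraFlow", torch_dtype=torch.float16).to("cuda")

        # Generate one image from a text prompt.
        image = pipe(
            prompt="a photo of an astronaut riding a horse on the moon",
            num_inference_steps=50,
            guidance_scale=3.5,
        ).images[0]
        image.save("auraflow.png")
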
    • [Core] Add Kolors (#8812) · 87b9db64
      Álvaro Somoza authored
      * initial draft
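
      The Kolors pipeline is used like other diffusers text-to-image pipelines. A minimal sketch, assuming the public class is KolorsPipeline and the Kwai-Kolors/Kolors-diffusers checkpoint with an fp16 variant:

        import torch
        from diffusers import KolorsPipeline

        # Load the Kolors text-to-image pipeline in half precision.
        pipe = KolorsPipeline.from_pretrained(
            "Kwai-Kolors/Kolors-diffusers", torch_dtype=torch.float16, variant="fp16"
        ).to("cuda")

        image = pipe(prompt="a watercolor painting of a lighthouse at dusk").images[0]
        image.save("kolors.png")
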
    • Latte: Latent Diffusion Transformer for Video Generation (#8404) · b8cf84a3
      Xin Ma authored
      
      
      * add Latte to diffusers
      
      * remove print
      
      * remove unused code
      
      * remove layer_norm_latte and add a flag
      
      * update latte_pipeline
      
      * remove unused squeeze
      
      * add norm_hidden_states.ndim == 2: # for Latte
      
      * fixed test latte pipeline bugs
      
      * delete sh
      
      * add doc for latte
      
      * add licensing
      
      * Move Transformer3DModelOutput to modeling_outputs
      
      * give a default value to sample_size
      
      * remove the einops dependency
      
      * change norm2 for latte
      
      * modify pipeline of latte
      
      * update test for Latte
      
      * modify some code for latte
      
      * modify for Latte pipeline
      
      * video_length -> num_frames; update prepare_latents copied from
      
      * make fix-copies
      
      * make style
      
      * typo: videe -> video
      
      * update
      
      * modify latte pipeline
      
      * Delete .vscode directory
      
      * make style
      
      * make fix-copies
      
      * add latte transformer 3d to docs _toctree.yml
      
      * update example
      
      * reduce frames for test
      
      * fixed bug in _text_preprocessing
      
      * set num frame to 1 for testing
      
      * remove unused print
      
      * add text = self._clean_caption(text) again
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      Co-authored-by: YiYi Xu <yixu310@gmail.com>
      Co-authored-by: Aryan <contact.aryanvs@gmail.com>
      Co-authored-by: Aryan <aryan@huggingface.co>
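
      Latte generates short video clips from text. A minimal sketch of the resulting pipeline, assuming it is exposed as LattePipeline and that the maxin-cn/Latte-1 checkpoint is available:

        import torch
        from diffusers import LattePipeline
        from diffusers.utils import export_to_gif

        # Load the Latte text-to-video pipeline in half precision.
        pipe = LattePipeline.from_pretrained("maxin-cn/Latte-1", torch_dtype=torch.float16).to("cuda")

        # num_frames controls the clip length (this PR renamed video_length to num_frames).
        frames = pipe(
            prompt="a dog running on the beach",
            num_frames=16,
            num_inference_steps=50,
        ).frames[0]

        export_to_gif(frames, "latte.gif")
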
  2. 09 Jul, 2024 2 commits
  3. 08 Jul, 2024 2 commits
  4. 06 Jul, 2024 1 commit
  5. 05 Jul, 2024 1 commit
  6. 04 Jul, 2024 1 commit
  7. 03 Jul, 2024 3 commits
  8. 29 Jun, 2024 1 commit
  9. 28 Jun, 2024 1 commit
  10. 27 Jun, 2024 2 commits
    • Motion Model / Adapter versatility (#8301) · 3e0d128d
      Mathis Koroglu authored
      * Motion Model / Adapter versatility
      
      - allow a different number of layers per block
      - allow a different number of transformer layers per block
      - allow a different number of motion attention heads per block
      - use the dropout argument in get_down/up_block in the 3D blocks
      
      * Motion Model: renamed added arguments & refactoring
      
      * Add test for asymmetric UNetMotionModel
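
      With these changes the motion modules no longer have to be symmetric across blocks. A rough sketch of the idea; the parameter names and values below (motion_layers_per_block, motion_num_attention_heads) are assumptions about how the per-block values are passed, not taken from the PR:

        from diffusers import MotionAdapter

        # Per-block tuples instead of one shared value; names and values are illustrative.
        adapter = MotionAdapter(
            block_out_channels=(320, 640, 1280, 1280),
            motion_layers_per_block=(1, 2, 2, 2),      # different number of motion layers per block
            motion_num_attention_heads=(4, 4, 8, 8),   # different number of attention heads per block
        )
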
    • Fix json WindowsPath crash (#8662) · a536e775
      vincedovy authored
      
      
      * Add check for WindowsPath in to_json_string
      
      On Windows, path values stored in a config can be WindowsPath objects, which to_json_string does not convert to strings. Added a check for WindowsPath to to_json_saveable.
      
      * Remove extraneous conversion to string in test_check_path_types (tests/others/test_config.py)
      
      * Fix style issues in tests/others/test_config.py
      
      * Add unit test to test_config.py to verify that PosixPath and WindowsPath (depending on system) both work when converted to JSON
      
      * Remove distinction between PosixPath and WindowsPath in ConfigMixin.to_json_string(). The conditional now tests for Path and uses Path.as_posix() to convert to a string.
      
      ---------
      Co-authored-by: Vincent Dovydaitis <vincedovy@gmail.com>
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
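
      The fix boils down to converting pathlib.Path values (PosixPath or WindowsPath) to plain strings before JSON serialization. A standalone sketch of the pattern, not the exact diffusers code:

        import json
        from pathlib import Path

        def to_json_saveable(value):
            # pathlib.Path objects (PosixPath on Linux/macOS, WindowsPath on Windows)
            # are not JSON-serializable; as_posix() turns them into strings with
            # forward slashes on every platform.
            if isinstance(value, Path):
                return value.as_posix()
            return value

        config = {"pretrained_model_name_or_path": Path("models") / "my-checkpoint"}
        print(json.dumps({key: to_json_saveable(val) for key, val in config.items()}, indent=2))
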
  11. 26 Jun, 2024 3 commits
  12. 25 Jun, 2024 4 commits
  13. 24 Jun, 2024 5 commits
  14. 21 Jun, 2024 3 commits
  15. 20 Jun, 2024 1 commit
  16. 19 Jun, 2024 2 commits
  17. 18 Jun, 2024 2 commits
  18. 13 Jun, 2024 1 commit
  19. 12 Jun, 2024 1 commit
  20. 07 Jun, 2024 1 commit
    • [Core] support saving and loading of sharded checkpoints (#7830) · 7d887118
      Sayak Paul authored
      
      
      * feat: support saving a model in sharded checkpoints.
      
      * feat: make loading of sharded checkpoints work.
      
      * add tests
      
      * cleanse the loading logic a bit more.
      
      * more resilience while loading from the Hub.
      
      * parallelize shard downloads by using snapshot_download()
      
      * default to a shard size.
      
      * more fix
      
      * Empty-Commit
      
      * debug
      
      * fix
      
      * quality
      
      * more debugging
      
      * fix more
      
      * initial comments from Benjamin
      
      * move certain methods to loading_utils
      
      * add test to check if the correct number of shards are present.
      
      * add a test to check if loading of sharded checkpoints from the Hub is okay
      
      * clarify the unit when passed as an int.
      
      * use hf_hub for sharding.
      
      * remove unnecessary code
      
      * remove unnecessary function
      
      * lucain's comments.
      
      * fixes
      
      * address high-level comments.
      
      * fix test
      
      * subfolder shenanigans.
      
      * Update src/diffusers/utils/hub_utils.py
      Co-authored-by: Lucain <lucainp@gmail.com>
      
      * Apply suggestions from code review
      Co-authored-by: Lucain <lucainp@gmail.com>
      
      * remove _huggingface_hub_version as not needed.
      
      * address more feedback.
      
      * add a test for local_files_only=True
      
      * need hf hub to be at least 0.23.2
      
      * style
      
      * final comment.
      
      * clean up subfolder.
      
      * deal with suffixes in code.
      
      * _add_variant default.
      
      * use weights_name_pattern
      
      * remove add_suffix_keyword
      
      * clean up downloading of sharded ckpts.
      
      * don't return something special when using index.json
      
      * fix more
      
      * don't use bare except
      
      * remove comments and catch the errors better
      
      * fix a couple of things when using is_file()
      
      * empty
      
      ---------
      Co-authored-by: Lucain <lucainp@gmail.com>
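
      A minimal sketch of the feature from the user's side, assuming the shard size is controlled with a max_shard_size argument that accepts either a size string or an integer number of bytes, as described above:

        from diffusers import UNet2DConditionModel

        unet = UNet2DConditionModel.from_pretrained(
            "stabilityai/stable-diffusion-2-1", subfolder="unet"
        )

        # Save the weights split into shards of at most 2GB; an index JSON file plus
        # the individual shard files are written to the output directory.
        unet.save_pretrained("./unet-sharded", max_shard_size="2GB")

        # Loading resolves the index file and loads every shard back into one model.
        reloaded = UNet2DConditionModel.from_pretrained("./unet-sharded")
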