1. 08 Jul, 2024 1 commit
  2. 03 Jul, 2024 2 commits
  3. 02 Jul, 2024 1 commit
  4. 01 Jul, 2024 1 commit
  5. 26 Jun, 2024 2 commits
  6. 21 Jun, 2024 1 commit
  7. 20 Jun, 2024 1 commit
  8. 19 Jun, 2024 1 commit
  9. 18 Jun, 2024 1 commit
  10. 12 Jun, 2024 2 commits
  11. 10 Jun, 2024 1 commit
  12. 05 Jun, 2024 2 commits
  13. 01 Jun, 2024 1 commit
  14. 31 May, 2024 1 commit
      [Core] Introduce class variants for `Transformer2DModel` (#7647) · 983dec3b
      Sayak Paul authored
      * init for patches
      
      * finish patched model.
      
      * continuous transformer
      
      * vectorized transformer2d.
      
      * style.
      
      * inits.
      
      * fix-copies.
      
      * introduce DiTTransformer2DModel.
      
      * fixes
      
      * use REMAPPING as suggested by @DN6
      
      * better logging.
      
      * add pixart transformer model.
      
      * inits.
      
      * caption_channels.
      
      * attention masking.
      
      * fix use_additional_conditions.
      
      * remove print.
      
      * debug
      
      * flatten
      
      * fix: assertion for sigma
      
      * handle remapping for modeling_utils
      
      * add tests for dit transformer2d
      
      * quality
      
      * placeholder for pixart tests
      
      * pixart tests
      
      * add _no_split_modules
      
      * add docs.
      
      * check
      
      * check
      
      * check
      
      * check
      
      * fix tests
      
      * fix tests
      
      * move Transformer output to modeling_output
      
      * move errors better and bring back use_additional_conditions attribute.
      
      * add unnecessary things from DiT.
      
      * clean up pixart
      
      * fix remapping
      
      * fix device_map things in pixart2d.
      
      * replace Transformer2DModel with appropriate classes in dit, pixart tests
      
      * empty
      
* legacy mixin classes.
      
      * use a remapping dict for fetching class names.
      
* change to specific model types in the pipeline implementations.
      
      * move _fetch_remapped_cls_from_config to modeling_loading_utils.py
      
      * fix dependency problems.
      
      * add deprecation note.
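The remapping mentioned in the log above (legacy `Transformer2DModel` checkpoints redirected to the new specialized classes) can be sketched roughly as follows; the dict and helper below are illustrative stand-ins, not the actual `_fetch_remapped_cls_from_config` implementation in diffusers.

```python
# Hedged sketch of class remapping: a config saved for the monolithic
# Transformer2DModel is redirected to the specialized class that now owns
# that architecture. Class names mirror the PR; the logic is illustrative.

class DiTTransformer2DModel:
    """Stand-in for the patch-based DiT variant."""

class PixArtTransformer2DModel:
    """Stand-in for the PixArt variant (caption_channels, attention masks)."""

# Remapping dict: marker in the stored config -> class to load into.
_CLASS_REMAPPING = {
    "DiTTransformer2DModel": DiTTransformer2DModel,
    "PixArtTransformer2DModel": PixArtTransformer2DModel,
}

def fetch_remapped_cls_from_config(config, old_cls):
    """Return the specialized class for a legacy checkpoint, or fall back
    to the class the caller asked for."""
    remapped = _CLASS_REMAPPING.get(config.get("_class_name", ""))
    return remapped if remapped is not None else old_cls
```

The deprecation note added at the end of the PR fits naturally here: the fallback branch is where a warning about the legacy class would be emitted.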
  15. 29 May, 2024 1 commit
  16. 10 May, 2024 1 commit
      #7535 Update FloatTensor type hints to Tensor (#7883) · be4afa0b
      Mark Van Aken authored
      * find & replace all FloatTensors to Tensor
      
      * apply formatting
      
      * Update torch.FloatTensor to torch.Tensor in the remaining files
      
      * formatting
      
      * Fix the rest of the places where FloatTensor is used as well as in documentation
      
      * formatting
      
      * Update new file from FloatTensor to Tensor
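The mechanical core of this find-and-replace PR can be sketched as a one-line rewrite over source text; the helper name is hypothetical, and the real change was of course applied across the diffusers tree and docs rather than on strings.

```python
import re

# Minimal sketch of the hint migration: rewrite the deprecated
# torch.FloatTensor annotation to the plain torch.Tensor type.

def update_tensor_hints(source: str) -> str:
    """Replace torch.FloatTensor annotations with torch.Tensor."""
    return re.sub(r"\btorch\.FloatTensor\b", "torch.Tensor", source)

sig = "def forward(self, sample: torch.FloatTensor) -> torch.FloatTensor:"
print(update_tensor_hints(sig))
# def forward(self, sample: torch.Tensor) -> torch.Tensor:
```

The word boundaries keep identifiers that merely contain `FloatTensor` untouched, which matters when sweeping a whole source tree.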
  17. 30 Apr, 2024 1 commit
      [Core] introduce _no_split_modules to `ModelMixin` (#6396) · 3fd31eef
      Sayak Paul authored
      * introduce _no_split_modules.
      
      * unnecessary spaces.
      
      * remove unnecessary kwargs and style
      
      * fix: accelerate imports.
      
      * change to _determine_device_map
      
      * add the blocks that have residual connections.
      
      * add: CrossAttnUpBlock2D
      
* add: testing
      
      * style
      
      * line-spaces
      
      * quality
      
      * add disk offload test without safetensors.
      
      * checking disk offloading percentages.
      
      * change model split
      
      * add: utility for checking multi-gpu requirement.
      
      * model parallelism test
      
      * splits.
      
      * splits.
      
      * splits
      
      * splits.
      
      * splits.
      
      * splits.
      
      * offload folder to test_disk_offload_with_safetensors
      
      * add _no_split_modules
      
      * fix-copies
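`_no_split_modules` lists block classes (those with residual connections, such as `CrossAttnUpBlock2D`) that a computed device_map must keep on one device. A rough sketch of the idea follows, using a hypothetical nested-tree format instead of real torch modules; this mimics the accelerate-style behaviour and is not the diffusers code.

```python
# Hedged sketch: flatten a module tree into the atomic units a device_map
# may place independently. A module whose class is in `no_split` is kept
# whole; otherwise we recurse into its children.
# Tree format (illustrative): {name: (class_name, total_size, children)}.

def atomic_units(model_tree, no_split):
    """Return {dotted_name: size} for every placeable unit."""
    units = {}
    for name, (mtype, size, children) in model_tree.items():
        if mtype in no_split or not children:
            # No-split modules (and leaves) stay on a single device.
            units[name] = size
        else:
            for cname, csize in atomic_units(children, no_split).items():
                units[f"{name}.{cname}"] = csize
    return units
```

With `CrossAttnDownBlock2D` in `no_split`, the whole block becomes one unit; without it, its attention and resnet children could land on different devices, breaking the residual path.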
  18. 24 Apr, 2024 1 commit
  19. 10 Apr, 2024 1 commit
  20. 03 Apr, 2024 1 commit
      [Core] refactor transformers 2d into multiple init variants. (#7491) · a9a5b14f
      Sayak Paul authored
      * refactor transformers 2d into multiple legacy variants.
      
      * fix: init.
      
      * fix recursive init.
      
      * add inits.
      
      * make transformer block creation more modular.
      
      * complete refactor.
      
      * remove forward
      
      * debug
      
      * remove legacy blocks and refactor within the module itself.
      
      * remove print
      
      * guard caption projection
      
      * remove fetcher.
      
      * reduce the number of args.
      
      * fix: norm_type
      
      * group variables that are shared.
      
      * remove _get_transformer_blocks
      
      * harmonize the init function signatures.
      
      * transformer_blocks to common
      
      * repeat .
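The refactor above splits `Transformer2DModel.__init__` by input mode (continuous latents, discrete vector embeddings, ViT-style patches). A minimal sketch of that dispatch, with hypothetical helper names and stub bodies:

```python
# Illustrative sketch of the init-variant refactor: the three mutually
# exclusive input modes each get a dedicated init helper, chosen once in
# __init__ instead of being interleaved in one large constructor.

class Transformer2DModel:
    def __init__(self, in_channels=None, patch_size=None,
                 num_vector_embeds=None):
        if patch_size is not None:
            self.variant = self._init_patched()
        elif num_vector_embeds is not None:
            self.variant = self._init_vectorized()
        elif in_channels is not None:
            self.variant = self._init_continuous()
        else:
            raise ValueError("one input mode must be configured")

    def _init_continuous(self):
        return "continuous"   # image-like latents, conv/linear projection

    def _init_vectorized(self):
        return "vectorized"   # discrete latent token embeddings

    def _init_patched(self):
        return "patched"      # ViT-style patch embedding
```

Grouping the shared variables and harmonizing the helper signatures, as the log notes, is what later made it possible to split the class into the DiT and PixArt variants.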
  21. 02 Apr, 2024 2 commits
  22. 01 Apr, 2024 1 commit
  23. 18 Mar, 2024 1 commit
      Fix Typos (#7325) · 6a05b274
      M. Tolga Cangöz authored
      * Fix PyTorch's convention for inplace functions
      
      * Fix import structure in __init__.py and update config loading logic in test_config.py
      
      * Update configuration access
      
      * Fix typos
      
      * Trim trailing white spaces
      
      * Fix typo in logger name
      
      * Revert "Fix PyTorch's convention for inplace functions"
      
      This reverts commit f65dc4afcb57ceb43d5d06389229d47bafb10d2d.
      
      * Fix typo in step_index property description
      
      * Revert "Update configuration access"
      
      This reverts commit 8d44e870b8c1ad08802e3e904c34baeca1b598f8.
      
      * Revert "Fix import structure in __init__.py and update config loading logic in test_config.py"
      
      This reverts commit 2ad5e8bca25aede3b912da22bd57285b598fe171.
      
      * Fix typos
      
      * Fix typos
      
      * Fix typos
      
      * Fix a typo: tranform -> transform
  24. 14 Mar, 2024 1 commit
      [`Tests`] Update a deprecated parameter in test files and fix several typos (#7277) · 5d848ec0
      M. Tolga Cangöz authored
      * Add properties and `IPAdapterTesterMixin` tests for `StableDiffusionPanoramaPipeline`
      
      * Fix variable name typo and update comments
      
      * Update deprecated `output_type="numpy"` to "np" in test files
      
      * Discard changes to src/diffusers/pipelines/stable_diffusion_panorama/pipeline_stable_diffusion_panorama.py
      
      * Update test_stable_diffusion_panorama.py
      
      * Update numbers in README.md
      
      * Update get_guidance_scale_embedding method to use timesteps instead of w
      
      * Update number of checkpoints in README.md
      
      * Add type hints and fix var name
      
      * Fix PyTorch's convention for inplace functions
      
      * Fix a typo
      
      * Revert "Fix PyTorch's convention for inplace functions"
      
      This reverts commit 74350cf65b2c9aa77f08bec7937d7a8b13edb509.
      
      * Fix typos
      
      * Indent
      
      * Refactor get_guidance_scale_embedding method in LEditsPPPipelineStableDiffusionXL class
  25. 13 Mar, 2024 1 commit
[LoRA] use the PyTorch classes wherever needed and start deprecation cycles (#7204) · 531e7191
      Sayak Paul authored
* fix PyTorch classes and start deprecation cycles.
      
      * remove args crafting for accommodating scale.
      
      * remove scale check in feedforward.
      
      * assert against nn.Linear and not CompatibleLinear.
      
* remove conv_cls and linear_cls.
      
      * remove scale
      
* 👋 scale.
      
      * fix: unet2dcondition
      
      * fix attention.py
      
      * fix: attention.py again
      
      * fix: unet_2d_blocks.
      
      * fix-copies.
      
      * more fixes.
      
      * fix: resnet.py
      
      * more fixes
      
      * fix i2vgenxl unet.
      
* deprecate scale gently.
      
      * fix-copies
      
      * Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
      * quality
      
* throw warning when scale is passed to the BasicTransformerBlock class.
      
      * remove scale from signature.
      
      * cross_attention_kwargs, very nice catch by Yiyi
      
      * fix: logger.warn
      
      * make deprecation message clearer.
      
      * address final comments.
      
* maintain same deprecation message and also add it to activations.
      
      * address yiyi
      
      * fix copies
      
      * Apply suggestions from code review
Co-authored-by: YiYi Xu <yixu310@gmail.com>
      
* more deprecation
      
      * fix-copies
      
      ---------
Co-authored-by: YiYi Xu <yixu310@gmail.com>
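The gentle deprecation this PR describes — `scale` used to be threaded through `cross_attention_kwargs`, but with the plain PyTorch classes it is ignored — can be sketched as a small helper; the function name and warning text are illustrative, not the exact diffusers wording.

```python
import warnings

# Hedged sketch of the deprecation cycle: accept the legacy `scale` entry,
# strip it from the kwargs, and warn instead of erroring so old callers
# keep working through the cycle.

def pop_deprecated_scale(cross_attention_kwargs):
    """Remove `scale` from the kwargs, warning if a caller still passes it."""
    kwargs = dict(cross_attention_kwargs or {})
    if kwargs.pop("scale", None) is not None:
        warnings.warn(
            "Passing `scale` via cross_attention_kwargs is deprecated and "
            "ignored; set the LoRA scale on the adapter instead.",
            FutureWarning,
        )
    return kwargs
```

Keeping the message identical everywhere (blocks, activations, resnets), as the log insists, is what lets users grep for a single deprecation string.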
  26. 08 Mar, 2024 1 commit
  27. 03 Mar, 2024 1 commit
  28. 08 Feb, 2024 1 commit
  29. 29 Jan, 2024 1 commit