- 26 Mar, 2024 1 commit
Sayak Paul authored
* feat: support DoRA LoRAs from the community
* safe-guard DoRA operations under peft version.
* pop use_dora when False
* make DoRA LoRA from kohya work.
* fix: kohya conversion utils.
* add a fast test for DoRA compatibility.
* add a nightly test.
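For context, a hedged usage sketch of what this enables: DoRA checkpoints load through the standard LoRA entry point. The repo id and weight file below are placeholders, and a peft version with DoRA support is assumed (hence the version safe-guard above).

```python
# Hedged sketch, not from this commit: DoRA checkpoints go through the same
# load_lora_weights() entry point as plain LoRAs. Placeholder repo id and
# weight_name; a recent peft with DoRA support is assumed.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/some-dora-lora", weight_name="pytorch_lora_weights.safetensors")
image = pipe("a photo of an astronaut riding a horse").images[0]
```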
- 25 Mar, 2024 1 commit
UmerHA authored
* Update test_lora_layers_peft.py
* Update utils.py
- 20 Mar, 2024 1 commit
Sayak Paul authored
* cleanse and refactor lora testing suite.
* more cleanup.
* make check_if_lora_correctly_set a utility function
* fix: typo
* retrigger ci
* style
- 19 Mar, 2024 1 commit
Sayak Paul authored
* debugging
* let's see the numbers
* let's see the numbers
* let's see the numbers
* restrict tolerance.
* increase inference steps.
* shallow copy of cross_attention_kwargs
* remove print
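The "shallow copy of cross_attention_kwargs" item points at a mutation bug: popping the LoRA scale out of the caller's dict changes it across calls. A minimal illustration of the pattern, not the actual diffusers code:

```python
# Illustrative only: pop "scale" from a shallow copy so the caller's
# cross_attention_kwargs dict is identical on every pipeline call.
def consume_scale(cross_attention_kwargs=None):
    kwargs = dict(cross_attention_kwargs or {})  # shallow copy
    lora_scale = kwargs.pop("scale", 1.0)        # caller's dict is untouched
    return lora_scale, kwargs

caller_kwargs = {"scale": 0.5}
consume_scale(caller_kwargs)
assert "scale" in caller_kwargs  # would fail if we popped in place
```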
- 27 Feb, 2024 2 commits
Younes Belkada authored
* copy the state dict in load lora weights
* fixup
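Copying the state dict guards against the loader mutating (e.g. popping keys from) a dict the caller still holds. A hedged sketch of the resulting behavior; the checkpoint file is a placeholder:

```python
# Hedged sketch: load_lora_weights() also accepts an in-memory state dict;
# after this fix it operates on an internal copy, so the caller's dict
# keeps all of its keys. "my_lora.safetensors" is a placeholder file.
import safetensors.torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
state_dict = safetensors.torch.load_file("my_lora.safetensors")
n_keys = len(state_dict)
pipe.load_lora_weights(state_dict)
assert len(state_dict) == n_keys  # unchanged: the loader copied it first
```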
jinghuan-Chen authored
* Make LoRACompatibleConv padding_mode work.
* Format code style.
* add fast test
* Update src/diffusers/models/lora.py: simplify the code (suggested by patrickvonplaten).
* code refactor
* apply patrickvonplaten suggestion to simplify the code.
* rm test_lora_layers_old_backend.py and add test case in test_lora_layers_peft.py
* update test case.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: YiYi Xu <yixu310@gmail.com>
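For background, non-zero padding modes cannot go through the conv's own padding argument; torch pads explicitly and then convolves with padding 0. A simplified sketch of how a LoRA-aware conv can honor padding_mode, mirroring torch.nn.Conv2d's internal handling (not the exact diffusers code; the class name is invented for illustration):

```python
import torch
import torch.nn.functional as F
from torch import nn

class LoRAConvSketch(nn.Conv2d):
    """Illustrative stand-in for LoRACompatibleConv's padding_mode handling."""

    def forward(self, x, scale=1.0, lora_delta=None):
        weight = self.weight if lora_delta is None else self.weight + scale * lora_delta
        if self.padding_mode != "zeros":
            # Pad explicitly (reflect/replicate/circular), then convolve unpadded.
            x = F.pad(x, self._reversed_padding_repeated_twice, mode=self.padding_mode)
            return F.conv2d(x, weight, self.bias, self.stride, 0, self.dilation, self.groups)
        return F.conv2d(x, weight, self.bias, self.stride, self.padding, self.dilation, self.groups)

conv = LoRAConvSketch(3, 8, kernel_size=3, padding=1, padding_mode="reflect")
out = conv(torch.randn(1, 3, 16, 16))
```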
- 13 Feb, 2024 1 commit
Dhruv Nair authored
update
- 09 Feb, 2024 1 commit
Sayak Paul authored
* deprecate certain lora methods from the old backend.
* uncomment necessary things.
* safe remove old lora backend 👋
- 08 Feb, 2024 1 commit
Sayak Paul authored
change to 2024
- 22 Jan, 2024 1 commit
Dhruv Nair authored
* update
* update
- 05 Jan, 2024 2 commits
Sayak Paul authored
* introduce unload_lora.
* fix-copies
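The public counterpart on pipelines is `unload_lora_weights()`; whether this commit's `unload_lora` is that method or a lower-level helper is not clear from the message alone, so this hedged sketch uses the public one. The repo id is a placeholder:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/some-lora")  # placeholder repo id
styled = pipe("a pixel-art cat").images[0]

pipe.unload_lora_weights()  # pipeline now behaves as if no LoRA was loaded
plain = pipe("a pixel-art cat").images[0]
```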
Sayak Paul authored
* debug
* debug
* more debug
* more more debug
* remove tests for LoRAAttnProcessors.
* rename
- 04 Jan, 2024 5 commits
Sayak Paul authored
* debug
* debug test_with_different_scales_fusion_equivalence
* use the right method.
* place it right.
* let's see.
* let's see again
* alright then.
* add a comment.
sayakpaul authored
sayakpaul authored
- 03 Jan, 2024 2 commits
Sayak Paul authored
* handle rest of the stuff related to deprecated lora stuff.
* fix: copies
* don't modify the UNet in-place.
* fix: temporal autoencoder.
* manually remove lora layers.
* don't copy unet.
* alright
* remove lora attn processors from unet3d
* fix: unet3d.
* style
* Empty-Commit
Sayak Paul authored
* add: test to check if peft loras are loadable in non-peft envs.
* add torch_device appropriately.
* fix: get_dummy_inputs().
* test logits.
* rename
* debug
* debug
* fix: generator
* new assertion values after fixing the seed.
* shape
* remove print statements and settle this.
* to update values.
* change values when lora config is initialized under a fixed seed.
* update colab link
* update notebook link
* sanity restored by getting the exact same values without peft.
- 02 Jan, 2024 1 commit
Sayak Paul authored
* start deprecating LoRAAttn.
* fix
* wrap into unet_lora_state_dict
* utilize text_encoder_lora_params
* utilize text_encoder_attn_modules
* debug
* debug
* remove print
* don't use text encoder for test_stable_diffusion_lora
* load the procs.
* set_default_attn_processor
* fix: set_default_attn_processor call.
* fix: lora_components[unet_lora_params]
* checking for 3d.
* 3d.
* more fixes.
* debug
* debug
* debug
* debug
* more debug
* more debug
* more debug
* more debug
* more debug
* more debug
* hack.
* remove comments and prep for a PR.
* appropriate set_lora_weights()
* fix
* fix: test_unload_lora_sd
* fix: test_unload_lora_sd
* use default attention processors.
* debug
* debug nan
* debug nan
* debug nan
* use NaN instead of inf
* remove comments.
* fix: test_text_encoder_lora_state_dict_unchanged
* attention processor default
* default attention processors.
* default
* style
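Several items above lean on `set_default_attn_processor`, which swaps every attention module back to the stock processor; the tests use it to reset a model after LoRA layers are removed. A short sketch of that reset using the public diffusers API:

```python
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)
unet.set_default_attn_processor()  # back to the stock attention processors
print({type(p).__name__ for p in unet.attn_processors.values()})
```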
- 30 Dec, 2023 1 commit
Sayak Paul authored
remove unnecessary components from the LoRA PEFT suite.
- 26 Dec, 2023 1 commit
Younes Belkada authored
* add adapter_name in fuse
* add test
* up
* fix CI
* adapt from suggestion
* Update src/diffusers/utils/testing_utils.py
* change to `require_peft_version_greater`
* change variable names in test
* Update src/diffusers/loaders/lora.py
* break into 2 lines
* final comments

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
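A hedged sketch of the feature: fusing one named adapter into the base weights while others stay untouched. The exact keyword has varied across diffusers versions (`adapter_names` in later releases), and the repo ids are placeholders:

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/pixel-lora", adapter_name="pixel")
pipe.load_lora_weights("some-user/toy-lora", adapter_name="toy")

# Fuse only "pixel" into the base weights; "toy" remains an unfused adapter.
pipe.fuse_lora(adapter_names=["pixel"])
```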
- 25 Dec, 2023 1 commit
Sayak Paul authored
* fix: lora peft dummy components
* fix: dummy components
- 24 Dec, 2023 2 commits
Dhruv Nair authored
update

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Sayak Paul authored
* initialize alpha too.
* add: test
* remove config parsing
* store rank
* debug
* remove faulty test
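"initialize alpha too" and "store rank" concern rebuilding a peft LoraConfig from checkpoint metadata; if lora_alpha is left at its default, the effective alpha/r scale silently differs from what the checkpoint was trained with. A hedged sketch with illustrative values:

```python
from peft import LoraConfig

rank, alpha = 8, 8  # as recovered from the state dict (rank) and network_alpha
config = LoraConfig(
    r=rank,
    lora_alpha=alpha,  # must be set explicitly, or the alpha/r scale drifts
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)
```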
- 22 Dec, 2023 1 commit
Dhruv Nair authored
update
- 21 Dec, 2023 1 commit
Benjamin Bossan authored
See #6185 for context.

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
- 15 Dec, 2023 1 commit
Dhruv Nair authored
* update
* update
* update
* update

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
- 21 Nov, 2023 1 commit
YiYi Xu authored
* add ip-adapter

Co-authored-by: okotaku <to78314910@gmail.com>
Co-authored-by: sayakpaul <spsayakpaul@gmail.com>
Co-authored-by: yiyixuxu <yixu310@gmail.com>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
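The feature in one hedged sketch: conditioning generation on a reference image. Checkpoint names follow the public h94/IP-Adapter repository; the reference image path is a placeholder:

```python
import torch
from diffusers import StableDiffusionPipeline
from diffusers.utils import load_image

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_ip_adapter("h94/IP-Adapter", subfolder="models", weight_name="ip-adapter_sd15.bin")
pipe.set_ip_adapter_scale(0.6)  # strength of the image prompt

ref = load_image("reference.png")  # placeholder path
image = pipe("best quality, high quality", ip_adapter_image=ref).images[0]
```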
- 14 Nov, 2023 2 commits
Sayak Paul authored
* refactor loaders.py to make it cleaner and leaner.
* refactor loaders init
* inits.
* textual inversion to the init.
* inits.
* remove certain modules from the main init.
* AttnProcsLayers
* fix imports
* avoid circular import.
* fix circular import pt 2.
* address PR comments
* imports
* fix: imports.
* remove from main init for avoiding circular deps.
* remove spurious deps.
* fix-copies.
* fix imports.
* more debug
* more debug
* Apply suggestions from code review
* Apply suggestions from code review

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Sourab Mangrulkar authored
* add lora delete feature
* added tests and changed condition
* deal with corner cases
* more corner cases
* rename to `delete_adapter_layers` for consistency

Co-authored-by: younesbelkada <younesbelkada@gmail.com>
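`delete_adapter_layers` is the lower-level utility; at the pipeline level the feature surfaces as `delete_adapters`. A hedged sketch with a placeholder repo id:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/some-lora", adapter_name="style")

pipe.delete_adapters("style")  # removes the adapter's layers and bookkeeping
```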
- 09 Nov, 2023 1 commit
Patrick von Platen authored
* lcm add tests
* up
* Fix all
* up
* Add
* all
* up
* up
* up
* up
* up
* up
* up
- 01 Nov, 2023 1 commit
Younes Belkada authored
* fix civitai bug
* add test
* up
* fix test
* added slow test.
* style
* Update src/diffusers/utils/peft_utils.py
* Update src/diffusers/utils/peft_utils.py

Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
- 21 Oct, 2023 1 commit
Younes Belkada authored
* fix scale unscale v1
* final fixes + CI
* fix slow test
* oops
* fix copies
* oops
* oops
* fix
* style
* fix copies

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
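The scale/unscale machinery is what lets callers pass a per-call LoRA strength through `cross_attention_kwargs`; the fix ensures the weights are correctly unscaled again afterwards. A hedged usage sketch with a placeholder repo id:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/some-lora")

# scale=0.5 halves the LoRA contribution for this call only.
image = pipe("a watercolor fox", cross_attention_kwargs={"scale": 0.5}).images[0]
```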
- 13 Oct, 2023 1 commit
Younes Belkada authored
* v1
* add tests and fix previous failing tests
* fix CI
* add tests + v1 `PeftLayerScaler`
* style
* add scale retrieving mechanism system
* fix CI
* up
* up
* simple approach --> not same results for some reason
* fix issues
* fix copies
* remove unneeded method
* active adapters!
* fix merge conflicts
* up
* up
* kohya - test-1
* Apply suggestions from code review
* fix scale
* fix copies
* add comment
* multi adapters
* fix tests
* oops
* v1 faster loading - in progress
* Revert "v1 faster loading - in progress" (reverts commit ac925f81321e95fc8168184c3346bf3d75404d5a)
* kohya same generation
* fix some slow tests
* peft integration features for unet lora: (1) support for multiple ranks/alphas, (2) support for multiple active adapters, (3) support for enabling/disabling LoRAs
* fix `get_peft_kwargs`
* Update loaders.py
* add some tests
* add unfuse tests
* fix tests
* up
* add set adapter from sourab and tests
* fix multi adapter tests
* style & quality
* style
* remove comment
* fix `adapter_name` issues
* fix unet adapter name for sdxl
* fix enabling/disabling adapters
* fix fuse / unfuse unet
* nit
* fix
* up
* fix cpu offloading
* fix another slow test
* fix another offload test
* add more tests
* all slow tests pass
* style
* fix alpha pattern for unet and text encoder
* Update src/diffusers/loaders.py
* Update src/diffusers/models/attention.py
* up
* up
* clarify comment
* comments
* change comment order
* change comment order
* style & quality
* Update tests/lora/test_lora_layers_peft.py
* fix bugs and add tests
* Update src/diffusers/models/modeling_utils.py
* Update src/diffusers/models/modeling_utils.py
* refactor
* suggestion
* add break statement
* add compile tests
* move slow tests to peft tests as I modified them
* quality
* refactor a bit
* style
* change import
* style
* fix CI
* refactor slow tests one last time
* style
* oops
* oops
* oops
* final tweak tests
* Apply suggestions from code review
* Update src/diffusers/loaders.py
* comments
* Apply suggestions from code review
* remove comments
* more comments
* try
* revert
* add `safe_merge` tests
* add comment
* style, comments and run tests in fp16
* add warnings
* fix doc test
* replace with `adapter_weights`
* add `get_active_adapters()`
* expose `get_list_adapters` method
* better error message
* Apply suggestions from code review
* style
* trigger slow lora tests
* fix tests
* maybe fix last test
* revert
* Update src/diffusers/loaders.py
* Update src/diffusers/loaders.py
* Update src/diffusers/loaders.py
* Update src/diffusers/loaders.py
* Apply suggestions from code review
* move `MIN_PEFT_VERSION`
* Apply suggestions from code review
* let's not use class variable
* fix few nits
* change a bit offloading logic
* check earlier
* rm unneeded block
* break long line
* return empty list
* change logic a bit and address comments
* add typehint
* remove parenthesis
* fix
* revert to fp16 in tests
* add to gpu
* revert to old test
* style
* Update src/diffusers/loaders.py
* change indent
* Apply suggestions from code review
* Apply suggestions from code review

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
Co-authored-by: Sourab Mangrulkar <13534540+pacman100@users.noreply.github.com>
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
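The headline features of this PR in one hedged sketch: multiple named adapters, per-adapter weights, enabling/disabling LoRAs, and the two introspection helpers named above. Repo ids and adapter names are placeholders:

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("some-user/pixel-lora", adapter_name="pixel")
pipe.load_lora_weights("some-user/toy-lora", adapter_name="toy")

# Blend the two adapters with per-adapter weights.
pipe.set_adapters(["pixel", "toy"], adapter_weights=[0.7, 0.3])
print(pipe.get_active_adapters())  # ["pixel", "toy"]
print(pipe.get_list_adapters())    # per-component adapter mapping

pipe.disable_lora()  # base model only
pipe.enable_lora()   # adapters active again
```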
- 12 Oct, 2023 1 commit
Dhruv Nair authored
* move xformers to dedicated runner
* fix
* remove ptl from test runner images
- 09 Oct, 2023 1 commit
Patrick von Platen authored
* Fix fuse LoRA
* improve a bit
* make style
* Update src/diffusers/models/lora.py
* ciao C file
* ciao C file
* test & make style

Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
- 04 Oct, 2023 1 commit
Dhruv Nair authored
* pipeline fetcher
* update script
* clean up
* clean up
* clean up
* new pipeline runner
* rename tests to match modules
* test actions in pr
* change runner to gpu
* clean up
* clean up
* clean up
* fix report
* fix reporting
* clean up
* show test stats in failure reports
* give names to jobs
* add lora tests
* split torch cuda tests and add compile tests
* clean up
* fix tests
* change push to run only on main

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- 27 Sep, 2023 1 commit
Patrick von Platen authored
* fix xformers lora
* improve
* fix
- 26 Sep, 2023 1 commit
Dhruv Nair authored
* fix other tests
* fix tests
* fix tests
* Update tests/pipelines/shap_e/test_shap_e_img2img.py
* Update tests/pipelines/shap_e/test_shap_e_img2img.py
* fix upstream merge mistake
* fix tests
* test fix
* Update tests/lora/test_lora_layers_old_backend.py
* Update tests/lora/test_lora_layers_old_backend.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>