"packaging/vscode:/vscode.git/clone" did not exist on "1836c786fecca9e83b198a8cc9aba1df6fda3d4a"
- 18 Sep, 2023 1 commit
Ruoxi authored
* Implement `CustomDiffusionAttnProcessor2_0`
* Doc-strings and type annotations for `CustomDiffusionAttnProcessor2_0`. (#1)
* Update attnprocessor.md
* Update attention_processor.py
* Interops for `CustomDiffusionAttnProcessor2_0`.
* Formatted `attention_processor.py`.
* Formatted doc-string in `attention_processor.py`.
* Conditional CustomDiffusion2_0 for training example.
* Remove unnecessary reference impl in comments.
* Fix `save_attn_procs`.
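For context, a minimal sketch of how the new processor might be wired up, loosely following the Custom Diffusion training example these commits mention. The checkpoint name and the `train_kv`/`train_q_out` choices are assumptions here, not part of the commits:

```python
import torch.nn.functional as F

from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import (
    CustomDiffusionAttnProcessor,
    CustomDiffusionAttnProcessor2_0,
)

# Pick the PyTorch 2.0 (SDPA) variant when available, mirroring the
# conditional the training example uses.
attention_class = (
    CustomDiffusionAttnProcessor2_0
    if hasattr(F, "scaled_dot_product_attention")
    else CustomDiffusionAttnProcessor
)

# Assumed checkpoint for illustration.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

attn_procs = {}
for name in unet.attn_processors.keys():
    # Self-attention layers ("attn1") have no cross-attention dimension.
    cross_attention_dim = (
        None if name.endswith("attn1.processor") else unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(unet.config.block_out_channels))[block_id]
    else:  # down_blocks
        block_id = int(name[len("down_blocks.")])
        hidden_size = unet.config.block_out_channels[block_id]

    # Assumption: only cross-attention layers get trainable k/v projections.
    attn_procs[name] = attention_class(
        train_kv=cross_attention_dim is not None,
        train_q_out=False,
        hidden_size=hidden_size,
        cross_attention_dim=cross_attention_dim,
    )

unet.set_attn_processor(attn_procs)
```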
- 26 Jul, 2023 1 commit
camenduru authored
* why mdx?
* why mdx?
* why mdx?
* no x for Kandinsky either

---------

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
- 06 Jun, 2023 1 commit
Sayak Paul authored
* feat: add lora attention processor for pt 2.0.
* explicit context manager for SDPA.
* switch to flash attention
* make shapes compatible to work optimally with SDPA.
* fix: circular import problem.
* explicitly specify the flash attention kernel in sdpa
* fall back to efficient attention context manager.
* remove explicit dispatch.
* fix: removed processor.
* fix: remove optional from type annotation.
* feat: make changes regarding LoRAAttnProcessor2_0.
* remove confusing warning.
* formatting.
* relax tolerance for PT 2.0
* fix: loading message.
* remove unnecessary logging.
* add: entry to the docs.
* add: network_alpha argument.
* relax tolerance.
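A minimal sketch of the pieces these commits touch, assuming the diffusers API of that period. The tensor shapes and LoRA hyperparameters are illustrative; `network_alpha` is the argument the commits add:

```python
import torch
import torch.nn.functional as F

from diffusers.models.attention_processor import LoRAAttnProcessor2_0

# Construct the PT 2.0 LoRA processor; hidden_size, cross_attention_dim,
# and rank below are assumed values for illustration.
lora_proc = LoRAAttnProcessor2_0(
    hidden_size=320,
    cross_attention_dim=768,
    rank=4,
    network_alpha=4,
)

# The explicit kernel dispatch the commits experimented with (and later
# removed in favor of SDPA's automatic choice): request the flash-attention
# kernel, falling back to memory-efficient attention.
q = k = v = torch.randn(1, 8, 64, 40)  # (batch, heads, seq_len, head_dim)
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=True
):
    out = F.scaled_dot_product_attention(q, k, v)
```

Note that the final commits remove the explicit dispatch, letting `scaled_dot_product_attention` pick a kernel on its own.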
- 26 May, 2023 1 commit
Steven Liu authored
* add attnprocessor to docs
* fix path to class
* create separate page for attnprocessors
* fix path
* fix path for real
* fill in docstrings
* apply feedback
* apply feedback