    let's go Flux2 🚀 (#12711) · 5ffb73d4
    Sayak Paul authored
    
    
    * add vae
    
    * Initial commit for Flux 2 Transformer implementation
    
    * add pipeline part
    
    * small edits to the pipeline and conversion
    
    * update conversion script
    
    * fix
    
    * up up
    
    * finish pipeline
    
    * Remove Flux IP Adapter logic for now
    
    * Remove deprecated 3D id logic
    
    * Remove ControlNet logic for now
    
    * Add link to ViT-22B paper as reference for parallel transformer blocks such as the Flux 2 single stream block
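      For context, the parallel ("single stream") block referenced here follows the ViT-22B design: attention and MLP both read the same normalized input and their outputs are summed into one residual, instead of running sequentially. A minimal PyTorch sketch of that idea (illustrative only, not the actual Flux 2 implementation; all names are hypothetical):

      ```python
      import torch
      import torch.nn as nn

      class ParallelBlock(nn.Module):
          """ViT-22B-style parallel transformer block (sketch)."""

          def __init__(self, dim: int, num_heads: int, mlp_ratio: float = 4.0):
              super().__init__()
              self.norm = nn.LayerNorm(dim)
              self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
              self.mlp = nn.Sequential(
                  nn.Linear(dim, int(dim * mlp_ratio)),
                  nn.GELU(),
                  nn.Linear(int(dim * mlp_ratio), dim),
              )

          def forward(self, x: torch.Tensor) -> torch.Tensor:
              h = self.norm(x)                  # one shared norm for both branches
              attn_out, _ = self.attn(h, h, h)  # attention branch
              mlp_out = self.mlp(h)             # MLP branch, computed in parallel
              return x + attn_out + mlp_out     # single residual sum of both branches
      ```

      Because both branches share the input projection point, this layout allows fusing their input matmuls, which is part of why it appears in large-scale models like ViT-22B.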
    
    * update pipeline
    
    * Don't use biases for input projs and output AdaNorm
    
    * up
    
    * Remove bias for double stream block text QKV projections
    
    * Add script to convert Flux 2 transformer to diffusers
    
    * make style and make quality
    
    * fix a few things.
    
    * allow sft files to go.
    
    * fix image processor
    
    * fix batch
    
    * style a bit
    
    * Fix some bugs in Flux 2 transformer implementation
    
    * Fix dummy input preparation and fix some test bugs
    
    * fix dtype casting in timestep guidance module.
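      The usual shape of this kind of dtype fix is to build the sinusoidal timestep/guidance embedding in float32 for numerical stability and only cast to the model dtype at the end; a hedged sketch (function name and constants are hypothetical, not the diffusers code):

      ```python
      import torch

      def timestep_embedding(t: torch.Tensor, dim: int, dtype: torch.dtype) -> torch.Tensor:
          half = dim // 2
          # frequencies computed in float32, even if the model runs in fp16/bf16
          freqs = torch.exp(-torch.arange(half, dtype=torch.float32) * (10.0 / half))
          args = t.float()[:, None] * freqs[None]  # keep the sinusoid args in float32
          emb = torch.cat([torch.cos(args), torch.sin(args)], dim=-1)
          return emb.to(dtype)  # cast to the hidden states' dtype only at the end
      ```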
    
* resolve conflicts.
    
    * remove ip adapter stuff.
    
    * Fix Flux 2 transformer consistency test
    
    * Fix bug in Flux2TransformerBlock (double stream block)
    
    * Get remaining Flux 2 transformer tests passing
    
    * make style; make quality; make fix-copies
    
    * remove stuff.
    
* fix type annotation.
    
    * remove unneeded stuff from tests
    
    * tests
    
    * up
    
    * up
    
    * add sf support
    
    * Remove unused IP Adapter and ControlNet logic from transformer (#9)
    
    * copied from
    
    * Apply suggestions from code review
    Co-authored-by: YiYi Xu <yixu310@gmail.com>
    Co-authored-by: apolinário <joaopaulo.passos@gmail.com>
    
    * up
    
    * up
    
    * up
    
    * up
    
    * up
    
    * Refactor Flux2Attention into separate classes for double stream and single stream attention
    
    * Add _supports_qkv_fusion to AttentionModuleMixin to allow subclasses to disable QKV fusion
    
    * Have Flux2ParallelSelfAttention inherit from AttentionModuleMixin with _supports_qkv_fusion=False
    
    * Log debug message when calling fuse_projections on a AttentionModuleMixin subclass that does not support QKV fusion
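      The opt-out mechanism described in these commits can be sketched as a class-level flag on the mixin that `fuse_projections` checks before doing anything, logging a debug message and returning early when fusion is unsupported. A minimal sketch with hypothetical internals (only the flag and method names come from the commit messages above):

      ```python
      import logging

      logger = logging.getLogger(__name__)

      class AttentionModuleMixin:
          # Subclasses set this to False to opt out of QKV fusion.
          _supports_qkv_fusion = True

          def fuse_projections(self) -> None:
              if not self._supports_qkv_fusion:
                  logger.debug(
                      "%s does not support QKV fusion; skipping fuse_projections.",
                      self.__class__.__name__,
                  )
                  return
              # ... fuse the separate Q/K/V projections into one matmul here ...
              self.fused = True  # placeholder for the actual fusion work

      class Flux2ParallelSelfAttention(AttentionModuleMixin):
          # The parallel block already uses a combined input projection,
          # so separate-QKV fusion does not apply.
          _supports_qkv_fusion = False
      ```

      Calling `fuse_projections()` on the parallel attention class is then a logged no-op rather than an error.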
    
    * Address review comments
    
    * Update src/diffusers/pipelines/flux2/pipeline_flux2.py
    Co-authored-by: YiYi Xu <yixu310@gmail.com>
    
    * up
    
    * Remove maybe_allow_in_graph decorators for Flux 2 transformer blocks (#12)
    
    * up
    
    * support ostris loras. (#13)
    
    * up
    
* update schedule
    
    * up
    
    * up (#17)
    
    * add training scripts (#16)
    
    * add training scripts
    Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
    
    * model cpu offload in validation.
    
    * add flux.2 readme
    
    * add img2img and tests
    
    * cpu offload in log validation
    
    * Apply suggestions from code review
    
    * fix
    
    * up
    
    * fixes
    
    * remove i2i training tests for now.
    
    ---------
    Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
    Co-authored-by: linoytsaban <linoy@huggingface.co>
    
    * up
    
    ---------
    Co-authored-by: yiyixuxu <yixu310@gmail.com>
    Co-authored-by: Daniel Gu <dgu8957@gmail.com>
    Co-authored-by: yiyi@huggingface.co <yiyi@ip-10-53-87-203.ec2.internal>
    Co-authored-by: dg845 <58458699+dg845@users.noreply.github.com>
    Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
    Co-authored-by: apolinário <joaopaulo.passos@gmail.com>
    Co-authored-by: yiyi@huggingface.co <yiyi@ip-26-0-160-103.ec2.internal>
    Co-authored-by: Linoy Tsaban <linoytsaban@gmail.com>
    Co-authored-by: linoytsaban <linoy@huggingface.co>