NPU attention refactor for FLUX (#12209)
* NPU attention refactor for FLUX transformer

* Apply style fixes

---------

Co-authored-by: J石页 <jiangshuo9@h-partners.com>
Co-authored-by: Aryan <aryan@huggingface.co>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>