[`core` / `attention`] Fix fused attention generation with newest transformers version (#146)
Co-authored-by: Casper <casperbh.96@gmail.com>
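A minimal, illustrative sketch of one common way to keep fused-attention code paths working across transformers releases: gate the fused path on the installed library version. The helper name `use_fused_attention` and the version threshold are hypothetical and are not taken from this commit; they only show the general compatibility-guard pattern.

```python
# Hypothetical compatibility guard, not the actual change in this commit.
from packaging import version
import transformers

# Assumed version at which the upstream attention interface changed.
_TRANSFORMERS_BREAKING_VERSION = version.parse("4.35.0")

def use_fused_attention() -> bool:
    """Enable fused attention only for transformers versions known to be compatible."""
    installed = version.parse(transformers.__version__)
    return installed < _TRANSFORMERS_BREAKING_VERSION
```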