[fastnn] add triton flash attention (#109)
* add triton flash attention
* add fallback for flash attention
* add pytest skip mask for attention kernel