李金梁 authored

* Fix bug in torch compile when seqdim is an integer

Signed-off-by: 李金梁 <975761915@qq.com>

* Update attention.py: change jit_fuser to torch.compile on flash_attn_fwd_out_correction

Signed-off-by: 李金梁 <975761915@qq.com>

* Annotate fused functions

Signed-off-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>

---------

Signed-off-by: 李金梁 <975761915@qq.com>
Signed-off-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
Co-authored-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
9ee2dbdd
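The change described above, swapping the project's `jit_fuser` decorator for `torch.compile` on the forward out-correction step, can be sketched roughly as follows. The function name, signature, and body here are illustrative assumptions, not the actual `flash_attn_fwd_out_correction` from `attention.py`:

```python
import torch

# Illustrative sketch only: the real flash_attn_fwd_out_correction may have a
# different signature and body. backend="eager" is used here so the sketch runs
# without a C++ compiler toolchain; the actual code would rely on
# torch.compile's default (inductor) backend for fusion.
@torch.compile(backend="eager")
def fwd_out_correction(out: torch.Tensor,
                       softmax_lse: torch.Tensor,
                       softmax_lse_per_step: torch.Tensor) -> torch.Tensor:
    # Rescale a per-step partial attention output into the running output:
    # out * exp(lse_per_step - lse), broadcast over the head dimension.
    return out * torch.exp(softmax_lse_per_step - softmax_lse).unsqueeze(-1)
```

Note that `torch.compile` traces the function on first call, and plain Python `int` arguments (such as a `seqdim` index) are typically specialized as static values, which is one reason integer arguments need care under `torch.compile`, as the bug fix in this commit suggests.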