"...git@developer.sourcefind.cn:cnjsdfcy/simbricks.git" did not exist on "3bdd6ef39fe1a68277e0f496c75abb2d96b1d0df"
[Examples] Implement NSA Backward kernels (#180)
* Update native sparse attention example with scale parameter handling
  - Add scale parameter processing in the native_sparse_attention function
  - Modify the example script to pass a custom scale value
  - Update function calls to forward the scale parameter
  - Improve the flexibility of the sparse attention example

* Refactor Triton Native Sparse Attention Example
  - Improve code formatting and readability in example_triton_nsa_bwd.py
  - Standardize function and parameter alignment
  - Remove unnecessary whitespace and tidy imports
  - Keep code style consistent with previous commits
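A minimal sketch of the scale-handling pattern described above, not the repository's actual kernels: the function name `native_sparse_attention` comes from the commit message, but the signature, the `block_indices` argument, the default of `1/sqrt(head_dim)`, and the dense PyTorch reference body are assumptions standing in for the real Triton/TileLang implementation.

```python
# Hypothetical sketch (assumed API, not the repo's real kernel): threading a
# `scale` parameter through a native-sparse-attention wrapper and exercising
# the backward pass.
import math
import torch


def native_sparse_attention(q, k, v, block_indices=None, scale=None):
    """Dense toy reference for NSA-style attention.

    q, k, v: [batch, heads, seq_len, head_dim]
    block_indices: ignored here; a real NSA kernel would restrict attention
        to the selected key/value blocks.
    scale: softmax scaling factor; defaults to 1/sqrt(head_dim) when None
        (assumed default, mirroring the scale handling in the commit message).
    """
    head_dim = q.shape[-1]
    if scale is None:
        scale = 1.0 / math.sqrt(head_dim)
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    probs = torch.softmax(scores, dim=-1)
    return torch.matmul(probs, v)


if __name__ == "__main__":
    b, h, s, d = 1, 2, 64, 32
    q = torch.randn(b, h, s, d, requires_grad=True)
    k = torch.randn(b, h, s, d, requires_grad=True)
    v = torch.randn(b, h, s, d, requires_grad=True)
    # Pass a custom scale explicitly, as the updated example script does.
    out = native_sparse_attention(q, k, v, scale=0.1)
    out.sum().backward()  # exercises the backward path
    print(out.shape, q.grad.shape)
```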