[Example] Specify a fixed commit for the flash-linear-attention repository and optimize nsa examples (#913)

- Updated requirements.txt to specify a fixed commit for the flash-linear-attention repository.
- Refactored import paths in benchmark_nsa_fwd.py for better organization.
- Added a new function to generate configurations for autotuning (see the sketch after this list).
- Modified the tilelang_sparse_attention function to accept parameters for block size, number of stages, and threads, enhancing flexibility.
- Changed the allocation of shared memory for accumulators to optimize performance.
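For reference, pinning a Git dependency to a fixed commit in requirements.txt uses pip's direct-reference syntax. The repository URL and the commit hash below are illustrative placeholders, not the exact values used in this change:

```
# Pin flash-linear-attention to a specific commit (hash is a placeholder)
flash-linear-attention @ git+https://github.com/fla-org/flash-linear-attention.git@<commit-sha>
```

A configuration generator for autotuning typically enumerates the Cartesian product of the tunable parameters named above (block size, number of stages, threads). A minimal sketch, assuming illustrative names and value ranges; `get_configs`, `block_size`, `num_stages`, and `threads` are not necessarily the identifiers used in benchmark_nsa_fwd.py:

```python
from itertools import product

def get_configs():
    """Enumerate candidate kernel configurations for the autotuner to sweep.

    The value ranges here are assumptions for illustration; the actual
    benchmark may tune different parameters or ranges.
    """
    block_sizes = [32, 64, 128]   # candidate tile sizes
    stage_counts = [1, 2, 3]      # software-pipelining depth
    thread_counts = [128, 256]    # threads per block
    return [
        {"block_size": b, "num_stages": s, "threads": t}
        for b, s, t in product(block_sizes, stage_counts, thread_counts)
    ]
```

Exposing these as parameters of tilelang_sparse_attention, rather than hard-coding them, is what lets the autotuner sweep the candidate configurations.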