[Example] Specify a fixed commit for the flash-linear-attention repository and optimize nsa examples (#913)

- Updated requirements.txt to pin the flash-linear-attention repository to a fixed commit (see the requirements sketch after this list).
- Refactored the import paths in benchmark_nsa_fwd.py for better organization.
- Added a new function that generates candidate configurations for autotuning (see the Python sketch after this list).
- Modified the tilelang_sparse_attention function to accept block size, number of stages, and thread count as parameters, making the kernel more flexible.
- Changed the shared-memory allocation for the accumulators to improve performance.
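
For the dependency pin, an entry of the following shape in requirements.txt is a common way to lock a Git dependency to one commit. This is only a sketch: the repository URL and the `<commit-sha>` placeholder are illustrative, not the actual values used in this change.

```
# Pin flash-linear-attention to a specific commit rather than a moving branch tip.
# The URL and <commit-sha> below are placeholders, not the values from this PR.
flash-linear-attention @ git+https://github.com/fla-org/flash-linear-attention.git@<commit-sha>
```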
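
For the autotuning changes, the sketch below shows one plausible shape for a configuration generator and for a tilelang_sparse_attention entry point that takes block size, number of stages, and thread count as parameters. Apart from those parameter names and the function name, everything here is assumed for illustration; it is not the code from this PR, and the value ranges are made up.

```python
# Minimal sketch, not the PR's actual implementation. Names other than
# tilelang_sparse_attention / block_size / num_stages / threads are hypothetical.
import itertools


def get_configs():
    """Enumerate candidate (block_size, num_stages, threads) tuples for the
    autotuner to sweep; the value ranges here are illustrative only."""
    block_sizes = [32, 64, 128]
    stage_counts = [1, 2, 3]
    thread_counts = [128, 256]
    return [
        {"block_size": bs, "num_stages": ns, "threads": th}
        for bs, ns, th in itertools.product(block_sizes, stage_counts, thread_counts)
    ]


def tilelang_sparse_attention(batch, heads, seq_len, dim,
                              block_size=64, num_stages=2, threads=128):
    """Build the sparse-attention benchmark kernel with tunable tiling
    parameters instead of hard-coded constants (signature is a sketch)."""
    # The kernel body would use block_size for tiling, num_stages for software
    # pipelining, and threads for the launch configuration.
    ...
```

A benchmark sweep would then call tilelang_sparse_attention once per entry returned by get_configs() and keep the fastest variant.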