1. 23 Jul, 2024 5 commits
    • 65f723bb (Tri Dao)
    • Clean up softcapping bwd a bit · 5ca83a9c (Tri Dao)
    • 751c762c (Tri Dao)
    • Fix IMA (illegal memory access) for split-kv kernel (#1085) · 1c275eb0 (Driss Guessous)
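      For context on the fix above: a split-kv kernel shards the key/value sequence across blocks, and each split produces a partial output plus its log-sum-exp (LSE); the partials are then merged into the exact attention output. A minimal PyTorch sketch of that combine step, with illustrative names rather than the kernel's actual API:

          # Hypothetical sketch of the split-KV combine step; not the kernel's API.
          import torch

          def combine_splits(outs: torch.Tensor, lses: torch.Tensor) -> torch.Tensor:
              # outs: [num_splits, seqlen_q, head_dim], partial output per KV split
              # lses: [num_splits, seqlen_q], log-sum-exp of scores per KV split
              lse_max = lses.max(dim=0, keepdim=True).values  # subtract max for stability
              w = torch.exp(lses - lse_max)                   # un-normalized split weights
              w = w / w.sum(dim=0, keepdim=True)              # normalize across splits
              return (w.unsqueeze(-1) * outs).sum(dim=0)      # exact softmax-weighted merge

          out = combine_splits(torch.randn(4, 8, 64), torch.randn(4, 8))  # -> [8, 64]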
    • Support AMD ROCm on FlashAttention 2 (#1010) · d8f104e9 (rocking)
      
      * Support CK in FMHA
      
      * Add CK submodule
      
      * Do not return LSE if return_softmax == false
      
      * Use receipt to speed up CK compile time
      
      * Integrate new version of ck_tile
      
      * Support dropout for mha_fwd()
      
      * Add dropout to mha_varlen_fwd()
      
      * Update CK to develop
      
      * Extract padding function for dropout randval
      
      * Extract randval transformation function
      
      * Sync the code structure and coding style with FA
      
      * Remove this line; the C++ API will handle it. Sync with test_flash_attn.py
      
      * Fix compile error
      
      * Add mha_bwd
      
      * Generate dropout seed and offset from the user's generator
      
      * Update CK
      
      * Add mha_varlen_bwd
      
      * Use the same Python that builds flash-attn to generate the CK kernels
      
      * Fix a bug in group-mode fwd when returning softmax LSE
      
      * Increase the test tolerance
      
      * Add test_flash_attn_output() and test_flash_attn_varlen_output()
      
      * Always fill softmax_lse
      
      * Remove the duplicate benchmark script, since mha_bwd is already implemented
      
      * Refine getting values from the tuple
      
      * Use default parameter for stream_config
      
      * Unblock all platforms
      
      * Add comment
      
      * Refine the test code
      
      * Refine naming
      
      * Add unpack to namespace
      
      * Do not hardcode the warp size of 64
      
      * Add more targets
      
      * Add README
      
      * Optimize mha_fwd if seqlen_q == 1
      
      * Support get_wheel_url for ROCm
      
      * Detect the ROCm environment via PyTorch's IS_HIP_EXTENSION (see the sketch after this entry)
      
      * Update to latest CK
      
      * Add the necessary compile flag
      
      * Sync the API with upstream FA
      
      ---------
      Co-authored-by: carlushuang <carlus.huang@amd.com>
      Co-authored-by: Yichen Yan <wenji.yyc@alibaba-inc.com>
      Co-authored-by: Po Yen Chen <PoYen.Chen@amd.com>
      Co-authored-by: Yichen Yan <oraluben@outlook.com>
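      The environment detection mentioned in the list above relies on PyTorch's IS_HIP_EXTENSION flag, which is true when PyTorch was built against ROCm and a ROCm toolchain is present. A minimal sketch of how a build script might branch on it (the backend names are illustrative, not the actual setup.py logic):

          # Minimal sketch: pick a kernel backend from the build environment.
          # IS_HIP_EXTENSION is a real torch flag; the branching is illustrative.
          import torch
          from torch.utils.cpp_extension import IS_HIP_EXTENSION

          if IS_HIP_EXTENSION:
              backend = "ck"    # ROCm/HIP: build the Composable Kernel (CK) backend
          else:
              backend = "cuda"  # NVIDIA: build the CUTLASS-based CUDA kernels

          print(f"HIP runtime: {torch.version.hip}, selected backend: {backend}")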
  2. 22 Jul, 2024 2 commits
    • Backwards for softcapping (#1033) · 5f1ae4a3 (Phil Wang)
      * Check in the two ways of approaching backwards for softcapping, both functional
      
      * Prepare the softcap switch for backwards
      
      * Temporary
      
      * Clean up to the way Tri prefers
      
      * Calculate dtanh when copying from scores to the dtanh tensor (see the sketch after this entry)
      
      * No ternary operators allowed with constexpr, so use a workaround found online
      
      * Fix maybe_dtanh, restore some files
      
      * Restore another file
      
      * Move calculate_dtanh to utils and colocate it with apply_softcap
      
      * Cleanup
      
      * Maybe the last cleanup
      
      * Save for another PR
      
      * Remove a stray line
      
      * Fix spacing
      
      * Fix an issue and make test_flash_attn.py ready to test softcapping backwards
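      The dtanh term mentioned above follows from the softcapping transform: if scores are capped as s' = cap * tanh(s / cap), then ds'/ds = 1 - tanh^2(s / cap), which is the factor the backward pass multiplies into the incoming gradient. A plain-PyTorch sketch of the math (illustrative, not the CUDA kernel; the function names echo the commit's apply_softcap/calculate_dtanh):

          # Softcapping forward and its local derivative ("dtanh"); a hedged sketch.
          import torch

          def apply_softcap(scores: torch.Tensor, cap: float) -> torch.Tensor:
              return cap * torch.tanh(scores / cap)  # squashes scores into (-cap, cap)

          def calculate_dtanh(scores: torch.Tensor, cap: float) -> torch.Tensor:
              t = torch.tanh(scores / cap)
              return 1.0 - t * t                     # d/ds [cap * tanh(s / cap)]

          s = torch.randn(4, 4, requires_grad=True)
          apply_softcap(s, 30.0).sum().backward()
          # The analytic factor matches autograd's gradient of the elementwise map:
          assert torch.allclose(s.grad, calculate_dtanh(s.detach(), 30.0))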
    • Catch typo (#1058) · 4df62e14 (Jorge António)
  3. 15 Jul, 2024 1 commit
  4. 11 Jul, 2024 2 commits
  5. 10 Jul, 2024 3 commits
  6. 08 Jul, 2024 1 commit
  7. 01 Jul, 2024 3 commits
  8. 27 Jun, 2024 1 commit
  9. 26 May, 2024 1 commit
  10. 08 Apr, 2024 2 commits
  11. 28 Mar, 2024 2 commits
  12. 15 Mar, 2024 2 commits
  13. 14 Mar, 2024 1 commit
  14. 21 Feb, 2024 2 commits
  15. 20 Feb, 2024 1 commit
  16. 10 Feb, 2024 2 commits
  17. 08 Feb, 2024 1 commit
  18. 30 Jan, 2024 1 commit
  19. 23 Jan, 2024 2 commits
  20. 22 Jan, 2024 1 commit
  21. 21 Jan, 2024 4 commits