gaoqiong / flash-attention — Commits at 8a733cbd538ef59aacfd60bc44a5262dbe8f5768
History for flash_attn/ops/triton/rotary.py
11 Sep, 2023 (1 commit)
8a733cbd  [Gen] Fix calling update_graph_cache in tests — Tri Dao, authored Sep 10, 2023
06 Sep, 2023 (1 commit)
97951590  [Rotary] Set device before launching Triton kernel to avoid error — Tri Dao, authored Sep 05, 2023
04 Sep, 2023 (1 commit)
b28ec236  [Rotary] Implement varlen rotary — Tri Dao, authored Sep 03, 2023
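The varlen commit extends rotary to batches where sequences of different lengths are packed into a single token dimension, delimited by cumulative boundaries in the style of flash-attention's `cu_seqlens` convention. A minimal NumPy sketch of that behavior (not the repo's Triton kernel; positions are assumed to restart at 0 for each packed sequence):

```python
import numpy as np

def apply_rotary_varlen(x, cos, sin, cu_seqlens):
    """Sketch of varlen rotary (non-interleaved / half-split style).

    x:          (total_tokens, headdim) — several sequences packed along dim 0
    cos, sin:   (max_seqlen, headdim // 2) — precomputed angle tables
    cu_seqlens: cumulative boundaries, e.g. [0, 2, 5] for lengths 2 and 3.
                Token i of every sequence uses cos/sin row i.
    """
    d = x.shape[-1] // 2
    out = np.empty_like(x)
    for start, end in zip(cu_seqlens[:-1], cu_seqlens[1:]):
        n = end - start
        x1, x2 = x[start:end, :d], x[start:end, d:]
        c, s = cos[:n], sin[:n]
        out[start:end, :d] = x1 * c - x2 * s
        out[start:end, d:] = x1 * s + x2 * c
    return out

# Demo: two packed sequences (lengths 2 and 3), headdim 4.
headdim = 4
inv_freq = 1.0 / (10000 ** (np.arange(0, headdim, 2) / headdim))
ang = np.outer(np.arange(3), inv_freq)          # (max_seqlen, headdim // 2)
cu = np.array([0, 2, 5])
x = np.ones((5, headdim))
y = apply_rotary_varlen(x, np.cos(ang), np.sin(ang), cu)
# The first token of each sequence is at position 0, so it is unchanged.
```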
03 Sep, 2023 (3 commits)
861c8257  [Rotary] Clean up rotary Triton implementation a bit — Tri Dao, authored Sep 03, 2023
1c523c1c  [Rotary] Speed up rotary kernel when interleaved=True — Tri Dao, authored Sep 03, 2023
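The `interleaved=True` case rotates adjacent dimension pairs (x₀, x₁), (x₂, x₃), … — the GPT-J / original RoPE layout — instead of pairing the first and second halves of the head dimension. A NumPy sketch of that layout (not the optimized Triton kernel this commit speeds up):

```python
import numpy as np

def apply_rotary_interleaved(x, cos, sin):
    """Interleaved rotary: rotate adjacent dimension pairs.

    x:        (seqlen, headdim)
    cos, sin: (seqlen, headdim // 2)
    Pair i is (x[..., 2i], x[..., 2i + 1]), rotated by angle theta_i * pos.
    """
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# Demo: seqlen 3, headdim 4.
seqlen, headdim = 3, 4
inv_freq = 1.0 / (10000 ** (np.arange(0, headdim, 2) / headdim))
ang = np.outer(np.arange(seqlen), inv_freq)
x = np.arange(seqlen * headdim, dtype=float).reshape(seqlen, headdim)
y = apply_rotary_interleaved(x, np.cos(ang), np.sin(ang))
# Position 0 has angle 0, so the first row passes through unchanged,
# and each rotation preserves the per-row norm.
```

The strided (`0::2` / `1::2`) access pattern is what makes the interleaved case harder to make fast on GPU, which is presumably what the speedup commit addresses.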
942fcbf0  [Rotary] Implement rotary in Triton — Tri Dao, authored Sep 03, 2023
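The operation these commits implement is rotary position embedding (RoPE): each pair of head dimensions is rotated by a position-dependent angle. A NumPy reference sketch of the standard non-interleaved (half-split) formulation — this illustrates the math only, not the repo's Triton kernel:

```python
import numpy as np

def apply_rotary(x, cos, sin):
    """Reference rotary embedding, non-interleaved (half-split) style.

    x:        (seqlen, headdim)
    cos, sin: (seqlen, headdim // 2)
    Pair i is (x[..., i], x[..., i + headdim // 2]); each pair is rotated
    in 2D by its position-dependent angle.
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Angles theta_i = pos / 10000^(2i / headdim), as in the RoPE formulation.
seqlen, headdim = 4, 8
inv_freq = 1.0 / (10000 ** (np.arange(0, headdim, 2) / headdim))
angles = np.outer(np.arange(seqlen), inv_freq)   # (seqlen, headdim // 2)
out = apply_rotary(np.ones((seqlen, headdim)), np.cos(angles), np.sin(angles))
# Position 0 is the identity; every position preserves the pairwise norms.
```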