gaoqiong / flash-attention · Commits at cbb4cf5f4654c8be42ce086f8528ccbb5a786458
Path: flash_attn/ops/triton
19 Apr, 2023 (1 commit)
Implement LLaMa · 96d10f65
Tri Dao authored Apr 18, 2023
13 Apr, 2023 (1 commit)
[FusedDense] Enable sqrelu activation in FusedMLP · 6f6e9a9a
Tri Dao authored Apr 13, 2023
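For context, "sqrelu" here refers to the squared-ReLU activation, f(x) = ReLU(x)^2 (the activation studied in the Primer paper). Below is a minimal PyTorch sketch of the activation itself, not the repo's fused kernel; the function name is illustrative:

    import torch
    import torch.nn.functional as F

    def sqrelu(x: torch.Tensor) -> torch.Tensor:
        # Squared ReLU: f(x) = relu(x) ** 2
        r = F.relu(x)
        return r * r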
14 Nov, 2022 (1 commit)
Add GPT and ViT models · 2e33fc8e
Tri Dao authored Nov 13, 2022