gaoqiong / flash-attention — Commits at 88173a1aafe8e2796fcfc9433160838138f0dfc1
Path: training/README.md
18 Jan, 2023 — 1 commit

[FusedDense] Support relu, rename FusedDenseGeluDense -> FusedMLP · 88173a1a
Tri Dao authored Jan 17, 2023
30 Dec, 2022 — 2 commits

[Docs] Fix formatting · 43798966
Tri Dao authored Dec 30, 2022

[Docs] Mention that dropout_layer_norm supports all dims up to 6k · 3c7cbfc1
Tri Dao authored Dec 29, 2022
29 Nov, 2022 — 2 commits

Update configs, add results · 4a6eaa9f
Tri Dao authored Nov 29, 2022

Release training code · 0bf5e500
Tri Dao authored Nov 28, 2022