gaoqiong / flash-attention · Commits · training · e68ebbe89a9005f3919e26eb9c3c4150c8829047
21 Dec, 2022 (1 commit)

Fix typo in config: train.gpu -> train.gpu_mem · c2407dec
Tri Dao authored Dec 21, 2022
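The fix above renames a config key from train.gpu to train.gpu_mem. A minimal sketch of the corrected key access, assuming an OmegaConf/Hydra-style YAML config (only the key names come from the commit message; the surrounding structure and the value are illustrative assumptions):

```python
# Sketch of the corrected config access (assumption: the training configs
# are Hydra/OmegaConf-style; the value below is illustrative only).
from omegaconf import OmegaConf

cfg = OmegaConf.create({
    "train": {
        "gpu_mem": 40,  # hypothetical per-GPU memory setting, not from the repo
    }
})

# After the fix, code reads train.gpu_mem instead of the misspelled train.gpu.
gpu_mem = cfg.train.gpu_mem
print(gpu_mem)
```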
29 Nov, 2022 (2 commits)

Update configs, add results · 4a6eaa9f
Tri Dao authored Nov 29, 2022

Release training code · 0bf5e500
Tri Dao authored Nov 28, 2022