gaoqiong / flash-attention
Commit history for training/Dockerfile at 853ff72963e73456c2a318bde1ebfa292ce935e9
12 Apr, 2023 (2 commits)
  853ff729 · Bump version to v1.0.1, fix Cutlass version (Tri Dao, Apr 12, 2023)
  74af0233 · Bump version to 1.0.0 (Tri Dao, Apr 11, 2023)

19 Jan, 2023 (1 commit)
  33e0860c · Bump to v0.2.8 (Tri Dao, Jan 19, 2023)

07 Jan, 2023 (1 commit)
  ce26d3d7 · Bump to v0.2.7 (Tri Dao, Jan 06, 2023)

30 Dec, 2022 (1 commit)
  cadfa396 · [Docker] Set torchmetrics==0.10.3 (Tri Dao, Dec 30, 2022)

29 Dec, 2022 (1 commit)
  984d5204 · Update training Dockerfile to use flash-attn==0.2.6 (Tri Dao, Dec 29, 2022)

29 Nov, 2022 (2 commits)
  4a6eaa9f · Update configs, add results (Tri Dao, Nov 29, 2022)
  0bf5e500 · Release training code (Tri Dao, Nov 28, 2022)
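
The Dec 2022 commits above pin flash-attn==0.2.6 and torchmetrics==0.10.3 in the training Dockerfile. A minimal sketch of what such pins could look like; the base image and surrounding lines are assumptions for illustration, not the repository's actual training/Dockerfile:

  # Hypothetical sketch, not the actual training/Dockerfile from this repository.
  # Base image is an assumption; any recent NVIDIA PyTorch image could serve.
  FROM nvcr.io/nvidia/pytorch:22.12-py3

  # Pin the versions referenced in the commit messages above.
  RUN pip install flash-attn==0.2.6 torchmetrics==0.10.3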