gaoqiong / flash-attention · Commits at 78225c5366dd4c1c743d9e2ff9d9b6e1ffcb03e7 · path: flash_attn/utils
24 Dec, 2022 (1 commit)
- Implement TensorParallel for FusedDense and FusedDenseGeluDense · 226a1b72 · Tri Dao, authored Dec 23, 2022

18 Nov, 2022 (1 commit)
- Add __init__.py files to subdirectories for installation · ece539ab · Tri Dao, authored Nov 17, 2022

23 Oct, 2022 (1 commit)
- Move benchmark utils, support AMP · fb88e5e4 · Tri Dao, authored Oct 23, 2022