gaoqiong / flash-attention · Commits · 7fcd3e6a04fa6810cf6f87310d89955f01f9b786
flash-attention / tests / models / test_bert.py
18 Jan, 2023 · 1 commit

[FusedDense] Support relu, rename FusedDenseGeluDense -> FusedMLP · 88173a1a
Tri Dao authored Jan 17, 2023
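
A minimal sketch of what the rename means for callers, assuming the renamed module lives at flash_attn.ops.fused_dense.FusedMLP as the commit message indicates; the constructor argument names (in_features, hidden_features, out_features, activation) are assumptions and may differ between versions.

```python
# Hedged sketch: module path per the commit message; constructor argument names
# are assumptions, not confirmed from the source.
import torch
from flash_attn.ops.fused_dense import FusedMLP

mlp = FusedMLP(
    in_features=768,
    hidden_features=3072,
    out_features=768,
    activation="relu",  # relu support is what this commit adds; gelu was the prior behavior
).to("cuda", dtype=torch.float16)

x = torch.randn(8, 128, 768, device="cuda", dtype=torch.float16)
y = mlp(x)  # (8, 128, 768), same shape as the input
```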
27 Dec, 2022 · 1 commit

Tweak CrossEntropyLoss to take process_group in init · c6ecd40a
Tri Dao authored Dec 27, 2022
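
A hedged sketch of the change this commit describes: the loss receives the (tensor-parallel) process_group when constructed rather than on each call. The import path flash_attn.losses.cross_entropy.CrossEntropyLoss follows the repo layout; constructor arguments other than process_group are assumptions.

```python
# Hedged sketch: process_group moving into __init__ is what the commit describes;
# the other constructor arguments are assumptions.
import torch
import torch.distributed as dist
from flash_attn.losses.cross_entropy import CrossEntropyLoss

# In a tensor-parallel run the group would come from torch.distributed setup;
# None keeps the single-process behavior.
process_group = dist.group.WORLD if dist.is_initialized() else None

loss_fn = CrossEntropyLoss(ignore_index=-100, process_group=process_group)

logits = torch.randn(32, 50257, device="cuda", dtype=torch.float16, requires_grad=True)
labels = torch.randint(0, 50257, (32,), device="cuda")
loss = loss_fn(logits, labels)
loss.backward()
```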
20 Dec, 2022 · 1 commit

Implement last_layer_subset optimization for BERT · 13cdceb3
Tri Dao authored Dec 19, 2022
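
An illustration of the idea behind this optimization in plain PyTorch, not the library's API: during masked-language-model pre-training only the masked positions feed the MLM head, so the final layer can restrict its queries (and the head's input) to those positions.

```python
# Plain-PyTorch illustration of the last_layer_subset idea, not flash_attn's API:
# gather the masked positions first, then run the last layer's queries and the
# MLM head over that subset instead of over every position.
import torch

hidden = torch.randn(2, 128, 768)           # (batch, seq_len, hidden) entering the last layer
masked_mask = torch.zeros(2, 128, dtype=torch.bool)
masked_mask[:, :16] = True                   # pretend 16 tokens per sequence are masked

subset = hidden[masked_mask]                 # (num_masked_total, hidden) = (32, 768)
# Run the last layer's attention with `subset` as queries against the full
# `hidden` as keys/values, then apply the MLM head to `subset` only,
# instead of to all 128 positions per sequence.
```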
19 Dec, 2022 · 1 commit

Implement BERT · 5fb6df0e
Tri Dao authored Dec 18, 2022
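
A hedged sketch of constructing this BERT implementation from a Hugging Face config. The module path flash_attn.models.bert.BertModel follows the repo layout; the extra config flags shown are assumptions and may vary between versions.

```python
# Hedged sketch: BertModel under flash_attn.models.bert matches the repo layout;
# the extra config attributes below are assumptions, not confirmed from the source.
import torch
from transformers import BertConfig
from flash_attn.models.bert import BertModel

config = BertConfig.from_pretrained("bert-base-uncased")
config.use_flash_attn = True   # assumed flag: use the FlashAttention kernel
config.fused_bias_fc = True    # assumed flag: fused dense (bias) kernels
config.fused_mlp = True        # assumed flag: the FusedMLP module from the Jan 2023 commit above

model = BertModel(config).to("cuda", dtype=torch.float16)
input_ids = torch.randint(0, config.vocab_size, (2, 128), device="cuda")
out = model(input_ids)         # output structure depends on the implementation
```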