gaoqiong / flash-attention · Commits · b784ed73cf89cc1c820d18a4bf6f232f91004b47 · usage.md
25 Nov, 2022 (1 commit)
  [Docs] Clarify OpenFold speedup · b784ed73 · Tri Dao authored Nov 25, 2022

23 Nov, 2022 (1 commit)
  [Docs] Mention OpenFold · d9021ae4 · Tri Dao authored Nov 23, 2022

14 Nov, 2022 (4 commits)
  Mention DeepSpeed inference in usage.md · b0ed0a73 · Tri Dao authored Nov 14, 2022
  Mention AITemplate Stable Diffusion in usage.md · 25387b24 · Tri Dao authored Nov 14, 2022
  Link to Colossal-AI's stable diffusion in usage.md · b92f2c3b · Tri Dao authored Nov 13, 2022
  Add a page on where FlashAttention is being used · 79160a69 · Tri Dao authored Nov 13, 2022